'People will forgive you for being wrong, but they will never forgive you for being right - especially if events prove you right while proving them wrong.' Thomas Sowell
Friday, 16 June 2023
Fallacies of Capitalism 1: Inevitability of Inequality
How does the 'inevitability of inequality' fallacy ignore the role of social and institutional factors in perpetuating the unequal distribution of wealth and opportunities in a capitalist system?
The "inevitability of inequality" fallacy suggests that inequality is a natural and unavoidable outcome of a capitalist system, implying that it is inherently fair and just. However, this fallacy ignores the significant role of social and institutional factors that contribute to the unequal distribution of wealth and opportunities. Let me break it down with some simple examples:
Unequal starting points: In a capitalist system, individuals have different starting points due to factors like family wealth, education, and social connections. These disparities make it harder for those with fewer resources to compete on an equal footing. For instance, imagine two children who want to become doctors. One child comes from a wealthy family with access to the best schools and tutors, while the other child comes from a low-income family and attends underfunded schools. The unequal starting points put the second child at a significant disadvantage, limiting their opportunities for success.
Discrimination and bias: Social factors such as discrimination based on race, gender, or socioeconomic status can perpetuate inequality. Discrimination may lead to unequal treatment in hiring practices, education, or access to resources. For example, imagine a qualified job applicant who is denied a position because of their gender or ethnicity, while a less qualified candidate from a privileged background is chosen. Discrimination hinders individuals' ability to succeed and reinforces inequality in society.
Power imbalances: Capitalist systems often concentrate power and wealth in the hands of a few individuals or corporations. These powerful entities can influence policies, regulations, and institutions to their advantage, further perpetuating inequality. For instance, consider a large corporation that has significant political influence. They may lobby for policies that favour their interests, such as tax breaks or deregulation, while undermining measures that could reduce inequality, such as progressive taxation or workers' rights.
Lack of social mobility: Inequality can persist if social and institutional factors make it difficult for individuals to move up the social ladder. For example, imagine a society where access to quality education is primarily determined by wealth. If children from low-income families are unable to receive a good education, it becomes challenging for them to break the cycle of poverty and improve their economic prospects. This lack of social mobility reinforces existing inequalities over generations.
These examples demonstrate that the "inevitability of inequality" fallacy overlooks the social and institutional factors that contribute to the unequal distribution of wealth and opportunities in a capitalist system. By recognising these factors and working towards creating a more equitable society, we can address and reduce the systemic barriers that perpetuate inequality.
Thursday, 18 February 2021
Why economists kept getting the policies wrong
Philip Stephens in The FT
The other week I caught sight of a headline declaring that the IMF was warning against cuts in public spending and borrowing. The report stopped me in my tracks. After half a century or so as keeper of the sacred flame of fiscal prudence, the IMF was telling policymakers in rich industrial nations they should not fret overmuch about huge build-ups of public debt during the Covid-19 crisis. John Maynard Keynes had been disinterred, and the world turned upside down.
To be clear, there is nothing irresponsible about the IMF’s advice that policymakers in advanced economies should prioritise a restoration of growth after the deflationary shock of the pandemic. The fund prefaced a shift last year, and most people would say it was common sense to allow economic recovery to take hold. Nations such as Britain might have learned that lesson from the damage inflicted by the ill-judged austerity programme imposed by David Cameron’s government after the 2008 financial crash.
And yet. This was the IMF speaking — the hallowed (for some, hated) institution that, as many Brits will recall, formally read the last rites over Keynesianism when in 1976 it forced James Callaghan’s Labour government to impose politically calamitous cuts in spending and borrowing. This is the organisation that in the intervening years had a few simple answers to any economic problem you care to think of: fiscal retrenchment, a smaller state and/or market liberalisation. The advice was heralded as the Washington consensus because of the IMF’s location.
My first job after joining the Financial Times during the early 1980s was to learn the language of the new economic orthodoxy. Kindly officials at the UK Treasury explained to me that the technique of using fiscal policy to manage demand, put to rest in 1976, had been replaced by a new theory. Monetarism decreed that as long as the authorities kept control of the money supply, and thus inflation, everything would be fine.
The snag was that every time the Treasury alighted on a particular measure of the money supply to target — sterling M3, PSL2 and M0 come to mind — it ceased to be a reliable guide to price changes. This was Goodhart’s law, named after the economist Charles Goodhart. By the end of the 1980s, monetarism had been ditched, and targeting the exchange rate had become the holy grail. If sterling’s rate was fixed against the Deutschmark, the UK would import stability from Germany.
It was about this time that a senior aide to the chancellor took me to one side to explain that one of the great skills of the Treasury was to perform perfect U-turns while persuading the world it had deviated not a jot from previous policy. This proved its worth again when the exchange rate policy was blown up by sterling’s ejection from the European exchange rate mechanism in 1992. The currency was quickly replaced by an inflation target as an infallible lodestar of policy.
The eternal truths amid the missteps and swerves were that public spending and borrowing were bad, tax cuts were good, and market liberalisation was the route to sunlit uplands. The pound’s ERM debacle was followed by a ferocious budgetary squeeze, and, across the channel, the eurozone was designed to fit a fiscal straitjacket. Financial market deregulation, we were told, oiled the wheels of globalisation. If madcap profits and bonuses at big financial institutions prompted unease, the answer was that markets would self-correct. Britain’s Labour government backed “light-touch” regulation in the 2000s. The Bank of England reduced its oversight of systemic financial stability.
The abiding sin threaded through it all was that of certitude. Perfectly plausible but untested theories, whether about the money supply, fiscal balances and debt levels, or market risk, were elevated to the level of irrefutable facts. Economics, essentially a faith-based discipline, represented itself as a hard science. The real world was reduced by the 1990s to a set of complex mathematical equations that no one, least of all democratically elected politicians, dared challenge.
Thus detached from reality, economic policy swept away the postwar balance between the interests of society and markets. Arid econometrics replaced a measured understanding of political economy. It scarcely mattered that the gains of globalisation were scooped up by the super-rich, that markets became casinos and that fiscal fundamentalism was widening social divisions. Nothing counted above the equations. And now? After Donald Trump, Brexit and Covid-19, it seems we are back at the beginning. Time to dust off Keynes’s general theory.
Sunday, 22 October 2017
Oxbridge bashing is an empty ritual if we ignore wider social inequities
The numbers are clearly unacceptable. Several colleges in both Oxford and Cambridge frequently admit cohorts with no black students in them at all. Roughly 1.5% of total offers are made to black British applicants and more than 80% of offers are made to the children of the top two social classes. With offers made overwhelmingly to those in London and a handful of the home counties, both universities are consistently excluding entire ethnic and regional demographics. They also continue to admit a grotesquely disproportionate number of privately schooled students. In effect, the two ancients are running a generous quota scheme for white students, independent schools and the offspring of affluent south-eastern English parents.
There is undoubtedly a great deal that both institutions can and must do to remedy this. Our admissions processes at Cambridge are not sufficiently responsive to the gravity of the situation. Despite periodic panics in response to such media “revelations” or staged political scolding, and notwithstanding the good intentions of many involved in admissions, questions of diversity and inclusion are not taken seriously enough in their own right.
The focus on educational achievement, itself defined in purely numerical terms and worsened by internal league tables, means there is little sense of meaningful diversity as an educational and community good in its own right. Despite having contextual indicators that would allow us to diversify our admissions, we balk at non-traditional attainment profiles for fear that the student will not be able to cope once here.
For any Oxbridge college not to have a single black student at any given point in time, when it would rightly not tolerate similarly low numbers of women, does not just make it look institutionally racist; it also impoverishes the educational and social environment we provide. The same holds true for regional and class exclusions.
When I first came to Cambridge in 2001, having taught at different institutions in the US, I was struck by the relative whiteness and sheer cultural homogeneity of this university. Even the minimal improvements I’ve seen since then in some years – more students from ethnic minority backgrounds, more young women from northern comprehensives – have made a huge difference both to me as a teacher and, more importantly, to what students are able to learn from each other.
Not all of them will get first-class marks, but they both gain a lot from and have a great deal to give to the educational environment here, not least by expanding the definition of what counts as achievement. We need more of them. (At Cambridge, in recent years, a quantum of vocal BME students as well as students from northern comprehensives has demanded change, often to good effect. There is some cause for hope.)
There is also undoubtedly a culture of denial when it comes to matters of race and racism, which students speak of both in class and privately and which I have experienced when I’ve tried to draw attention to them. And more than one student from a northern comprehensive has told me about being discouraged by teachers from applying, and about feeling amazed to have received an offer only to be alienated by the stultifying class conformity of the affluent south-east once they got here.
It is simply not good enough for Oxford and Cambridge to say that they are welcoming of diversity and in effect blame certain demographics for not applying despite their outreach programmes. It is Oxbridge that must change more substantially to provide a better environment for a diverse student body. The two ancients must be held to account; homogeneity must fall.
But should they be the only ones held to account? In having a necessary conversation about elitism and exclusion, are we forgetting – or being encouraged to forget – to have a larger one about wider deprivation and systemic inequality? It is striking that some quarters only too happy to attack Oxbridge periodically for its failings, from rightwing tabloids to Tory ministers, are rarely interested in the roots of the inequality and lack of opportunity of which Oxbridge exclusion is a symptom but hardly the origin.
We should be careful that a headline-friendly focus on these two institutions alone does not become an easy way to avoid even more painful and challenging questions. It seems somewhat selective and inadequate to focus on what David Lammy rightly calls “social apartheid” at Oxbridge without discussing the widespread and worsening economic apartheid in this country.
We know that access to university education in general is sharply determined by school achievement that, in turn, is shaped by parental income and education levels. In an economically stratified society, it is inevitable that most young people from economically deprived backgrounds have a substantially lower chance of achieving the kind of marks that enable access to higher education.
Hence it is incoherent to have a discussion about access to higher education without having one simultaneously about economic disadvantage, which, in some cases, including British Caribbean and Bangladeshi communities, has an added ethnic minority dimension to it. In a context of worsening economic fault lines, there’s a whiff of something convenient about only attacking the admissions failings of top universities.
The other obvious missing dimension to this discussion is the existence and encouragement for independent schools. It’s somewhat contradictory to encourage a market culture where money can buy a deluxe education and then feel shocked when the well-off get their money’s worth by easily meeting the requirements for offers from high-status institutions. It’s worth saying that as long as independent schools, hardly bastions of ethnic diversity, exist, there will remain a fundamental apartheid between two kinds of students.
Oxbridge, or even the Russell Group of universities more broadly, can only do so much to mitigate this state of affairs, which lifting the tuition fee cap will only worsen. Lammy notes that more offers are made to Eton than to students on free school meals.
But why not also question the very existence of Eton and the lamentable state of an economic order that necessitates free school meals for many? Add to this the parlous condition of state education with its chronic underfunding, inflated classroom sizes, an undermining testing and target culture and difficulties in recruiting and retaining good teachers.
The same politicians who rightly point to Oxbridge’s demographic narrowness are rarely willing to grasp the nettle of a two-tier educational structure in which some are destined to do much better than others. Who, for instance, would be willing to call for the abolition of private schooling, subject as such a suggestion would be to shrill denunciations about how individual choice, personal aspiration and the workings of the market are being interfered with?
There are other tough discussions that could be had if the aim truly is to address and undo inequalities in university demographics. Would politicians and institutions be willing, for instance, to impose representational quotas for both ethnic minorities and state-educated students that reflect the national pie-chart?
Currently, the Office for Fair Access (Offa) makes some toothless demands around “widening participation”, a rather feeble phrase, which are not accompanied by penalties for failure. Lammy, whose suggestion that admissions be centralised has some merit, not least towards undoing the unhelpful internal collegiate caste system at Oxbridge, has also drawn a comparison between Oxbridge’s abysmal intake of black students and Harvard’s healthy numbers.
Would the political and intellectual classes be willing to have a discussion about something like “affirmative action” in the US, a process of “positive discrimination” by which underrepresented ethnic minorities and disadvantaged groups are given special consideration? We must hope so. For without a wide-ranging discussion aimed at radical measures, all the huffing and puffing about Oxbridge is destined to remain an annual ritual, each controversial headline simply making way for the same unsurprising headlines the following year.
Friday, 16 June 2017
With Grenfell Tower, we’ve seen what ‘ripping up red tape’ really looks like
For years successive governments have built what they call a bonfire of regulations. They have argued that “red tape” impedes our freedom and damages productivity. Britain, they have assured us, would be a better place with fewer forms to fill in, fewer inspections and less enforcement.
But what they call red tape often consists of essential public protections that defend our lives, our futures and the rest of the living world. The freedom they celebrate is highly selective: in many cases it means the freedom of the rich to exploit the poor, of corporations to exploit their workers, landlords to exploit their tenants and industry of all kinds to use the planet as its dustbin. As RH Tawney remarked, “Freedom for the pike is death for the minnows.”
It will be a long time before we know exactly what caused the horrific fire in the Grenfell Tower, and why it was able to rage so freely, with such devastating loss of life. But it seems at this stage likely that the rapidity with which the fire spread was either caused or exacerbated by the cladding with which the tower was refurbished.
There have been plenty of warnings that cladding can present a severe fire risk. To give just one example, in 1999 the House of Commons select committee on environment, transport and rural affairs published a report entitled Potential Risk of Fire Spread in Buildings Via External Cladding Systems.
But both Conservative and New Labour governments have been highly reluctant to introduce new public protections, even when the need is pressing. They have been highly amenable to tearing down existing protections at the behest of trade associations and corporate lobbyists. Deregulation of this kind is a central theme of the neoliberal ideology to which both the Conservatives and Labour under Tony Blair succumbed.
In 2014, the then housing minister (who is now the immigration minister), Brandon Lewis, rejected calls to force construction companies to fit sprinklers in the homes they built on the following grounds:
“In our commitment to be the first Government to reduce regulation, we have introduced the one in, two out rule for regulation … Under that rule, when the Government introduce a regulation, we will identify two existing ones to be removed. The Department for Communities and Local Government has gone further and removed an even higher proportion of regulations. In that context, Members will understand why we want to exhaust all non-regulatory options before we introduce any new regulations.”
In other words, though he accepted that sprinklers “are an effective way of controlling fires and of protecting lives and property”, to oblige builders to introduce them would conflict with the government’s deregulatory agenda. Instead, it would be left to the owners of buildings to decide how best to address the fire risk: “Those with responsibility for ensuring fire safety in their businesses, in their homes or as landlords, should and must make informed decisions on how best to manage the risks in their own properties,” Lewis said.
This calls to mind the Financial Times journalist Willem Buiter’s famous remark that “self-regulation stands in relation to regulation the way self-importance stands in relation to importance”. Case after case, across all sectors, demonstrates that self-regulation is no substitute for consistent rules laid down, monitored and enforced by government.
Crucial public protections have long been derided in the billionaire press as “elf ’n’ safety gone mad”. It’s not hard to see how ruthless businesses can cut costs by cutting corners, and how this gives them an advantage over their more scrupulous competitors.
The “pollution paradox” (those corporations whose practices are most offensive to voters have to spend the most money on politics, with the result that their demands come to dominate political life) ensures that our protections are progressively dismantled by governments courting big donors.
Conservative MPs see Brexit as an excellent opportunity to strip back regulations. The speed with which the “great repeal bill” will have to pass through parliament (assuming that any of Theresa May’s programme can now be implemented) provides unprecedented scope to destroy the protections guaranteed by European regulations. The bill will rely heavily on statutory instruments, which permit far less parliamentary scrutiny than primary legislation. Unnoticed and undebated, crucial elements of public health and safety, workers’ rights and environmental protection could be made to disappear.
Too many times we have seen what the bonfire of regulations, which might sound like common sense when issuing from the mouths of ministers, looks like in the real world. The public protections that governments describe as red tape are what make the difference between a good society and barbarism. It is time to bring the disastrous deregulatory agenda to an end, and put public safety and other basic decencies ahead of corner-cutting and greed.
Saturday, 15 April 2017
Telling children 'hard work gets you to the top' is simply a lie
I know about social mobility: I went to underperforming state schools, and am now a barrister. Could somebody take the same route today? It’s highly unlikely
It is a common promise made to the next generation. “If you work hard, and do the right thing, you will be able to get on in life.” I believe that it is a promise that we have no capacity to fulfil. And that’s because its underlying assumptions must be revisited.
Imagine a life living in quads. You attend a highly prestigious school in which you dash from one quad to the next for your classes. You then continue on to yet another prestigious institution for your tertiary education, say Oxford or Cambridge University, and yet more quads with manicured lawns. Then you end up in the oasis of Middle Temple working as a barrister: more manicured lawns and, yes, you guessed it, more quads. You have clearly led a very square and straight life. Effortlessly gliding from one world to the next with clear continuity, familiarity and ease.
Now contrast the above oasis with the overcrowded and under-performing schools of inner cities, going home to a bedroom which you share with many other siblings. A home you are likely to vacate when the council can’t house you there anymore. Perhaps a single-parent household where you have caring duties at a young age, or a household where no one works. A difficult neighbourhood where the poverty of ambition is palpable, stable families a rarity, and role models very scarce.
The former trajectory, in some or all its forms, is much more likely to lend itself to a more successful life in Britain. The latter means you may have the grades and talent, despite the odds, but you’re still lacking the crucial ingredients essential to succeeding. I don’t have to imagine much of this. I have experienced both of these extremes in my short lifetime.
My mother gave birth to 12 children. I arrived in London at the age of nine, speaking practically no English. I attended some of the worst performing schools in inner-city London and was raised exclusively on state benefits. Many years later I was lucky enough to attend Oxford on a full scholarship for my postgraduate degree. Now as a barrister I am a lifetime member of The Honourable Society of Lincoln’s Inn.
Is my route possible for anyone in the next generation with whom I share a similar background? I believe not. And this is not because they are any less able or less dedicated to succeed.
What I have learned in this short period of time is that the pervasive narrative of “if you work hard you will get on” is a complete myth. It’s not true, and we need to stop saying it. “Working hard, and doing the right thing” barely gets you to the starting line; moreover, it means something completely different depending on the context in which you apply it. So much more is required.
I have come to understand that the systems that underpin the top professions in Britain are set up to serve only a certain section of society: they’re readily identifiable by privileged backgrounds, particular schools and accents. To some this may seem obvious, so writing it may be superfluous. But it wasn’t obvious to me growing up, and it isn’t obvious to many others. The unwritten rules are rarely shared and “diversity” and “open recruitment” have tried but made little if any difference.
Those inside the system then naturally recruit in their own image. This then entrenches the lack of any potential for upward mobility and means that the vast majority are excluded.
As a form of short-term distraction, we are obsessed with elevating token success stories that distort the overall picture: the Somali boy who got a place at Eton, or the girl from the East End who is now going to MIT. These stories may seem inspiring at first blush, but they skew the complex picture that exists in deprived communities. They perpetuate the simple notion that all that’s required is working hard, and that everything else afterwards falls neatly into place. This ritual we constantly engage in is therefore as much about setting up false hopes for other children as it is about privileged, middle-class-led institutions making themselves feel good.
The reality is that there are many like them trying hard to do better, but who may lack the environment to fully realise their potential. Are they worth less? When they are told to “dream big” and it will happen, who will tell them that their failure had nothing to do with a lack of vision? Real success, especially from their starting point, often boils down to a complex combination of circumstances: luck, sustained stability, the right teachers at the right time, and even not experiencing moments of grief at crucial, destabilising junctures.
Improving educational attainment is critical, and so much progress has been made over the years to improve this. But this is not enough. Employers must see hiring youngsters from poorer backgrounds as good for business as well as for a fairer society. They must be assisted with a real chance to succeed, in a non-judgmental context and inclusive environment. They must do more to focus on potential rather than polish. More leadership and more risk-taking are required on this front.
Perversely, class and accents remain an overwhelmingly important way of judging intelligence. In France or Germany, for example, your accent rarely matters. Your vocabulary and conjugation will give much more away, but never your accent, apart from regional perhaps. I don’t see this mindset shifting, so my advice to youngsters has remained: you need to adapt yourself. You need to find the right way to speak to different people, at different times in different contexts. This is not compromising who you are, but rather adapting to the relevant surroundings.
We need to double down on improving the environments, both at home and at school, that continuously constrain potential. If the adage that hard work truly matters is to ring true, then we must do more – at all levels of society – to make it a reality.
Wednesday, 4 January 2017
Supreme Court brings Indian cricket into the 21st century
The world’s most successful secret society has been given a lesson in transparency and that is cause for celebration.
No tears need be wasted on the panjandrums who have been running the Board of Control for Cricket in India and its State associations like personal fiefdoms.
The Supreme Court finally reeled in the long rope it had given the BCCI, and so tripped up its senior officials. If there was contrition among the officials, it remained unexpressed. Yesterday’s powerhouses will be tomorrow’s forgotten men, their frown and wrinkled lip and sneer of cold command erased forever.
Inevitably, some good men will be thrown out with the bad, and there will be much churning as the old order makes way for the new. The saner elements of the board will wonder if it had to come to this, when, with greater maturity and common sense, the BCCI might have emerged with some dignity.
For the BCCI brought about its own downfall, aided by nothing more than its hubris and cavalier disregard for the laws of the land. You cannot ignore a Supreme Court judgement, as the BCCI did, and hope that nothing will change. It wasn’t just arrogance, it was foolishness of the highest order.
Would past presidents like Chinnaswamy and Sriraman, Gaekwad and Bindra, Dungarpur and Dalmiya have allowed things to come to this pass? It is convenient to believe they wouldn’t. But there is false memory at play here, a harking back to a golden era that never existed. Ghulam Ahmed, former off spinner and board vice-president, put it succinctly, “There are no values in the board.”
The Anurag Thakurs and Ajay Shirkes are paying the price for the culture that men like those mentioned had brought into the BCCI. These men ran the best sports body in the country, and somehow believed that they had a divine right to do so. Players kowtowed to them, politicians and businessmen chased them, and they clung on to power with a touching desperation.
The current dispensation extended that culture and refined it. They, like their predecessors, failed to understand the connection between actions and consequences.
At any time in the BCCI’s eight-decade history, the Supreme Court could have stepped in and ruled as it did now. Accountability and transparency were never in the BCCI’s handbook for officials, but public scrutiny was not as intense as it is now, and in some cases the good that an official did outweighed the bad, and all was forgiven.
Brinkmanship — a tactic much favoured by the BCCI to bring other cricket boards and indeed the International Cricket Council to its knees — is not a strategy guaranteed to impress the Supreme Court. That the highest court gave the BCCI more than six months to comply with its order when it could have acted even as deadlines were ignored is a testimony to its benevolence.
But how did a three-time Member of Parliament, which is what Anurag Thakur is, and sundry other luminaries, misjudge the seriousness of the situation? Was this a proxy war fought on behalf of his political masters by Thakur, or was the board, recognising the inevitable, preparing for a scorched earth response? The first will have to remain in the realm of speculation till a lead actor in the drama spills the beans. We shall soon know about the second.
The BCCI’s death wish has been one of the features of the whole saga. Thakur came in as the bright, young face of the board. There was an energy about him which makes his fall a disappointment. At 42 he was the man who replaced the old guard. Yet, within weeks, the cozy club he had tried to break up when N. Srinivasan was in charge quickly reshaped itself into a new cozy club.
His fall is a cautionary tale for those who set out to change the system but are absorbed by it instead. The Supreme Court’s ruling will also affect other sports bodies that, like the BCCI, have been resisting change. And that is good news for Indian sport.
The domestic season has been unaffected by the BCCI’s problems. This has been the case traditionally, and is one of the true blessings of Indian cricket. There are enough dedicated officials to ensure that the show goes on.
A generational change has been forced upon the BCCI, which is otherwise happy to continue with sons and nephews (never daughters and nieces) and other relatives keeping everything in the family.
Now State associations will have to change their registrations where necessary, holding general body meetings in order to advance this. Legal procedures need to be followed. There is a temptation to believe that cricketers make the best administrators. This is a common fallacy. There are cricketers who have made excellent administrators, but being able to play the square cut is no guarantee of managerial skills. The names of corrupt cricketer-officials are well known.
There is a long road ahead, mostly uncharted. But a start has been made. The new system may not be perfect, but it is better than the old one. Accountability ensures that.
Tuesday, 3 March 2015
We’re desperate to believe in something. But bringing God into economics is risky
Meanwhile, leading Conservative thinkers Tim Montgomerie and Stephan Shakespeare have launched their “good right” initiative, which hopes to succeed where David Cameron has so obviously failed: to detoxify the Conservative brand. Making the Tories electable again is certainly the aim, but at its core is an even more ambitious endeavour: to re-establish the moral credibility of the free market. To this chorus of extra-parliamentary voices we might also add “blue Labour” Maurice Glasman and “red Tory” Phillip Blond and, for that matter, Russell Brand. Even if their ideas are unlikely to feature in forthcoming party manifestos, a movement is clearly afoot. This disparate group may differ on the remedy but share a diagnosis: the neoliberal revolution is politically and morally defunct. One way or another, they are all dancing on Thatcher’s grave.
But to those seeking a new moral vision for Britain, Thatcherism itself offers a cautionary tale. It was, much like now, a response to widespread disillusionment and a redundant political consensus. Like the “good right”, Conservatives in the 70s also sought to sever the association between collectivism and virtue and reinstate the moral integrity of the “invisible hand”. Margaret Thatcher would eventually cast herself as the shepherd leading the British people out of the dark days of decline towards the path of economic and social enlightenment. Ultimately, however, it was a story of false idols and unintended consequences, one where the mix of God, economics and single-minded vision proved to be toxic. The paradox of Thatcherism is that, like all political ideologies, there was a complete discrepancy between its aims and outcomes.
“Economics is the method; the object is to change the soul,” Margaret Thatcher declared in 1981, revealing the way in which Thatcherism for her was always about transforming values rather than simply GDP. A strong religious basis to her outlook stemmed from her father – the greengrocer, councillor and Wesleyan lay preacher, Alf Roberts.
If we were sourcing the origins of Thatcherism, we wouldn’t find it in the pages of Hayek’s Road to Serfdom or Milton Friedman’s monetarist theory but in Roberts’ sermon notes, now housed in Thatcher’s personal archive at Churchill College, Cambridge. Contained in them is the theological basis of Thatcherism: an individualistic interpretation of the Bible, a nod to the spiritual dangers of avarice, the Protestant work ethic, praise of the godly virtues of thrift and self-reliance and, finally, a divine justification for individual liberty and the free market. In short, Thatcherism always owed more to Methodism than to monetarism.
Thatcher herself had been a preacher before she entered politics, and even though she transferred this missionary energy from pulpit to podium, her religious values remained crucial. On becoming Conservative leader, she saw it as her chief mission to discredit the assumed moral superiority of socialism and reconnect the broken link between Protestant and capitalist values in Britain. Preaching from the pulpit on several occasions – most famously to the Church of Scotland’s General Assembly in 1988 – Thatcher unashamedly asserted the Biblical case for the sovereignty of individual liberty and the ‘invisible hand’. Thatcher’s pledge, of course, was that greater wealth would not encourage selfishness but neighbourliness. With more money in our pocket and less dependency on the state, we would be free to exercise our moral virtue and perform our duty as Good Samaritans.
We would not walk by on the other side, nor would we need state-imposed traffic lights to guide us there.
In the end, though, even she was prepared to admit she had failed in her crusade. When asked by Frank Field what her greatest regret in office was, she replied: “I cut taxes and I thought we would get a giving society, and we haven’t.” She was right. A survey conducted by the Charities Aid Foundation in 1989 revealed that those who gave the most to causes were not from the prosperous south but were disproportionately located in those areas that benefited least from the Thatcher boom.
Thatcher’s naivety was perhaps her greatest flaw: her understanding of capitalism was provincial rather than global; the image of market transaction in her mind was Alf Roberts behind the counter of his grocery shop, not the yuppie on the trading floor. It is little wonder, then, that she could not understand the world she had created, one in which the nation’s homes and household budgets were entwined with a global financial services sector that made up an ever-growing share of Britain’s GDP, was largely internationally owned, and lay in the hands of speculators concerned with short-term gain and distant from the deals and lives they were gambling on. In private, Thatcher used to rage against bankers and their bonuses. Why, she would cry, did they not follow the example of the army, which in her view was the model demonstration of responsibility to one’s fellow man?
As someone reared in a home where profligacy was a vice and thrift a virtue, Thatcher also could not fathom why so many Britons struggled with debt. Yet, paradoxically, it was her government that did most to encourage it. What might be termed the “democratisation of debt”, be it in the form of credit and store cards, personal loans and, of course, mortgages, fundamentally reordered the nation’s psyche and our attitudes towards money and the state. In short, we transferred our dependency from the government to the money-lenders. The notion of deferred gratification, or thrift (saving for something before consuming it), became an alien concept in Britain’s “grab now, pay later” society. Total consumer credit more than doubled, while the number of credit cards nearly tripled in the 1980s, and both would spiral to unimaginable levels over the next two decades. This culture of credit also trickled down the social scale: as the government squeezed the benefits system, low-income households turned to credit companies that asked few questions. In 1980, 22% of households were using credit; by 1989 that figure had trebled to 69%, with an estimated 50% of those loans going on essentials. As the New Economics Foundation’s 2002 report into debt recognised, this led to the absurd situation whereby “what the taxpayer was providing in terms of benefits, the lender was often taking away – with interest”. It is doubtful that even Thatcher considered Britain’s record personal debt part of her plan of “setting the people free”.
Thatcherism laid the foundations for a culture in which individualism and self-reliance could thrive, but ultimately it created a culture in which only selfishness and excess were rewarded. Thatcher liked to quote John Wesley’s mantra, “Earn all you can, save all you can and give all you can,” and yet it was only ever the first instruction that was sufficiently encouraged. While Cameron and Osborne have spoken at length about paying off the ‘nation’s credit card’, they have consciously avoided entreating individuals to pay off their own. Tellingly, it is now a vote-winner to talk of governmental thrift but political suicide to talk of personal thrift. That is the true legacy of Thatcherite economics.
When Thatcher uttered those now immortal words that there was “no such thing as society”, it was not a negative or flippant statement but a naive rallying cry for individual moral responsibility. Perhaps the flaw in her thinking was not that she did not believe in society but that she had too much faith in man.
Thatcher seemed to have forgotten the key doctrine in both Conservative philosophy and the Bible: the Fall. Thatcherism was a challenge to individual moral virtue, yet in Thatcher’s Eden, when given the choice, we – of course – ate the fruit. Where critics tend to go wrong in their assessment of Thatcher is that they do not consider that there was any moral, rather than merely economic, thinking behind it; where Thatcher’s admirers go wrong is that they do not admit that there was a fundamental discrepancy between her aims and outcomes.
It is, of course, wrong to heap all the blame on Thatcher. This culture was encouraged and this behaviour continued unabated under New Labour. Much like a gangster’s wife who enjoys the lifestyle but does not question how her husband gets his money, Blair and Brown were content to pocket a significant share of the profits to fund their schools and hospitals.
By 2008 the world seemed on the precipice of something fundamental, but one of the remarkable features of the last seven years is how little has changed. Perhaps Thatcher’s great mistake was that, as Alfred Sherman said, “she saw life in primary colours”.
So there is credibility and value in dreaming up an alternative where Thatcher insisted that there was none. Given the contemporary disillusionment with capitalism, voters are still in desperate need of something to believe in. What the neoliberal experiment of the last 30 years teaches us is not that religion and politics do not mix, but that the politics of certainty is where danger lies.