Showing posts with label right.

Monday 18 January 2021

Understanding Populism

Nadeem F Paracha in Dawn


In a March 7, 2010 essay for the New York Times, the American linguist and author Ben Zimmer writes, “When politicians fret about the public perception of a decision more than the substance of the decision itself, we’re living in a world of optics.”

On the other hand, according to Deborah Johnson in the June 2017 issue of Attorney at Law, a politician may have the best interests of his or her constituents in mind, but fail to come across well because the optics are bad, even though the substance is good. Johnson writes that things have increasingly slid from substance to optics.

Optics have always played a prominent role in politics. Yet it is also true that their use has grown manifold with the proliferation of electronic and social media and, especially, of ‘populism.’ Populists often travel with personal photographers so that they can be photographed and circulate images that resonate with their core audience.

Pakistan’s PM Imran Khan relies heavily on such optics. He is also considered to be a populist. But then why did he so stubbornly refuse to meet the mourning families of the 11 Hazara Shia miners who were brutally murdered in Quetta? Instead, the optics space in this case was filled by opposition leaders, Maryam Nawaz and Bilawal Bhutto.

Nevertheless, this piece is not about why an optics-obsessed PM such as Khan didn’t immediately occupy the space that was eventually filled by his opponents. It is more about exploring whether Khan really is a populist. For this, we first have to figure out what populism is.

According to the American sociologist Bart Bonikowski, in the 2019 anthology When Democracy Trumps Populism, populism presents itself as ‘anti-establishment’ and ‘anti-elite.’ It can emerge from the right as well as the left but, during its most recent rise in the last decade, it has mostly come up from the right.

According to Bonikowski, populism of the right has stark ethnic or religious nationalist tendencies. It constructs and popularises a certain paradigm of ‘authentic’ racial or religious nationalism, and claims that those who lack the required features to fit this paradigm are outsiders and, therefore, a threat to the ‘national body.’ It also lashes out at established political forces and state institutions for being ‘elitist,’ ‘corrupt’ and facilitators of a pluralism that usurps the interests of the authentic members of the national body in a bid to undermine the ‘silent majority.’ Populism aspires to represent this silent majority, claiming to empower it.

Simply put, all this, in varying degrees, is at the core of the populist regimes that, in the last decade or so, began to take shape in various countries — especially in the US, UK, India, Brazil, Turkey, the Philippines, Hungary, Poland, Russia, the Czech Republic and Pakistan. Yet, if anti-establishment action and rhetoric are a prominent feature of populism, then what about populist regimes that are not only close to certain powerful state institutions, but were or are actually propped up by them? Opposition parties in Pakistan insist that Imran Khan’s party is propped up by the country’s military establishment, which keeps it afloat despite its failures on many fronts. The same is the case with the populist regime in Brazil.

Does this mean such regimes are not really populist? No. According to the economist Pranab Bardhan (University of California, Berkeley), even though populists share many similarities, populism’s shape can shift from region to region. Bardhan writes that characteristics of populism are qualitatively different in developed countries from those in developing countries. For example, whereas globalisation is seen in a negative light by populists in Europe and the US, a November 2016 survey published in The Economist shows that the people of 18 developing countries saw it positively, believing it gave their countries’ economies the opportunity to assert themselves.

Secondly, according to Bardhan, survey evidence suggests that much of the support for populist politics in developed countries is coming from less-educated, blue-collar workers, and from the rural backwaters. Populists in developing countries, by contrast, are deriving support mainly from the rising middle classes and the aspirational youth in urban areas. To Bardhan, in India, Pakistan, Turkey, Poland and Russia, symbols of ‘illiberal religious resurgence’ have been used by populist leaders to energise the upwardly-mobile or arriviste social groups.

He also writes that, in developed countries, populism is at loggerheads with the centralising state and political institutions, because it sees them as elitist, detached and a threat to local communities. But in developing countries, the populists have tried to centralise power and weaken local communities. To populists in developing countries, the main villains are not the so-called cold and detached state institutions, but ‘corrupt’ civilian parties. Ironically, while populism in the US is against welfare programmes, such programmes remain important to populists in developing countries.

Keeping this in mind, one can conclude that PM Khan is a populist, quite like his populist contemporaries in other developing countries. Despite his nationalist rhetoric and his condemnatory understanding of colonialism, he welcomes globalisation when it promises foreign investment in the country. His main base of support remains the aspirational and upwardly-mobile urban middle-class segments. He often uses religious symbolism and exhibitions of piety to energise this segment, providing religious context to what are actually Western ideas of state, governance, economics and nationalism. For example, the Scandinavian idea of the welfare state that he admires is recast as Riyasat-i-Madina (State of Madina).

Unlike populism in Europe and the US, populism in developing countries embraces the ‘establishment’ and, instead, turns its guns towards established political parties which it describes as being ‘corrupt.’ Khan is no different. He admires the Chinese system of central planning and economy and dreams of a centralised system that would seamlessly merge the military, the bureaucracy and his government into a single ruling whole. His urban middle-class supporters often applaud this ‘vision.’

Tuesday 15 December 2020

Do rich countries undermine democracies in developing countries? Economic History in Small Doses 3

Girish Menon*

The IMF-led consortia (World Bank, WTO…) have represented the interests of the rich countries. Historically, they have advocated free market policies in developing countries, and whenever such weak economies have got into economic trouble, the consortia have insisted on harsh policy changes in return for their help. By such acts, are the rich countries really helping the growth of democracy in developing countries?


Free market policies have brought more areas of our life under the ‘one rupee one vote’ rule of the market. Let us examine some of these policies:

The argument is framed thus: “politics opens the door for perversion of market rationality; inefficient firms or farmers, by lobbying their politicians for subsidies, will impose costs on the rest of society, which has to buy expensive domestic products.” The current farmers’ agitation in India is being tarred with this brush.

The free marketers’ solution is to ‘depoliticize’ the economy. They argue that the very scope of government activity should be reduced to a minimal state through privatisation and liberalisation. This is necessary, they argue, because politicians are less competent and more corrupt. Hence, it is important for developing countries to sign up to international agreements like those of the WTO, or to bilateral and free trade agreements like RCEP or TPP, so that domestic politicians lose their ability to take democratic decisions.

The main problem with this argument for depoliticization is the assumption that we know definitively where politics should end and where economics should begin. This is a fundamental fallacy.

Markets are political constructs; the recognition of private ownership of property, and of the other rights that underpin markets, has political origins. This becomes evident when viewed historically. For example, certain tribes lived in the woods for centuries until the government sold the land to a private landowner, at which point the tribespeople became trespassers on the same land. Likewise, the re-designation of slaves from capital to labour was a political act. In other words, the political origins of economic rights can be seen in the fact that many rights that seem natural today were hotly contested in the past.

Thus, when free marketers propose de-politicizing the economy, they are demanding that everybody else accept their demarcation between economics and politics. I agree with Ha-Joon Chang when he argues that ‘depoliticization of policy decisions in a democratic polity means – let’s not mince our words – weakening democracy.’

In other words, democracy is acceptable to free-marketers only if it does not contradict their free market doctrine. They want democracy only if it is largely powerless. Deep down they believe that giving political power to those who do not have a stake in the free market system will result in an ‘irrational’ modification of property and other economic rights. And the free-marketers spread their gospel by subtly discrediting democratic politics without openly criticising democracy.

The consequences have been damaging in developing countries, where the free-marketers have been able to push through anti-democratic actions well beyond what would be acceptable in rich countries.


* Adapted and simplified by the author from Ha-Joon Chang’s Bad Samaritans: The Guilty Secrets of Rich Nations and the Threat to Global Prosperity

Monday 8 June 2020

We often accuse the right of distorting science. But the left changed the coronavirus narrative overnight

Racism is a health crisis. But poverty is too – yet progressives blithely accepted the costs of throwing millions of people like George Floyd out of work, writes Thomas Chatterton Williams in The Guardian


 
‘Less than two weeks ago, the enlightened position was to exercise extreme caution. Many of us went further, taking to social media to shame others for insufficient social distancing.’ Photograph: Devon Ravine/AP


When I reflect back on the extraordinary year of 2020 – from, I hope, some safer, saner vantage – one of the two defining images in my mind will be the surreal figure of the Grim Reaper stalking the blazing Florida shoreline, scythe in hand, warning the sunbathing masses of imminent death and granting interviews to reporters. The other will be a prostrate George Floyd, whose excruciating Memorial Day execution sparked a global protest movement against racism and police violence.

Less than two weeks after Floyd’s killing, the American death toll from the novel coronavirus has surpassed 100,000. Rates of infection, domestically and worldwide, are rising. But one of the few things it seems possible to say without qualification is that the country has indeed reopened. For 13 days straight, in cities across the nation, tens of thousands of men and women have massed in tight-knit proximity, with and without personal protective equipment, often clashing with armed forces, chanting, singing and inevitably increasing the chances of the spread of contagion.

Scenes of outright pandemonium unfold daily. Anyone claiming to have a precise understanding of what is happening, and what the likely risks and consequences may be, should be regarded with the utmost skepticism. We are all living in a techno-dystopian fantasy, the internet-connected portals we rely on rendering the world in all its granular detail and absurdity like Borges’s Aleph. Yet we know very little about what it is we are watching.

I open my laptop and glimpse a rider on horseback galloping through the Chicago streets like Ras the Destroyer in Ralph Ellison’s Invisible Man; I scroll down further and find myself in Los Angeles, as the professional basketball star JR Smith pummels a scrawny anarchist who smashed his car window. I keep going and encounter a mixed group of business owners in Van Nuys risking their lives to defend their businesses from rampaging looters; the black community members trying to help them are swiftly rounded up by police officers who mistake them for the criminals. In Buffalo, a 75-year-old white man approaches a police phalanx and is immediately thrown to the pavement; blood spills from his ear as the police continue to march over him. Looming behind all of this chaos is a reality-TV president giddily tweeting exhortations to mass murder, only venturing out of his bunker to teargas peaceful protesters and stage propaganda pictures.


But this virus – for which we may never even find a vaccine – knows and respects none of this socio-political context. Its killing trajectory isn’t rational, emotional, or ethical – only mathematical. And just as two plus two is four, when a flood comes, low-lying areas get hit the hardest. Relatively poor, densely clustered populations with underlying conditions suffer disproportionately in any environment in which Covid-19 flourishes. Since the virus made landfall in the US, it has killed at least 20,000 black Americans.

After two and a half months of death, confinement and unemployment figures dwarfing even those of the Great Depression, we have now entered a stage of competing urgencies in which there are zero perfect options. Police brutality is a different, if metaphorical, epidemic in an America slouching toward authoritarianism. Catalyzed by the spectacle of Floyd’s reprehensible death, the emergency in Minneapolis clearly passes my own and many people’s threshold for justifying the risk of contagion.

But poverty is also a public health crisis. George Floyd wasn’t merely killed for being black – he was also killed for being poor. He died over a counterfeit banknote. Poverty destroys Americans every day by means of confrontations with the law, disease, pollution, violence and despair. Yet even as the coronavirus lockdown threw 40 million Americans out of work – including Floyd himself – many progressives accepted this calamity, sometimes with stunning blitheness, as the necessary cost of guarding against Covid-19.

The new, “correct” narrative about public health – that one kind of crisis has superseded the other – grows shakier as it fans out from Minnesota, across America, and as far as London, Amsterdam and Paris – cities that have in recent days seen extraordinary manifestations of public solidarity against both American and local racism, with protesters in the many thousands flooding public spaces.

Consider France, where I live. The country has only just begun reopening after two solid months of one of the world’s severest national quarantines, and in the face of the world’s fifth-highest coronavirus body count. As recently as 11 May, it was mandatory here to carry a fully executed state-administered permission slip on one’s person in order to legally exercise or go shopping. The country has only just begun to flatten the curve of deaths – nearly 30,000 and counting – which have brought its economy to a standstill. Yet even here, in the time it takes to upload a black square to your Instagram profile, those of us who move in progressive circles now find ourselves under significant moral pressure to understand that social distancing is an issue of merely secondary importance.

This feels like gaslighting. Less than two weeks ago, the enlightened position in both Europe and America was to exercise nothing less than extreme caution. Many of us went much further, taking to social media to castigate others for insufficient social distancing or neglecting to wear masks or daring to believe they could maintain some semblance of a normal life during coronavirus. At the end of April, when the state of Georgia moved to end its lockdown, the Atlantic ran an article with the headline “Georgia’s Experiment in Human Sacrifice”. Two weeks ago we shamed people for being in the street; today we shame them for not being in the street.

As a result of lockdowns and quarantines, many millions of people around the world have lost their jobs, depleted their savings, missed funerals of loved ones, postponed cancer screenings and generally put their lives on hold for the indefinite future. They accepted these sacrifices as awful but necessary when confronted by an otherwise unstoppable virus. Was this or wasn’t this all an exercise in futility?

“The risks of congregating during a global pandemic shouldn’t keep people from protesting racism,” NPR suddenly tells us, citing a letter signed by dozens of American public health and disease experts. “White supremacy is a lethal public health issue that predates and contributes to Covid-19,” the letter said. One epidemiologist has gone even further, arguing that the public health risks of not protesting for an end to systemic racism “greatly exceed the harms of the virus”.

The climate-change-denying right is often ridiculed, correctly, for politicizing science. Yet the way the public health narrative around coronavirus has reversed itself overnight seems an awful lot like … politicizing science.

What are we to make of such whiplash-inducing messaging? Merely pointing out the inconsistency in such a polarized landscape feels like an act of heresy. But “‘Your gatherings are a threat, mine aren’t,’ is fundamentally illogical, no matter who says it or for what reason,” as the author of The Death of Expertise, Tom Nichols, put it. “We’ve been told for months to stay as isolated as humanly possible,” Suzy Khimm, an NBC reporter covering Covid-19, noted, but “some of the same public officials and epidemiologists are [now] saying it’s OK to go to mass gatherings – but only certain ones.”

Public health experts – as well as many mainstream commentators, plenty of whom in the beginning of the pandemic were already incoherent about the importance of face masks and stay-at-home orders – have hemorrhaged credibility and authority. This is not merely a short-term problem; it will constitute a crisis of trust going forward, when it may be all the more urgent to convince skeptical masses to submit to an unproven vaccine or to another round of crushing stay-at-home orders. Will anyone still listen?

Seventy years ago Camus showed us that the human condition itself amounts to a plague-like emergency – we are only ever managing our losses, striving for dignity in the process. Risk and safety are relative notions and never strictly objective. However, there is one inconvenient truth that cannot be disputed: more black Americans have been killed by three months of coronavirus than the number who have been killed by cops and vigilantes since the turn of the millennium. We may or may not be willing to accept that brutal calculus, but we are obligated, at the very least, to be honest.

Wednesday 20 May 2020

Returning to work in the coronavirus crisis: what are your rights?

Hilary Osborne in The Guardian 


 
Some people may be concerned about returning to work during the coronavirus crisis. Photograph: Matthew Horwood/Getty Images


As the lockdown restrictions begin to be eased across the UK, more workers are being asked to return to the workplace.

The government has said that employees should only be asked to go back if they cannot do their job from home, so if you can, your employer should not be asking you to travel in to work.

If you do need to go to your workplace, your employer is obliged to make sure you will be safe there. Employment lawyer Matt Gingell says: “Employers have a general duty to ensure, as far as reasonably practicable, the health, safety and welfare of all of their employees.”

Here’s a guide to your rights if your employer wants you back in the workplace.

How much notice should I be given that I have to return?

“If employees are unable to work from home, employers can ask employees to return to work and, technically, no notice is required,” says Gingell.

Solicitor and consumer law expert Gary Rycroft says there is no notice period written into law “but giving at least 48 hours’ notice should allow either side to have discussions and air any concerns or even official ‘grievances’”.

The advisory group Acas says employers need to check if there are any arrangements in place with unions or similar about notice. It advises: “Employees and workers should be ready to return to work at short notice, but employers should be flexible where possible.”

So while your employer could ask you to return straight away, a good employer would understand if there were things you needed to put in place first, and would give you a chance to do so.

What if I was furloughed?

When you were furloughed, your employer should have outlined what would happen when it wanted you to go back to work, and this may have included a clause saying that you have to return as soon as you are asked.

“The termination of the furlough agreement and when an employee will be expected to return to work will depend on the provisions of the agreement,” says Gingell. Again, though, even if there is no notice period, a good employer should realise that you may need some time to prepare.

If you have been furloughed under the government’s job retention scheme, your employer can’t ask you to go in and do ad hoc days, or work part-time. They would need to take you off furlough and renegotiate your contract with you.

Can they ask me to go back in part-time?

Not, currently, if you have been furloughed and they are using the government scheme to pay you. It only allows companies to furlough people for all of their normal hours, and bans them from asking you to do any work while you are off.

But if your company has not claimed government money to cover your wages, it can ask you to resume work part-time. Make sure you understand the terms of the request – your employer cannot adjust your contract without your permission, so if it is asking you to change your hours you should get advice.

Can they ask me to take a pay cut?

“The law here is the same as it would be if an employer made the same request in the normal course of an employee’s employment. Reducing hours and/or pay are deemed to be such fundamental changes to an employee’s terms and conditions that the employee concerned should be consulted and then agree in writing,” says Rycroft.

He points out that for some employers “this may be the only economically viable option”, and the alternative, if people refuse, could be redundancies. To make more than 20 people redundant, an employer will need to carry out a collective consultation.

What if I am in a vulnerable group or live with someone who is?

No special rules have been put in place to protect people in these groups who are asked to go into work but some already exist – if you are disabled or pregnant, for example, your employer has extra obligations.

Rycroft says some employees may be able to argue that it will be discriminatory to force them to attend work outside the home. “It is all a question of degrees, in terms of how the employer can show that they have listened to legitimate concerns and made reasonable adjustments,” he says.

If you are pregnant, your employer is obliged to make sure you can do your job safely. This can mean allowing you to do your job from home, or giving you a new role that can be done remotely. If your employer refuses either of these options and you do not feel safe going into work, you should take advice. Employmentsolicitor.com says that you may be able to argue for a medical suspension on full pay, which would allow you to stay at home.

Living with someone who is vulnerable or especially at risk is not necessarily a reason an employee can refuse to return to work, says Rycroft. “However, you can, as an employee raise a grievance and ask to be listened to and hopefully a compromise may be agreed, such as unpaid leave or using up annual holiday. But if an employer can show that a workplace is safe, the employer may insist on an employee attending.”

What if I have childcare to worry about?

Legally, you can take time off to look after any dependants – these could be children, or older relatives. This time is typically unpaid. If you are currently furloughed and your employer does not have enough work for everyone to go back full-time, they may agree to leave you on furlough so you can continue to earn 80% of your normal pay.

What information should they give me in advance?

Rycroft says there is no law saying that employers should provide information before you return, but the government guidance to employers recommends that they do. He says this information – written or verbal – should cover how they are making your workplace safe in light of the pandemic. So you should be told what is happening to ensure social distancing and hygiene. “This will allow employees to understand how their health and safety at work is being addressed.”

Can I refuse to go back?

Yes, if you believe there is a real danger to going to work. “If an employee refuses to return to the workplace due to the employee reasonably believing imminent and serious danger and is then dismissed for that reason the employee could, depending on the circumstances, have a claim for unfair dismissal,” Gingell says.

“The requirement that the employee has to believe that there is imminent and serious danger, does limit the right.”

Otherwise, you cannot refuse. “If someone refuses to attend work without a valid reason, it could result in disciplinary action,” says Acas. But you may be able to make other arrangements with your employer – perhaps you can use holiday or take unpaid leave, or if you have concerns about something like travelling at peak time, they may be willing to accommodate different shifts. Your employer does not have to agree to this, but it is worth asking.

What if I am worried when I see my workplace?

Rycroft says that under section 100 of the Employment Rights Act 1996 employees may leave a place of work where there is an imminent health and safety danger. So if, for example, you return to find social distancing is impossible, you could argue that this is a reason to leave your workplace.

But in the first instance you should try to resolve the issue with your boss. Gingell says: “Employers ought to listen to the concerns of individuals and be sympathetic and understanding.”


If you do not get anywhere with this, you should take advice. If you are in a union, it should have a helpline you can call if there is no rep to speak to on site. Acas is another port of call, as is Citizens Advice.

“If the employer has breached the implied obligation to provide a safe working environment and/or trust and confidence an employee could, again, depending on the circumstances, resign swiftly as a result and claim constructive unfair dismissal,” says Gingell. But he says you should get advice before taking this action.

“Another option for employees to consider is contacting the Health and Safety Executive, which enforces health and safety legislation,” he says.

Sunday 20 May 2018

Why the 'Right to Believe' Is Not a Right to Believe Whatever You Want

Daniel De Nicola in The Wire.in


Do we have the right to believe whatever we want to believe? This supposed right is often claimed as the last resort of the wilfully ignorant, the person who is cornered by evidence and mounting opinion: ‘I believe climate change is a hoax whatever anyone else says, and I have a right to believe it!’ But is there such a right?

We do recognise the right to know certain things. I have a right to know the conditions of my employment, the physician’s diagnosis of my ailments, the grades I achieved at school, the name of my accuser and the nature of the charges, and so on. But belief is not knowledge.

Beliefs are factive: to believe is to take to be true. It would be absurd, as the analytic philosopher G.E. Moore observed in the 1940s, to say: ‘It is raining, but I don’t believe that it is raining.’ Beliefs aspire to truth – but they do not entail it. Beliefs can be false, unwarranted by evidence or reasoned consideration. They can also be morally repugnant. Among likely candidates: beliefs that are sexist, racist or homophobic; the belief that proper upbringing of a child requires ‘breaking the will’ and severe corporal punishment; the belief that the elderly should routinely be euthanised; the belief that ‘ethnic cleansing’ is a political solution, and so on. If we find these morally wrong, we condemn not only the potential acts that spring from such beliefs, but the content of the belief itself, the act of believing it, and thus the believer.

Such judgments can imply that believing is a voluntary act. But beliefs are often more like states of mind or attitudes than decisive actions. Some beliefs, such as personal values, are not deliberately chosen; they are ‘inherited’ from parents and ‘acquired’ from peers, acquired inadvertently, inculcated by institutions and authorities, or assumed from hearsay. For this reason, I think, it is not always the coming-to-hold-this-belief that is problematic; it is rather the sustaining of such beliefs, the refusal to disbelieve or discard them that can be voluntary and ethically wrong.

If the content of a belief is judged morally wrong, it is also thought to be false. The belief that one race is less than fully human is not only a morally repugnant, racist tenet; it is also thought to be a false claim – though not by the believer. The falsity of a belief is a necessary but not sufficient condition for a belief to be morally wrong; neither is the ugliness of the content sufficient for a belief to be morally wrong. Alas, there are indeed morally repugnant truths, but it is not the believing that makes them so. Their moral ugliness is embedded in the world, not in one’s belief about the world.

‘Who are you to tell me what to believe?’ replies the zealot. It is a misguided challenge: it implies that certifying one’s beliefs is a matter of someone’s authority. It ignores the role of reality. Believing has what philosophers call a ‘mind-to-world direction of fit’. Our beliefs are intended to reflect the real world – and it is on this point that beliefs can go haywire. There are irresponsible beliefs; more precisely, there are beliefs that are acquired and retained in an irresponsible way. One might disregard evidence; accept gossip, rumour, or testimony from dubious sources; ignore incoherence with one’s other beliefs; embrace wishful thinking; or display a predilection for conspiracy theories.

I do not mean to revert to the stern evidentialism of the 19th-century mathematical philosopher William K Clifford, who claimed: ‘It is wrong, always, everywhere, and for anyone, to believe anything upon insufficient evidence.’ Clifford was trying to prevent irresponsible ‘overbelief’, in which wishful thinking, blind faith or sentiment (rather than evidence) stimulate or justify belief. This is too restrictive. In any complex society, one has to rely on the testimony of reliable sources, expert judgment and the best available evidence. Moreover, as the psychologist William James responded in 1896, some of our most important beliefs about the world and the human prospect must be formed without the possibility of sufficient evidence. In such circumstances (which are sometimes defined narrowly, sometimes more broadly in James’s writings), one’s ‘will to believe’ entitles us to choose to believe the alternative that projects a better life.

In exploring the varieties of religious experience, James would remind us that the ‘right to believe’ can establish a climate of religious tolerance. Those religions that define themselves by required beliefs (creeds) have engaged in repression, torture and countless wars against non-believers that can cease only with recognition of a mutual ‘right to believe’. Yet, even in this context, extremely intolerant beliefs cannot be tolerated. Rights have limits and carry responsibilities.

Unfortunately, many people today seem to take great licence with the right to believe, flouting their responsibility. The wilful ignorance and false knowledge that are commonly defended by the assertion ‘I have a right to my belief’ do not meet James’s requirements. Consider those who believe that the lunar landings or the Sandy Hook school shooting were unreal, government-created dramas; that Barack Obama is Muslim; that the Earth is flat; or that climate change is a hoax. In such cases, the right to believe is proclaimed as a negative right; that is, its intent is to foreclose dialogue, to deflect all challenges; to enjoin others from interfering with one’s belief-commitment. The mind is closed, not open for learning. They might be ‘true believers’, but they are not believers in the truth.

Believing, like willing, seems fundamental to autonomy, the ultimate ground of one’s freedom. But, as Clifford also remarked: ‘No one man’s belief is in any case a private matter which concerns himself alone.’ Beliefs shape attitudes and motives, guide choices and actions. Believing and knowing are formed within an epistemic community, which also bears their effects. There is an ethic of believing, of acquiring, sustaining, and relinquishing beliefs – and that ethic both generates and limits our right to believe. If some beliefs are false, or morally repugnant, or irresponsible, some beliefs are also dangerous. And to those, we have no right.

Tuesday 1 May 2018

Should politicians be replaced by experts?

In the age of Trump and Brexit, some people say that democracy is fatally flawed and we should be ruled by ‘those who know best’. Here’s why that’s not very clever. David Runciman in The Guardian

Democracy is tired, vindictive, self-deceiving, paranoid, clumsy and frequently ineffectual. Much of the time it is living on past glories. This sorry state of affairs reflects what we have become. But current democracy is not who we are. It is just a system of government, which we built, and which we could replace. So why don’t we replace it with something better?

This line of argument has grown louder in recent years, as democratic politics has become more unpredictable and, to many, deeply alarming in its outcomes. First Brexit, then Donald Trump, plus the rise of populism and the spread of division, have started a tentative search for plausible alternatives. But the rival systems we see around us have a very limited appeal. The unlovely forms of 21st-century authoritarianism can at best provide only a partial, pragmatic alternative to democracy. The world’s strongmen still pander to public opinion, and in the case of competitive authoritarian regimes such as the ones in Hungary and Turkey, they persist with the rigmarole of elections. From Trump to Recep Tayyip Erdoğan is not much of a leap into a brighter future.

There is a far more dogmatic alternative, which has its roots in the 19th century. Why not ditch the charade of voting altogether? Stop pretending to respect the views of ordinary people – it’s not worth it, since the people keep getting it wrong. Respect the experts instead! This is the truly radical option. So should we try it?

The name for this view of politics is epistocracy: the rule of the knowers. It is directly opposed to democracy, because it argues that the right to participate in political decision-making depends on whether or not you know what you are doing. The basic premise of democracy has always been that it doesn’t matter how much you know: you get a say because you have to live with the consequences of what you do. In ancient Athens, this principle was reflected in the practice of choosing office-holders by lottery. Anyone could do it because everyone – well, everyone who wasn’t a woman, a foreigner, a pauper, a slave or a child – counted as a member of the state. With the exception of jury service in some countries, we don’t choose people at random for important roles any more. But we do uphold the underlying idea by letting citizens vote without checking their suitability for the task.

Critics of democracy – starting with Plato – have always argued that it means rule by the ignorant, or worse, rule by the charlatans that the ignorant people fall for. Living in Cambridge, a passionately pro-European town and home to an elite university, I heard echoes of that argument in the aftermath of the Brexit vote. It was usually uttered sotto voce – you have to be a brave person to come out as an epistocrat in a democratic society – but it was unquestionably there. Behind their hands, very intelligent people muttered to each other that this is what you get if you ask a question that ordinary people don’t understand. Dominic Cummings, the author of the “Take Back Control” slogan that helped win the referendum, found that his critics were not so shy about spelling it out to his face. Brexit happened, they told him, because the wicked people lied to the stupid people. So much for democracy.

To say that democrats want to be ruled by the stupid and the ignorant is unfair. No defender of democracy has ever claimed that stupidity or ignorance are virtues in themselves. But it is true that democracy doesn’t discriminate on the grounds of a lack of knowledge. It considers the ability to think intelligently about difficult questions a secondary consideration. The primary consideration is whether an individual is implicated in the outcome. Democracy asks only that the voters should be around long enough to suffer for their own mistakes.

The question that epistocracy poses is: why don’t we discriminate on the basis of knowledge? What’s so special about letting everyone take part? Behind it lies the intuitively appealing thought that, instead of living with our mistakes, we should do everything in our power to prevent them in the first place – then it wouldn’t matter who has to take responsibility.

This argument has been around for more than 2,000 years. For most of that time, it has been taken very seriously. The consensus until the end of the 19th century was that democracy is usually a bad idea: it is just too risky to put power in the hands of people who don’t know what they are doing. Of course, that was only the consensus among intellectuals. We have little way of knowing what ordinary people thought about the question. Nobody was asking them.

Over the course of the 20th century, the intellectual consensus was turned around. Democracy established itself as the default condition of politics, its virtues far outweighing its weaknesses. Now the events of the 21st century have revived some of the original doubts. Democracies do seem to be doing some fairly stupid things at present. Perhaps no one will be able to live with their mistakes. In the age of Trump, climate change and nuclear weapons, epistocracy has teeth again.

So why don’t we give more weight to the views of the people who are best qualified to evaluate what to do? Before answering that question, it is important to distinguish between epistocracy and something with which it is often confused: technocracy. They are different. Epistocracy means rule by the people who know best. Technocracy is rule by mechanics and engineers. A technocrat is someone who understands how the machinery works.

In November 2011, Greek democracy was suspended and an elected government was replaced by a cabinet of experts, tasked with stabilising the collapsing Greek economy before new elections could be held. This was an experiment in technocracy, however, not epistocracy. The engineers in this case were economists. Even highly qualified economists often haven’t a clue what’s best to do. What they know is how to operate a complex system that they have been instrumental in building – so long as it behaves the way it is meant to. Technocrats are the people who understand what’s best for the machine. But keeping the machine running might be the worst thing we could do. Technocrats won’t help with that question.

Both representative democracy and pragmatic authoritarianism have plenty of space for technocracy. Increasingly, each system has put decision-making capacity in the hands of specially trained experts, particularly when it comes to economic questions. Central bankers wield significant power in a wide variety of political systems around the world. For that reason, technocracy is not really an alternative to democracy. Like populism, it is more of an add-on. What makes epistocracy different is that it prioritises the “right” decision over the technically correct decision. It tries to work out where we should be going. A technocrat can only tell us how we should get there.

How would epistocracy function in practice? The obvious difficulty is knowing who should count as the knowers. There is no formal qualification for being a general expert. It is much easier to identify a suitable technocrat. Technocracy is more like plumbing than philosophy. When Greece went looking for economic experts to sort out its financial mess, it headed to Goldman Sachs and the other big banks, since that is where the technicians were congregated. When a machine goes wrong, the people responsible for fixing it often have their fingerprints all over it already.

Historically, some epistocrats have tackled the problem of identifying who knows best by advocating non-technical qualifications for politics. If there were such a thing as the university of life, that’s where these epistocrats would want political decision-makers to get their higher degrees. But since there is no such university, they often have to make do with cruder tests of competence. The 19th-century philosopher John Stuart Mill argued for a voting system that granted varying numbers of votes to different classes of people depending on what jobs they did. Professionals and other highly educated individuals would get six or more votes each; farmers and traders would get three or four; skilled labourers would get two; unskilled labourers would get one. Mill also pushed hard for women to get the vote, at a time when that was a deeply unfashionable view. He did not do this because he thought women were the equals of men. It was because he thought some women, especially the better educated, were superior to most men. Mill was a big fan of discrimination, so long as it was on the right grounds.

To 21st-century eyes, Mill’s system looks grossly undemocratic. Why should a lawyer get more votes than a labourer? Mill’s answer would be to turn the question on its head: why should a labourer get the same number of votes as a lawyer? Mill was no simple democrat, but he was no technocrat either. Lawyers didn’t qualify for their extra votes because politics placed a special premium on legal expertise. No, lawyers got their extra votes because what’s needed are people who have shown an aptitude for thinking about questions with no easy answers. Mill was trying to stack the system to ensure as many different points of view as possible were represented. A government made up exclusively of economists or legal experts would have horrified him. The labourer still gets a vote. Skilled labourers get two. But even though a task like bricklaying is a skill, it is a narrow one. What was needed was breadth. Mill believed that some points of view carried more weight simply because they had been exposed to more complexity along the way.

Jason Brennan, a very 21st-century philosopher, has tried to revive the epistocratic conception of politics, drawing on thinkers like Mill. In his 2016 book Against Democracy, Brennan insists that many political questions are simply too complex for most voters to comprehend. Worse, the voters are ignorant about how little they know: they lack the ability to judge complexity because they are so attached to simplistic solutions that feel right to them.

Brennan writes: “Suppose the United States had a referendum on whether to allow significantly more immigrants into the country. Knowing whether this is a good idea requires tremendous social scientific knowledge. One needs to know how immigration tends to affect crime rates, domestic wages, immigrants’ welfare, economic growth, tax revenues, welfare expenditures and the like. Most Americans lack this knowledge; in fact, our evidence is that they are systematically mistaken.”

In other words, it’s not just that they don’t know; it’s not even that they don’t know that they don’t know; it’s that they are wrong in ways that reflect their unwavering belief that they are right.

 
Some philosophers advocate exams for voters, to ‘screen out citizens who are badly misinformed’. Photograph: David Jones/PA

Brennan doesn’t have Mill’s faith that we can tell how well-equipped someone is to tackle a complex question by how difficult that person’s job is. There is too much chance and social conditioning involved. He would prefer an actual exam, to “screen out citizens who are badly misinformed or ignorant about the election, or who lack basic social scientific knowledge”. Of course, this just pushes the fundamental problem back a stage without resolving it: who gets to set the exam? Brennan teaches at a university, so he has little faith in the disinterested qualities of most social scientists, who have their own ideologies and incentives. He has also seen students cramming for exams, which can produce its own biases and blind spots. Still, he thinks Mill was right to suggest that the further one advances up the educational ladder, the more votes one should get: five extra votes for finishing high school, another five for a bachelor’s degree, and five more for a graduate degree.
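Taken literally, the tally Brennan sketches is easy to make concrete. A toy illustration of that schedule (the five-votes-per-milestone figures are Brennan's as reported here; the baseline of one vote is an assumption for the example):

```python
# Toy illustration of the graduated-vote schedule described above.
# The baseline of one vote is an assumption, not part of Brennan's text.
def votes(high_school=False, bachelors=False, graduate=False):
    """One baseline vote, plus five for each educational milestone."""
    return 1 + 5 * sum([high_school, bachelors, graduate])

print(votes())                  # 1  - no qualifications
print(votes(high_school=True))  # 6  - finished high school
print(votes(True, True, True))  # 16 - holds a graduate degree
```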

Brennan is under no illusions about how provocative this case is today, 150 years after Mill made it. In the middle of the 19th century, the idea that political status should track social and educational standing was barely contentious; today, it is barely credible. Brennan also has to face the fact that contemporary social science provides plenty of evidence that the educated are just as subject to groupthink as other people, sometimes even more so. The political scientists Larry Bartels and Christopher Achen point this out in their 2016 book Democracy for Realists: “The historical record leaves little doubt that the educated, including the highly educated, have gone wrong in their moral and political thinking as often as everyone else.” Cognitive biases are no respecters of academic qualifications. How many social science graduates would judge the question about immigration according to the demanding tests that Brennan lays out, rather than according to what they would prefer to believe? The irony is that if Brennan’s voter exam were to ask whether the better-educated deserve more votes, the technically correct answer might be no. It would depend on who was marking it.

However, in one respect Brennan insists that the case for epistocracy has grown far stronger since Mill made it. That is because Mill was writing at the dawn of democracy. Mill published his arguments in the run-up to what became the Second Reform Act of 1867, which doubled the size of the franchise in Britain to nearly 2.5 million voters (out of a general population of 30 million). Mill’s case for epistocracy was based on his conviction that over time it would merge into democracy. The labourer who gets one vote today would get more tomorrow, once he had learned how to use his vote wisely. Mill was a great believer in the educative power of democratic participation.

Brennan thinks we now have 100-plus years of evidence that Mill was wrong. Voting is bad for us. It doesn’t make people better informed. If anything, it makes them stupider, because it dignifies their prejudices and ignorance in the name of democracy. “Political participation is not valuable for most people,” Brennan writes. “On the contrary, it does most of us little good and instead tends to stultify and corrupt us. It turns us into civic enemies who have grounds to hate one another.” The trouble with democracy is that it gives us no reason to become better informed. It tells us we are fine as we are. And we’re not.

In the end, Brennan’s argument is more historical than philosophical. If we were unaware of how democracy would turn out, it might make sense to cross our fingers and assume the best of it. But he insists that we do know, and so we have no excuse to keep kidding ourselves. Brennan thinks that we should regard epistocrats like himself as being in the same position as democrats were in the mid-19th century. What he is championing is anathema to many people, as democracy was back then. Still, we took a chance on democracy, waiting to see how it would turn out. Why shouldn’t we take a chance on epistocracy, now we know how the other experiment went? Why do we assume that democracy is the only experiment we are ever allowed to run, even after it has run out of steam?

It’s a serious question, and it gets to how the longevity of democracy has stifled our ability to think about the possibility of something different. What was once a seemingly reckless form of politics has become a byword for caution. And yet there are still good reasons to be cautious about ditching it. Epistocracy remains the reckless idea. There are two dangers in particular.

The first is that we set the bar too high in politics by insisting on looking for the best thing to do. Sometimes it is more important to avoid the worst. Even if democracy is often bad at coming up with the right answers, it is good at unpicking the wrong ones. Moreover, it is good at exposing people who think they always know best. Democratic politics assumes there is no settled answer to any question and it ensures that is the case by allowing everyone a vote, including the ignorant. The randomness of democracy – which remains its essential quality – protects us against getting stuck with truly bad ideas. It means that nothing will last for long, because something else will come along to disrupt it.

Epistocracy is flawed because of the second part of the word rather than the first – this is about power (kratos) as much as it is about knowledge (episteme). Fixing power to knowledge risks creating a monster that can’t be deflected from its course, even when it goes wrong – which it will, since no one and nothing is infallible. Not knowing the right answer is a great defence against people who believe that their knowledge makes them superior.

Brennan’s response to this argument (a version of which is made by David Estlund in his 2007 book Democratic Authority) is to turn it on its head. Since democracy is a form of kratos, too, he says, why aren’t we concerned about protecting individuals from the incompetence of the demos just as much as from the arrogance of the epistocrats? But these are not the same kinds of power. Ignorance and foolishness don’t oppress in the same way that knowledge and wisdom do, precisely because they are incompetent: the demos keeps changing its mind.

The democratic case against epistocracy is a version of the democratic case against pragmatic authoritarianism. You have to ask yourself where you’d rather be when things go wrong. Maybe things will go wrong quicker and more often in a democracy, but that is a different issue. Rather than thinking of democracy as the least worst form of politics, we could think of it as the best when at its worst. It is the difference between Winston Churchill’s famous dictum and a similar one from Alexis de Tocqueville a hundred years earlier that is less well-known but more apposite. More fires get started in a democracy, de Tocqueville said, but more fires get put out, too.

The recklessness of epistocracy is also a function of the historical record that Brennan uses to defend it. A century or more of democracy may have uncovered its failings, but it has also taught us that we can live with them. We are used to the mess and attached to the benefits. Being an epistocrat like Mill before democracy had got going is very different from being one now that democracy is well established. We now know what we know, not just about democracy’s failings, but about our tolerance for its incompetences.

The great German sociologist Max Weber, writing at the turn of the 20th century, took it for granted that universal suffrage was a dangerous idea, because of the way that it empowered the mindless masses. But he argued that once it had been granted, no sane politician should ever think about taking it away: the backlash would be too terrible. The only thing worse than letting everyone vote is telling some people that they no longer qualify. Never mind who sets the exam, who is going to tell us that we’ve failed? Mill was right: democracy comes after epistocracy, not before. You can’t run the experiment in reverse.

The cognitive biases that epistocracy is meant to rescue us from are what will ultimately scupper it. Loss aversion makes it more painful to be deprived of something we have that doesn’t always work than something we don’t have that might. It’s like the old joke. Q: “Do you know the way to Dublin?” A: “Well, I wouldn’t start from here.” How do we get to a better politics? Well, maybe we shouldn’t start from here. But here is where we are.

Still, there must be other ways of trying to inject more wisdom into democratic politics than an exam. This is the 21st century: we have new tools to work with. If many of the problems with democracy derive from the business of politicians hawking for votes at election time, which feeds noise and bile into the decision-making process, perhaps we should try to simulate what people would choose under more sedate and reflective conditions. For instance, it may be possible to extrapolate from what is known about voters’ interests and preferences what they ought to want if they were better able to access the knowledge they needed. We could run mock elections that replicate the input from different points of view, as happens in real elections, but which strip out all the distractions and distortions of democracy in action.

Brennan suggests the following: “We can administer surveys that track citizens’ political preferences and demographic characteristics, while testing their basic objective political knowledge. Once we have this information, we can simulate what would happen if the electorate’s demographics remained unchanged, but all citizens were able to get perfect scores on tests of objective political knowledge. We can determine, with a strong degree of confidence, what ‘We the People’ would want, if only ‘We the People’ understood what we were talking about.”
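What Brennan describes is, in effect, a counterfactual regression: model stated preferences as a function of demographics and measured knowledge, then predict preferences for the same electorate with knowledge set to its maximum. A minimal sketch of that idea, assuming a hypothetical survey file and a simple linear model (none of this is Brennan's actual code or data):

```python
# Minimal sketch of an "enlightened preferences" simulation.
# survey.csv is hypothetical, with columns: age, income, education,
# knowledge (test score scaled 0-1) and support (policy support, 0-1).
import pandas as pd
import statsmodels.formula.api as smf

df = pd.read_csv("survey.csv")

# Let knowledge interact with demographics, since information may move
# different groups' preferences in different directions.
model = smf.ols("support ~ (age + income + education) * knowledge", data=df).fit()

# Same electorate, same demographics, but perfect knowledge scores.
fully_informed = df.assign(knowledge=1.0)
df["informed_support"] = model.predict(fully_informed)

print("Actual mean support:    ", df["support"].mean())
print("Simulated informed mean:", df["informed_support"].mean())
```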

Democratic dignity – the idea that all citizens should be allowed to express their views and have them taken seriously by politicians – goes out the window under such a system. We are each reduced to data points in a machine-learning exercise. But, according to Brennan, the outcomes should improve.

In 2017, a US-based digital technology company called Kimera Systems announced that it was close to developing an AI named Nigel, whose job was to help voters know how they should vote in an election, based on what it already knew of their personal preferences. Its creator, Mounir Shita, declared: “Nigel tries to figure out your goals and what reality looks like to you and is constantly assimilating paths to the future to reach your goals. It’s constantly trying to push you in the right direction.”

 
‘Politicians don’t care what we actually want. They care what they can persuade us we want’ … Donald Trump in Michigan last week. Photograph: Chirag Wakaskar/SOPA/Rex/Shutterstock

This is the more personalised version of what Brennan is proposing, with some of the democratic dignity plugged back in. Nigel is not trying to work out what’s best for everyone, only what’s best for you. It accepts your version of reality. Yet Nigel understands that you are incapable of drawing the correct political inferences from your preferences. You need help, from a machine that has seen enough of your personal behaviour to understand what it is you are after. Siri recommends books you might like. Nigel recommends political parties and policy positions.

Would this be so bad? To many people it instinctively sounds like a parody of democracy because it treats us like confused children. But to Shita it is an enhancement of democracy because it takes our desires seriously. Democratic politicians don’t much care what it is that we actually want. They care what it is they can persuade us we want, so they can better appeal to it. Nigel puts the voter first. At the same time, by protecting us from our own confusion and inattention, Nigel strives to improve our self-understanding. Brennan’s version effectively gives up on Mill’s original idea that voting might be an educative experience. Shita hasn’t given up. Nigel is trying to nudge us along the path to self-knowledge. We might end up learning who we really are.

The fatal flaw with this approach, however, is that we risk learning only who it is we think we are, or who it is we would like to be. Worse, it is who we would like to be now, not who or what we might become in the future. Like focus groups, Nigel provides a snapshot of a set of attitudes at a moment in time. The danger of any system of machine learning is that it produces feedback loops. By restricting the dataset to our past behaviour, Nigel teaches us nothing about what other people think, or even about other ways of seeing the world. Nigel simply mines the archive of our attitudes for the most consistent expression of our identities. If we lean left, we will end up leaning further left. If we lean right, we will end up leaning further right. Social and political division would widen. Nigel is designed to close the circle in our minds.
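
The dynamic described here is easy to see in a toy simulation. The sketch below is a deliberately crude illustration – not Nigel’s, or any real recommender’s, algorithm, and every number in it is an invented assumption: a system trained only on a user’s own history serves content centred just past their current position, and each round of consumption nudges them a little further the same way.

```python
# A toy feedback loop: the recommender only knows the user's own past
# behaviour, so it serves items just beyond their current position, and
# consumption drags that position outward. All parameters are invented.
import numpy as np

rng = np.random.default_rng(1)
lean = 0.1                 # political position: 0 = centre, +/-1 = the extremes
history = [lean]

for _ in range(50):
    # Items are drawn from around a point slightly past the user's position.
    items = rng.normal(loc=1.05 * lean + 0.02 * np.sign(lean), scale=0.05, size=20)
    # Consumption pulls the user's position toward what they were shown.
    lean = 0.8 * lean + 0.2 * items.mean()
    history.append(lean)

print(f"start {history[0]:+.2f} -> after 50 rounds {history[-1]:+.2f}")
```

Run it and the mild starting lean of +0.10 drifts steadily outward; flip the starting sign and it drifts the other way. Lean left and you end up further left; lean right and you end up further right.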

There are technical fixes for feedback loops. Systems can be adjusted to inject alternative points of view, to notice when data is becoming self-reinforcing or simply to randomise the evidence. We can shake things up to lessen the risk that we get set in our ways. For instance, Nigel could make sure that we visit websites that challenge rather than reinforce our preferences. Alternatively, on Brennan’s model, the aggregation of our preferences could seek to take account of the likelihood that Nigel had exaggerated rather than tempered who we really are. A Nigel of Nigels – a machine that helps other machines to better align their own goals – could try to strip out the distortions from the artificial democracy we have built. After all, Nigel is our servant, not our master. We can always tell him what to do.
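
A hedged sketch of the simplest of these fixes, continuing the toy above: with some probability, the recommender ignores the user’s history altogether and serves items drawn from across the whole spectrum – an epsilon-greedy-style exploration step. The EPSILON value here is arbitrary, chosen purely for illustration.

```python
# The "randomise the evidence" fix: with probability EPSILON, ignore the
# user's history and recommend from the full spectrum of content.
import numpy as np

rng = np.random.default_rng(1)
EPSILON = 0.3              # share of deliberately diversified recommendations
lean = 0.1

for _ in range(50):
    if rng.random() < EPSILON:
        items = rng.uniform(-1.0, 1.0, size=20)   # full-spectrum exposure
    else:
        items = rng.normal(1.05 * lean + 0.02 * np.sign(lean), 0.05, size=20)
    lean = 0.8 * lean + 0.2 * items.mean()

print(f"with {EPSILON:.0%} randomisation, position after 50 rounds: {lean:+.2f}")
```

With the exploration step switched on, the position settles near the centre instead of drifting outward. But notice who chose EPSILON: not the voter, and not Nigel, but whoever built the system – which is exactly the problem the next paragraph turns to.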

But that is the other fundamental problem with 21st-century epistocracy: we won’t be the ones telling Nigel what to do. It will be the technicians who have built the system. They are the experts we rely on to rescue us from feedback loops. For this reason, it is hard to see how 21st-century epistocracy can avoid collapsing back into technocracy. When things go wrong, the knowers will be powerless to correct for them. Only the engineers who built the machines have that capacity, which means that it will be the engineers who have the power.

In recent weeks, we have been given a glimpse of what rule by engineers might look like. It is not an authoritarian nightmare of oppression and violence. It is a picture of confusion and obfuscation. The power of engineers never fully comes out into the open, because most people don’t understand what it is they do. The sight of Mark Zuckerberg, perched on his cushion, batting off the ignorant questions of the people’s representatives in Congress is a glimpse of a technocratic future in which democracy meets its match. But this is not a radical alternative to democratic politics. It is simply a distortion of it.


Sunday 15 April 2018

The right and left have both signed up to the myth of the free market

Larry Elliott in The Guardian


 
Occupy Wall Street movement. After the financial crisis the public lost faith in the economics profession. Photograph: KeystoneUSA-Zuma/Rex Features


You can’t buck the market. The rightward turn taken by politics from the mid-1970s onwards was summed up in that one phrase, coined by Margaret Thatcher in 1988.

This idea tended to be associated with liberal economists such as Milton Friedman and Friedrich Hayek, both of whom influenced Thatcher deeply. Both thought that, left to their own devices, buyers and sellers would work out the price for everything, be that a loaf of bread, a wage, or an operation in the health service.

But economists and politicians who would certainly not have classified themselves as Hayekians, Friedmanites or Thatcherites also found the idea of market forces hard to resist. The new Keynesian school believed that there might be short-term impediments – or stickiness in the jargon – and that it was the job of government to deal with these market failures. But in the long term they too thought markets would return to equilibrium.

All sorts of policies flowed from this core belief: from privatisation to curbs on trade unions; from cuts in welfare to the attempt to create an internal market in the NHS. It also justified removing constraints on capital and the hands-off approach to financial regulation in the years leading up to the banking crisis of 2008. Anybody who suggested a gigantic bubble was being inflated was told that in free markets operated by perfectly rational economic agents this could not possibly happen.

Then the financial markets froze up. This was a classic emperor’s new clothes moment, when the public realised that the economics profession did not really have the foggiest idea that the biggest financial crisis in a century had been looming. Like a doctor who had said a patient was in rude health when she was actually suffering from a life-threatening disease, it had failed when it was most needed.

Just before Christmas, I wrote a column calling for some new thinking in economics. It caused quite a stir. Some economists liked it. Others hated it and rushed to their profession’s defence.

My piece said there was a need for a more plural approach to economics, with the dominant neo-classical school challenged as a result of its egregious failure in 2008. The argument was that a bit of competition would do economics good.

If the latest edition of the magazine Prospect is anything to go by, a debate is now well under way. Howard Reed, an economist who has worked for two of the UK’s leading thinktanks, the Institute for Public Policy Research and the Institute for Fiscal Studies, says the malaise is so serious that a “deconomics” is needed that “decontaminates the discipline, deconstructs its theoretical heart, and rebuilds from first principles”.

Reed says a retooled economics would have four features. Firstly, recognition that there is no such thing as a value-free analysis of the economy. Neo-classical economics purports to be clean and pure but uses a cloak of ethical neutrality to make an individualistic ethos the norm.

Secondly, he says too much economics is about how humans ought to behave rather than how they actually behave. Thirdly, economics needs to focus on the good life rather than on those areas most susceptible to analysis through 19th-century linear mathematics. Finally, he calls for a more pluralistic approach: economics should be learning from other disciplines rather than colonising them.

Prospect gave Diane Coyle, a Cambridge University economics professor, the right to reply to Reed’s piece and she does so with relish, calling it lamentable, a caricature and an ill-informed diatribe.

Most modern economics involves empirical testing, Coyle says, often using new sources of data. She rejects the idea that the profession is stuck in an ivory tower fiddling around with abstruse mathematics while ignoring the real world. Rather, it is “addressing questions of immediate importance and relevance to policymakers, citizens and businesses”.

Nor is it true that the discipline requires that people be rational, calculating automatons. “It very often has people interacting with each other rather than acting as atomistic individuals, despite Reed’s charge.”

Coyle accepts that macro-economics – the big picture stuff that involves looking at the economy in the round – is in a troubled state but says this is actually only a minority field.

The point that there are many economists doing interesting things in areas such as behavioural economics is a fair one, but Coyle is on shakier ground when she skates over the problems in macro-economics.

It was, after all, macro-economists – the people working at the International Monetary Fund, the Federal Reserve, the European Central Bank, the Bank of England – on whom the public relied to get things right a decade ago. All were blind to what was going on, and that had quite a lot to do with their “markets tend to know best” belief system.

Policymakers did not find the works of Hayek and Friedman particularly useful when a second great depression was looming. Instead, they turned, if only fleetingly, to Keynes’s General Theory, which told them it would not be wise to wait for market forces to do their work.

Reed’s argument is not just that blind faith in neo-classical economics led to the crisis. Nor is it simply that the systemic failure of 2008 means there is a need for a root-and-branch rethink. It is also that mainstream economics has been serving the interests of the political right.

Some in the profession, particularly those who see themselves as progressives, appear to have trouble with this idea. That perhaps explains why they have been so rattled by even the teeniest bit of criticism.

Thursday 30 November 2017

Let Hadiya take charge of her life

Brinda Karat in The Hindu

The Supreme Court did not allow itself to be converted into a khap panchayat, although it came close to it on Tuesday as it heard the Hadiya case. The counsel for the National Investigation Agency (NIA), supported by the legal counsel of the Central government, made out a case of indoctrination and brainwashing in a conspiracy of ‘love jehad’, which they claimed incapacitated Hadiya and invalidated her consent. The NIA wanted the court to study the documents it claimed to have as evidence before it heard Hadiya. For one and a half hours, this young woman stood in open court hearing arguments about herself, against herself and against her chosen partner. It was shameful and humiliating, and it set an unfortunate precedent. If the court was not clear that it wanted to hear her, why did it call her at all? She should never have been subjected to that kind of indignity. She is not a criminal, but she was treated like one for that period of time.


The right to speak

The court remained undecided even in the face of the compelling argument by lawyers Kapil Sibal and Indira Jaising, representing Hadiya’s husband Shafin Jahan, that the most critical issue was the right of an adult woman to make her own choice. The court almost adjourned for the day when the Kerala State Women’s Commission lawyer, P. V. Dinesh, protested that it would be a grave miscarriage of justice if, after all the accusations made against Hadiya in open court, the court did not hear her. In khap panchayats, the woman accused of breaking the so-called honour code is never allowed to speak. Her sentence begins with her enforced silence and ends with whatever dreadful punishment is meted out to her by the khap. Fortunately, the Supreme Court pulled itself back from the brink and agreed to give Hadiya an opportunity to speak.

There was no ambiguity about what she said. It was the courage of her conviction that stood out. She wanted to be treated as a human being. She wanted her faith to be respected. She wanted to study. She wanted to be with her husband. And most importantly, she wanted her freedom.

The court listened, but did it hear?

Both sides claim they are happy with the order. Hadiya and her husband feel vindicated because the court has ended her enforced custody by her father. She has got an opportunity to resume her studies. Lawyers representing the couple’s interests have explained that the first and main legal strategy was to ensure her liberty from custody which has been achieved. They say that the order places no restrictions on Hadiya meeting anyone she chooses to, including her husband. It is a state of interim relief.

Her father claims victory because the court did not accept Hadiya’s request to leave the court with her husband. Instead, the court directed that she go straight to a hostel in Salem to continue her studies. He asserted that this would ensure she is not with her husband, whom he has termed a terrorist.

The next court hearing is in January, and how the order has been implemented will be clear by then.

The case reveals how deeply the current climate created by sectarian ideologies based on a narrow reading of religious identity has pushed back women’s rights to autonomy as equal citizens. From the government to the courts, to the strengthening of conservative and regressive thinking and practice, it’s all out there in Hadiya’s case.

One of the most disturbing fallouts is that the term ‘love jehad’ used by Hindutva zealots to target inter-faith marriages has been given legal recognition and respectability by the highest courts. An agency whose proclaimed mandate is to investigate offences related to terrorism has now expanded its mandate by order of the Supreme Court to unearth so-called conspiracies of Muslim men luring Hindu women into marriage and forcibly converting them with the aim of joining the Islamic State. The underlying assumption is that Hindu women who marry Muslims have no minds of their own. If they convert to Islam, that itself is proof enough of a conspiracy.

This was clearly reflected in the regressive order of the Kerala High Court in May this year, which annulled Hadiya’s marriage. Among other objectionable comments, it held that a woman of 24 is “weak and vulnerable” and that, as per Indian tradition, “the custody of an unmarried daughter is with the parents, until she is properly married.” Equally shocking, it ordered that nobody could meet her except her parents, in whose custody she was placed.

Not a good precedent

Courts in this country are expected to uphold the right of an adult woman to her choice of a partner. Women’s autonomy and equal citizenship rights flow from the constitutional framework, not from religious authority or tradition. The Kerala High Court judgement should be struck down by the apex court. We cannot afford to have such a judgment as legal precedent.

The case also brings into focus the right to practise and propagate the religion of one’s choice under the Constitution. Hadiya has made it clear time and again that she converted because of her belief in Islam. It was not a forcible conversion. Moreover, she converted at least a year before her marriage. So the issue of ‘love jehad’ is in any case irrelevant, and the court cannot interfere with her right to convert.

As far as the NIA investigation is concerned, the Supreme Court has ordered that it should continue. The Kerala government gave an additional affidavit in October stating that “the investigation conducted so far by the Kerala police has not revealed any incident relating to commission of any scheduled offences to make a report to the Central government under Section 6 of the National Investigation Agency Act of 2008.” The State government said the police investigation was on when the Supreme Court directed the NIA to conduct an investigation into the case. It thus opposed the handing over of the case to the NIA. In the light of this clear stand of the Kerala government, it is inexplicable why its counsel in the Supreme Court should take a contrary stand in the hearing — this should be rectified at the earliest.

Vigilantism by another name

The NIA is on a fishing expedition, having already interrogated 89 such couples in Kerala. Instead of inter-caste and inter-community marriages being celebrated as symbols of India’s open and liberal approach, they are being treated as suspect.

Now, every inter-faith couple will be vulnerable to attacks by gangs equivalent to the notorious gau rakshaks. This is not just applicable to cases where a Hindu woman marries a Muslim. There are bigots and fanatics in all communities. When a Muslim woman marries a Hindu, Muslim fundamentalist organisations like the Popular Front of India use violent means to prevent such marriages. Sworn enemies, such as those who belong to fundamentalist organisations in the name of this or that religion, have more in common with each other than they would care to admit.

Hopefully the Supreme Court will act in a way which strengthens women’s rights unencumbered by subjective interpretations of tradition and communal readings of what constitutes national interest.

Sunday 3 September 2017

Why have rights if workers fear using them?

Low-wage employees rarely claim what they’re entitled to for fear of being branded troublemakers by their boss


Barbara Ellen in The Guardian



It may be time to quash the myth, once and for all, that the only reason low-wage workers don’t exercise their employment rights is that they don’t know about them. Perhaps even when they do, they fear that exercising them would risk their being branded troublemakers and penalised. It’s also possible that employers know about this fear and cynically exploit it.

A TUC study has found that many low-paid workers (people with combined household incomes of £28,000 or less) are being “disciplined” for taking childcare-related time off. Forty-two per cent of parents felt they’d been stigmatised and punished for asking for more flexible hours, with some worrying that they would be given worse shifts or even lose their jobs, and 29% were dipping into annual leave when their children fell ill.

The report said that many of the low-waged seemed unaware that they had a legal right to 18 weeks’ unpaid parental leave if they’d held their jobs for a year, though these rights did not apply to everybody and could be impractical for those on zero-hours contracts (where shifts could change on an employer’s whim). What’s more, the rights became “meaningless” if workers felt that utilising them could lead to their being branded provocative, unreliable and, ultimately, unemployable.

Perhaps it’s time there was a new kind of narrative from the world of work and children, one that goes beyond even the unfolding shambles of the government’s election nursery care promises. Generally, it’s all about women being forced off the “fast track” on to the “mummy track” once they become parents or cyclical outbursts about how the macho nature of work culture isn’t conducive to parenthood or the holy grail of work/life balance.

Then there’s the saga of “parents versus non-parents”. There are tales of office-based sniping about parents arriving for work late, and leaving early, with parents feeling stressed, resented and misunderstood, and non-parents feeling that parents get far too much special consideration for their little darlings’ bouts of tonsillitis.

While all of these points remain valid, the TUC findings show a different world, one where the very notion of “work-life balance” would be considered a tasteless joke and where a working parent’s struggles don’t just mean a few covert snotty looks from the marketing team when they leave early for the school nativity play. They sometimes mean the choice between using your holiday to look after your children when they’re ill and risking irritating employers and ending up on some unofficial shit list.

This situation appears to be multifaceted. Many low-wage workers in insecure positions don’t have the legal right to ask for more flexible hours (as well as lacking many other legal protections). Those who have rights may not be aware of them. And those who are aware of their rights may be afraid to use them, understandably so.

All of which leads to another issue, one flagged up by the TUC report, and one that undoubtedly affects every move a low-wage working parent makes – that employers know their workers’ rights perfectly well.

However, they also know that they have the upper hand in this ugly era of sanctioned worker exploitation. And so even if a lone voice does dare to raise itself, it can be effectively muzzled with the unspoken threat of the worker being stigmatised, penalised or even losing the job altogether.

There it is: not just the problem, but the disgrace, the human rights calamity of low-wage, insecure British work culture in the 21st century. What use are employment rights when there’s a thriving culture of workers being systematically intimidated into disregarding them?

Thursday 4 May 2017

How strange that capitalism’s noisiest enemies are now on the right

Giles Fraser in The Guardian

Listening to Marine Le Pen attack Emmanuel Macron for being a creature of global finance is a reminder of a disturbing feature of modern political life: the extent to which the attack upon capitalism has migrated from the left to the right.

There was a time, not so very long ago, when it was widely accepted that the job of the left was to explain how free-market capitalism is bad for the poor and bad for social cohesion more generally. The left was supposed to show that in free markets, wealth doesn’t trickle down, it bubbles up. That trusting the invisible hand to spread wealth all round is like trusting bankers to share their bonuses with their neighbours. And, moreover, that the inequalities of wealth created by the free-market system create a society profoundly ill at ease with itself. This is why socialists have always believed in the public ownership of the means of production and of the major public services. Markets and money should exist to serve people, not the other way round. The importance of democratic socialism is that it uses the power of the ballot box to assert the will of the people over the will of capital.

The EU debate, now breaking out all over Europe, has flushed out the extent to which the so-called left, now overrun by liberalism, has largely abandoned this historical position. In this country, the liberal left now believes that support for the single market and economic free trade is the very thing that distinguishes them from a so-called hard Tory Brexit. This is an astonishing change of position. It used to be obvious to democratic socialists that the terms of international trade should be set not by the market alone but also by democratically elected governments subject to the will of their electorates. But the liberal left, perhaps not trusting how ordinary people (as opposed to more enlightened economic “experts”) might vote, thinks that trade should be free of the irritating interventions of democratic accountability. They want it to be frictionless – an irritating euphemism that ultimately means: not subject to the will of the people.

Jeremy Corbyn aside, one of the tragedies of the leftwing abandonment of its traditional suspicion of capitalism is that the far right has now filled the vacuum. It understands that the bubbling resentment of rundown estates and forgotten seaside towns can be harnessed and turned against foreigners and Islam as well as the liberal capitalist establishment. This, of course, only serves to secure in the minds of the liberal left how dangerous it was in the first place to challenge the basic premise of capitalism: the freedom of money to go where it will, unimpeded, untaxed, unbothered. What a topsy-turvy political world we now inhabit. Squint your eyes and it almost looks as though the left has become the right, and the right has become the left.

Perhaps a word about terminology is helpful, because liberalism is a slippery idea. Liberals are distinguished above all by their belief in freedom – the freedom to be who you want to be (social liberalism) and the freedom to make and keep as much money as you want (economic liberalism) existing on the same continuum. As much as possible, the state should not stand in the way of, or make any sort of judgment about, the wants and desires of free individuals. But what liberals don’t see, or don’t want to see, is that their little individual freedoms are also collectively responsible for the boarded-up shops of Walsall and the disintegration of communities such as mine in south London.

Even if you disagree with my take on liberalism, you might accept that this broad analysis leaves the Labour party in serious trouble, its traditional alliance between socialists and social liberals at an unhappy end. Like many failed marriages, it struggles on because each side fears the other will get control of the house. But for the good of the country, we need a party that represents the anger at what the City has done and freely continues to do to this country. Otherwise that anger will look for other places to express itself. And then, heaven help us, we will have our own Ms Le Pen.

Saturday 15 April 2017

Why rightwingers are desperate for Sweden to ‘fail’

Christian Christensen in The Guardian

Of course Sweden isn’t perfect, but those who love to portray it as teeming with terrorists and naive about reality are just cynical hypocrites

‘When terrible events take place, they are framed as evidence of the decline and fall of the European social democratic project, the failure of European immigration policies and of Swedish innocence lost.’ Photograph: Fredrik Sandberg/AFP/Getty Images



There are few countries in the world that have “lost their innocence” as many times as Sweden. Even before a suspected terrorist and Isis supporter killed four people and injured many more in last week’s attack in central Stockholm, Sweden’s policies were being portrayed on the programmes of Fox News and the pages of the Daily Mail as, at best, exercises in well-meaning but naive multiculturalism and, at worst, terrorist appeasement.

So, when terrible events take place, they are framed as evidence of the decline and fall of the European social democratic project, the failure of European immigration policies and of Swedish innocence lost.

When Donald Trump argued against the intake of Syrian refugees to the US earlier this year, he used supposed problems in Sweden as part of his rationale. “You look at what’s happening last night in Sweden,” the president said at a rally in Florida in February. “Sweden. Who would believe this? Sweden. They took in large numbers. They’re having problems like they never thought possible.” The White House later clarified that Trump had been speaking about general “rising crime”, when he seemed to be describing a then non-existent terror attack.




The obsession with Sweden has a lot to do with the country’s history of taking in refugees and asylum seekers, combined with social democratic politics. Both are poison to the political right. When prime minister Olof Palme was shot walking home (without bodyguards) from a cinema in 1986, we were told that Swedish innocence and utopian notions of a non-violent society had come to an end. But Swedes miraculously regained their innocence, only to lose it again in 2003 when the popular foreign minister Anna Lindh (also without bodyguards) was stabbed to death in a Stockholm department store. This possession and dispossession of innocence – which some call naivety – has ebbed and flowed with the years.

The election to parliament and subsequent rise of the anti-immigration Sweden Democrats were discussed in similar terms, as was the decision in late 2015 by the Swedish government to halt the intake of refugees after a decades-long policy of humanitarian acceptance.

Yet the notion of a doe-eyed Sweden buffeted by the cruel winds of the real world is a nonsense. Sweden is an economic power – usually found near the top of rankings of innovative and competitive economies. Companies that are household names, from H&M to Ericsson and Skype, and the food packaging giant Tetra Pak, are Swedish. It plays the capitalist game better than most (and not always in an ethical manner: the country is, per capita, one of the largest weapons exporters in the world). As for the argument that Swedes are in denial, unwilling to discuss the impact of immigration? This comes as news to citizens who see the issue addressed regularly in the Swedish media, most obviously in the context of the rise of the Sweden Democrats.








Between 2014 and 2016, Sweden received roughly 240,000 asylum seekers: far and away the most refugees per capita in Europe. But the process has not been smooth. Throughout 2016 and 2017, the issue of men leaving Sweden to fight for Isis has been a major story, as has the Swedish government’s perceived lack of preparation about what to do when these fighters return. There is also much debate on the practice of gender segregation in some Muslim schools in Sweden.

As Stockholm goes through a period of mourning for last week’s attack, it is worth asking: is Sweden really the country divorced from reality? If we are speaking of naivety in relation to terrorism, a good place to start might be US foreign policy in the Middle East, and not Sweden’s humanitarian intake of the immigrants and refugees created (at least in part) as a result of that US policy.

Has Swedish immigration policy always been well thought out? No. Is Sweden marked by social and economic divisions? Yes. But the presentation of Sweden as some kind of case study in failed utopianism often comes from those who talk a big game on democracy, human rights and equality, but who refuse to move beyond talk into action.

So, when pundits and experts opine on Swedish “innocence lost”, it is worth remembering that Sweden has never been innocent. It is also worth remembering that Sweden was willing to put its money where its mouth was when it came to taking in refugees and immigrants fleeing the conflicts and instability fuelled by countries unwilling to deal with the consequences of their actions. This shirking of responsibility while condemning the efforts of others is far worse than being naive. It’s cynical hypocrisy.