
Thursday, 20 May 2021

The secret of Johnson’s success lies in his break with Treasury dominance

Gordon Brown’s rule-based approach shaped Whitehall for two decades. But the Tories are forging a new politics that has little regard for prudence, writes William Davies in The Guardian

 
Illustration: Eva Bee/The Guardian

 

The Conservative party’s growing electoral dominance in non-metropolitan England, so starkly re-emphasised by results in the north-east, has been attributed to various causes. Brexit and the popularity of Boris Johnson both count for a great deal. But while Labour is busy telling voters how much it deserved to lose, this is only half the picture. A major part of Johnson’s appeal is the way he has escaped the shadow cast by one of Britain’s three most significant political figures of the past 45 years: not Margaret Thatcher or Tony Blair, but Gordon Brown. 

The 1994 meeting between Blair and Brown at the Granita restaurant in Islington, north London, shortly after John Smith’s death, is the founding myth of New Labour: the moment when Brown agreed to let Blair stand for the leadership, on certain conditions. In addition to Blair’s much disputed commitment to serve only two terms in office should he become prime minister, there was also his promise that Brown, as chancellor, would get control over the domestic policy agenda. At least the second of these commitments was honoured, resulting in a situation where, from 1997 to 2007, the Treasury held an overwhelming dominance over the rest of Whitehall, while Brown was implicitly unsackable.

But, together with his adviser Ed Balls, Brown was also the architect of a new apparatus of economic policymaking designed for the era of globalisation. The central problem that Balls and Brown confronted was how to build the capacity for higher levels of social spending, while also retaining financial credibility in an age of far more mobile capital than any confronted by previous Labour governments. The fear was that, with financial capital able to cross borders at speed, a high-spending government might be viewed suspiciously by investors and lenders, making it harder for the state to borrow cheaply. The first part of their answer endures to this day: operational independence was handed to the Bank of England, accompanied by an inflation target, so that politicians could no longer seek to win elections by cutting interest rates. The move was aimed at winning the trust of the markets.

On top of this, Brown also introduced a culture of almost obsessive fiscal discipline, as if the bond markets would attack the moment he showed any flexibility – the same paranoia that shaped Clintonism. His “golden rule”, outlined in his first budget, stated that, over the economic cycle, the government could borrow only to invest, not for day-to-day spending. The Treasury governed the rest of Whitehall according to a strict economic rubric, demanding every spending proposal was audited according to orthodox neoclassical economics.

Balls later wrote that their thinking had been guided by an influential 1977 article, Rules Rather than Discretion, in which two economists, Finn Kydland and Edward Prescott, sought to demonstrate that policymakers will produce far better economic outcomes if they stick rigidly to certain principles and heuristics of policy, rather than seeking to intervene on a case-by-case basis. Brown’s robotic persona and his mantra of “prudence” conveyed a programme that was so focused on policy as to be oblivious to more frivolous aspects of politics.

Elements of this Brownite machine remained in place during the David Cameron-George Osborne years: a chancellor acting as a kind of parallel prime minister, transforming society through force of cost-benefit analysis, only now the fiscal tide was going out rather than in. Even “Spreadsheet Phil” Hammond sustained the template as far as he could, in the face of ever-rising attacks from the Brexit extremists in his own party. The point is that, from 1997 to 2019, the government largely meant the Treasury. Those powers that are so foundational for the modern nation state – to tax, borrow and spend – were the basis on which governments asked to be judged, by voters and financial markets.

Various things have happened to weaken the Treasury’s political authority over the past five years, though – significantly – none of these has yet seemed to weaken the government’s credibility in the eyes of the markets. First, there was the notorious cooked Brexit forecast published in May 2016, predicting an immediate recession, half a million job losses and a house price crash, should Britain vote to leave. The referendum itself, a mass refusal to view the world in terms of macroeconomics, meant there could be no going back to a world in which politics was dominated by economists.

Consider how different things are now from in Brown’s heyday. Johnson’s first chancellor, Sajid Javid, lasted little more than six months in the job, resigning after one of his aides was sacked by Dominic Cummings without his knowledge. His second, Rishi Sunak, may have high political ambitions and approval ratings, but scarcely forms the kind of double-act with Johnson that Brown did with Blair, or Osborne with Cameron. Johnson’s cabinet is notable for lacking any obvious next-in-line leader.

More interesting are the parts of Whitehall that have suddenly risen in profile under Johnson: communities and local government under Robert Jenrick, and the Department for Digital, Culture, Media and Sport under Oliver Dowden. With the “levelling up” agenda of the former (manifest in such pork-barrel politics as the Towns Fund) and the “culture war” agenda of the latter (evident in attacks on the autonomy of museums), a new vision of government is emerging, one that is no longer afraid of expressing cultural favouritism or fixing deals. Balls and Brown were inspired by “rules rather than discretion”; now there’s no better way to sum up Jenrick’s disgraceful governmental career to date than “discretion rather than rules”.

In the background, of course, are the unique fiscal and financial circumstances produced by Covid, in which all notions of prudence have been thrown out of the window. With the Bank of England buying most of the additional government bonds issued over the last 15 months (beyond the wildest imaginings of Balls and Brown), and with the cost of borrowing close to zero, the rationale for strict fiscal discipline or austerity has currently evaporated. Paradoxically, a situation in which the Treasury can find an emergency £60bn to pay the country’s wages makes for a popular chancellor, but may make for a less powerful Treasury.

Amid all this, Labour is left in an unenviable position, which is in many ways deeply unfair. So long as the Tories are associated with Brexit, England and Johnson, the voters don’t expect them to exercise any kind of discipline, fiscal or otherwise. Meanwhile, Labour remains associated with a Treasury worldview: technocratic, London-centric, British not English, rules not discretion. What’s doubly unfair is that, thanks to the serial fictions of Osborne and the Tory press from 2010 onwards that Labour had “spent all the money”, it is not even viewed as economically trustworthy. In the end, it turned out that public perceptions of financial credibility were largely shaped by political messaging and media narratives, not by adherence to self-imposed fiscal rules.

In the eyes of party members, New Labour will be for ever tarred by Blair and Iraq. In the eyes of much of the country, however, it will be tarred by some vague memory of centralised Brownite spending regimes. The fact that Labour receives so little credit for Brown’s undoubted successes as a spending chancellor is due to many factors, but ultimately consists in the fact that the technocratic, Treasury view of the world was never adequately translated into a political story. Osborne simply presented himself as the inheritor of a centralised “mess” that needed cleaning up.

The recent elections demonstrated that all political momentum is now with the cities and nations of Britain: the Conservatives in leave-voting England, Andy Burnham in Manchester, the SNP in Scotland, Labour in Wales. Rather than making weak gestures towards the union jack or against London, Labour needs to think deeply about the kind of statecraft and policy style that is suited to such a moment, so as to finally leave the world of Granita and “golden rules” behind.

Monday, 17 May 2021

What is Halala?

Why the suspicion on China’s Wuhan lab virus is growing

Tara Kartha in The Print


Members of the World Health Organization (WHO) team at the Wuhan Institute of Virology | Photographer: Hector Retamal/AFP/Getty Images via Bloomberg
 

It’s been nearly eighteen months since the coronavirus brought the world to its knees, with India in the middle of a deadly second wave that is claiming 4,000 lives daily on average. No one can tell when this will end. But it is possible to probe how this catastrophe began, and China’s role in it. Fortunately, even as cover-ups go on, several reports are out in the public domain, and anybody who isn’t afraid of speaking the truth should be able to connect the dots.

One report out is that of the Independent Panel, set up by a resolution of the 73rd World Health Assembly. The specific mission of the committee was to review the response of the World Health Organization (WHO) to the Covid outbreak and the relevant timelines. In other words, it was never meant to be an inquisition on China. And it wasn’t. Not by a long chalk. It skirted the core question of the origin of the virus, even while indulging in what seems to be pure speculation. Then there are two recent publications investigating the origin of the virus, which are worthy of note. Neither is written by sage scientists; both are by analysts viewing the whole sequence of events through the prism of intelligence. Which means that these efforts skip the big words and get to the facts. Collate all these different sources, add a little background colour, and you start to get the big picture.

Is this biological warfare?

The need to find out the truth becomes urgent as the situation worsens, for instance with dangerously high death rates at Aligarh Muslim University, where there is now speculation over whether the deaths could be linked to a separate strain. There are arguments that India’s second wave could be a deliberate one, especially since the ‘double mutant’ has not hit any of its neighbours. Such speculation is likely to rise, given that China has now effectively closed any possibility of withdrawal from Ladakh, and the Chinese economy goes from strength to strength, growing a record 18.3 per cent in the first quarter of the new financial year. Unsurprisingly, even world leaders, like Brazil’s President Jair Bolsonaro, have linked the pandemic to biological warfare.

Arising from this is the biggest potential danger: someone may decide to respond in kind in a bid to fix Beijing. That’s how intelligence operations work. After all, major countries haven’t been funding their top secret labs for nothing. In any scenario, there’s some serious trouble ahead, especially since the Narendra Modi government seems to be more intent on playing down the crisis than addressing it. 

The Independent Panel 

The panel’s mandate is set out clearly in the May 2020 resolution, which calls for “a stepwise process of impartial, independent and comprehensive evaluation…to review experience gained and lessons learned from the WHO-coordinated international health response to COVID-19…” and thereafter provide recommendations. This the panel undoubtedly did.

The 13-member panel included former Prime Minister of New Zealand Helen Clark; the former President of Liberia, who has core expertise in setting up health care after Ebola; an award-winning feminist; a former WHO bureaucrat; and a former Indian health secretary, who had six months’ experience in handling the first Covid wave, was on innumerable panels on health, and, in the manner of civil servants in this country, also did a stint in the Ministry of Defence, not to mention the World Bank. The Indian representative certainly gives the whole exercise the imprimatur of legitimacy, given hostile India-China relations. And finally, there is Zhong Nanshan, who was an adviser to the Chinese government during the Wuhan outbreak, and who received the highest state honour, the Medal of the Republic, from his President; Xinhua describes him as a “brave and outspoken” doctor.

The WHO also lists Peng Liyuan as a Goodwill Ambassador, describing her as a famous opera singer. She is the wife of President Xi Jinping. It’s, therefore, entirely unsurprising that while the panel diligently shows the WHO the sources of early warning, it makes a vague case on the origins of the virus, noting that while a species of bat was “probably” the host, the intermediate cycle is unknown. Most astonishingly, the committee states that the virus “may already have been in circulation outside China in the last months of 2019”. No evidence for that either. The overall tenor of the report is that it would take years to sort all this out.

There is only one paragraph of note from the point of view of those seeking the truth.

The panel’s report states that only 55-60 per cent of early cases had been exposed to the wet markets, and that the area merely “amplified” the virus. In other words, the market, with its hundreds of exotic wildlife species, could not have been the source. It, however, carefully notes that the Wuhan Institute of Virology (WIV) sequenced the entire genome of the virus within weeks, and later provided this to a public access source. The report praises the diligence of clinicians who managed to isolate the virus within a short time. That’s wonderful all right. No question. But it doesn’t at all address the question of whether those diligent researchers were also experimenting on the virus.

That troubling question

These assessments by the Independent Panel are, however, now being questioned, leading to bits of intelligence being pieced together from within a country that would put the term ‘Iron Curtain’ to shame. An earlier WHO study on the virus’ origin was roundly condemned by a group of countries, including the US, Australia, Canada and others (not India), as duplicitous in the extreme.

In January 2021, the US Department of State released a Fact Sheet on the activity of the WIV, which is entirely based on intelligence. That fact sheet is damning, indicating that several researchers at the institute had fallen ill with symptoms characteristic of the Covid virus, thus showing up senior Chinese researcher Shi Zhengli’s claim that there was “zero infection” in the lab. The lab had been the centre of research on the SARS virus since its first outbreak, including on the ‘RaTG13’ virus found in bats, which is 96 per cent similar to the present virus, SARS-CoV-2. Worse, it also pointed out that “the United States has determined that the WIV has collaborated on publications and secret projects with China’s military. The WIV has engaged in classified research, including laboratory animal experiments, on behalf of the Chinese military since at least 2017”.

That’s intelligence. Now for the analysis — the two recent publications probing the virus’ origin. 

Disaggregating the facts 

One analytical article is published in the prestigious Bulletin of the Atomic Scientists. Another is a paper by the equally reputed Begin-Sadat Center for Strategic Studies. Both are careful collations of facts, and establish the following.

The paper by the Begin-Sadat Center brings out additional information that bolsters the US Fact Sheet. It appears that the US had been able to get a ‘source’ from the WIV directly, and that another Chinese scientist had defected to an unknown European country. That led directly to information on the military side of the programme. The study also quotes David Asher, who led the Department of State investigation. Asher observes that the WIV had two campuses, not one, as popularly believed. This was known to the Indian authorities for years, but does not seem to have been put about. Asher also adds that all mention of the SARS virus was dropped from the institute’s publicly admitted biological “defence programmes” by 2017, at the same time as the Level 4 lab kicked off operations.

Even more surprising was that an adjacent facility had already administered vaccines to its senior faculty in March 2020 itself. That doesn’t suggest an accident. That suggests a programme that was designed to kill, and for which vaccines were already under research. Then there is the damning assessment quoted in the study: “There are plenty of indications in the sequence itself that [the initial pandemic virus] may have been synthetically altered. It has the backbone of a bat [coronavirus], combined with a pangolin receptor binder, combined with some sort of humanized mice transceptor. These things don’t actually make sense (and) … the odds this could be natural are very low… [but this is attainable] through deliberate scientific ‘gain of function’ research” that was going on at the WIV.

There is no doubt that ‘gain of function’ research is practised in biological research labs the world over, sometimes resulting in dangerous incidents. This type of research involves crossing viruses, ostensibly to gain knowledge on how to battle the disease from within. In these cases, it’s almost impossible to decide where the ‘defence’ aspect leaches into an offensive capability. That these findings came from US scientists who were ‘fearful’ of being quoted shows not just the extent of Beijing’s clout in university research and funding, but also a high degree of restraint. Biological research is almost never talked about.

The denials begin

Biological research, and the secrecy around it, is the focus of Nicholas Wade’s article published in the Bulletin of the Atomic Scientists. As he writes, from the beginning there was denial at the highest levels, from some unexpected quarters. The first came in The Lancet, one of the oldest journals of medical research, from a group of authors in March 2020, when the pandemic had just broken out. Even to a layman, it would have seemed far too early for the group to contemptuously dismiss ‘conspiracy theories’ that the virus was not of natural origin.

It turns out that The Lancet letter was drafted by Peter Daszak, president of the EcoHealth Alliance of New York, whose organisation funded coronavirus research at the Wuhan lab. As Wade’s article points out, any revelation of such a connection would have been criminal, to say the least, had it been proved that the virus did escape from the lab. Unsurprisingly, Daszak was also part of the WHO team investigating the origins of the virus.

Another burst of outrage came from a group of professors who also hurried, in an article, to disprove the ‘lab created’ theory, on the grounds – simply put – that the virus was not of the most probable calculated design. The lead author, Kristian G. Andersen, is from the Scripps Research Institute, La Jolla, which specialises in biomedical research. It also has partnerships with Chinese labs and pharma companies. None of that is criminal, especially since Scripps was already in financial distress at the time. Besides, such collaborations are not restricted to US labs. See, for instance, the account of Australian doctor Dominic Dwyer, who was part of the first WHO study, and who dismissed, without presenting any evidence, the claim that the virus had leaked from a lab.

Dwyer’s claim that the Wuhan lab seemed to have been run well, and that nobody from the facility seemed to have fallen sick, has now been disputed. Evidence of a dangerous virus escaping a lab – as has happened in the past on what he calls “rare” occasions – would mean a death blow to labs everywhere. Funding is, after all, hard to come by. Then there is the nice hard cash involved. Harvard professor Dr Charles Lieber, who was arrested together with two Chinese nationals for collaborating quietly with the Wuhan University of Technology (WUT), was being paid roughly $50,000 per month, plus living expenses of up to 1,000,000 Chinese yuan (approximately $158,000), and was awarded $1.5 million to establish a research lab at WUT. He was also asked to ‘cultivate’ young teachers and PhD students by organising international conferences.

It’s all very pally and friendly, and a lot of money is involved. The end result? A virus out of hell that seems not to affect China, whose economy powers ahead and shifts its weight more comfortably into its rising position in the global order.

How to avoid the return of office cliques

Some managers are wary of telling staff that going into a workplace has networking benefits, writes Emma Jacobs in The FT

After weighing up the pros and cons of future working patterns, Dropbox decided against the hybrid model — in which the working week is split between the office and home. “It has some pretty significant drawbacks,” says Melanie Collins, chief people officer. Uppermost is that it “could lead to issues with inclusion, or disparities with respect to performance or career trajectory”. In the end, the cloud storage and collaboration platform opted for a virtual-first policy, which prioritises remote work over the office.

As offices open, there are fears that if hybrid is mismanaged, organisational power will revert to the workplace with executives forming in-office cliques and those employees who seek promotion and networking opportunities switching back to face time with senior staff as a way to advance their careers.

The office pecking order 

Status-conscious workers may be itching to return to the office, says Tomas Chamorro-Premuzic, professor of business psychology at Columbia University and UCL. “Humans are hierarchical by nature, and the office always conveyed status and hierarchy — car parking spots, cars, corner office, size, windows. The risk now is that, in a fully hybrid and flexible world, status ends up positively correlated with the number of days at the office.” 

This could create a two-tier workforce: those who want flexibility to work from home — notably those with caring responsibilities — and those who gravitate towards the office. Rosie Campbell, professor of politics and director of the Global Institute for Women’s Leadership at King’s College London, says that past research has shown that “part-time or remote workers tend not to get promoted”. This has been described as the “flexibility stigma”, defined as the “discrimination and negative perception towards workers who work flexibly, and [consequent] negative career outcomes”.

Research by Heejung Chung, reader in sociology and social policy at Kent University, carried out before the pandemic, found that “women, especially mothers (of children below 12) [were] likely to have experienced some sort of negative career consequence due to flexible working”. Lockdowns disproportionately increased caring responsibilities for women, through home-schooling and closure of childcare facilities. 

Missing out on career development 

Some companies are creating regional hubs or leasing local co-working spaces so that workers can go to offices closer to home, reducing commute times and the costs of expensive office space. Lloyds Banking Group is among a number of banks, for example, that have said they will use surplus space in their branches for meetings. The risk, Campbell says, is workers using local offices miss out on exposure to senior leaders and larger networks that might advance their careers. “People might say it’s easier to be at home or use suburban hubs but it might actually be better to go into the office. Regional or suburban hubs are giving you a place to work that isn’t at home but isn’t giving you any of the face time.” 

Employers and team leaders may need to be explicit about the purpose of the office: not only is it a good place for collaborating with teams and serendipitous conversations but also for networking.  
 
Mark Mortensen, associate professor of organisational behaviour at Insead, points out it is difficult — and paternalistic — as a manager to suggest an employee spends more time in the office to boost their career. A recent opinion article by Cathy Merrill, chief executive of Washingtonian Media, in the Washington Post sparked a huge backlash on social media and, more importantly, among her employees, for arguing that those who do not return to the office might find themselves out of a job. “The hardest people to let go are the ones you know,” she wrote.

Her staff felt their remote work had been unappreciated and were angry that they had not been consulted over future work plans — so they went on strike. 

Mortensen does not advise presenting staff with job loss threats, but puts forward a case for frank and open conversations about the value of time in the office. “Informal networks aren’t just nice to have, they are important. We need to tell people the risk is if you are working remotely you will be missing out on something that might prove beneficial in your career. It’s tough. People will say they sell things on their skills but you have to be honest and say that relationships are important. Weak ties can be the most critical in shaping people’s career paths.” 

The problem is that, after dealing with a pandemic and lockdowns, workers may not be in the best place to know what they want out of future work patterns. Chamorro-Premuzic fears that even people who are enjoying it right now may not realise “they are burnt out. It’s like the introvert who likes working from home, they’re playing to their strength — staying in their own comfort zone.”

Examine workplace culture 

As employers try to configure ways of working they need to scrutinise workplace culture and find out why employees might prefer to be at home. Some will have always felt excluded from networks and sponsorship in the office — and being away from it means that they do not have to think about it. 

Future Forum, Slack’s future of work think-tank, found that black knowledge workers were more likely to prefer a hybrid or remote work model because the office was a frequent reminder “of their outsider status in both subtle (microaggressions) and not-so-subtle (overt discrimination) ways”. It said the solution was not to give “black employees the ability to work from home, while white executives return to old habits [but] about fundamentally changing your own ways of working and holding people accountable for driving inclusivity in your workplace”. 

Some experts believe that the pandemic has fundamentally altered workplace behaviour. Tsedal Neeley, professor of business administration at Harvard Business School and author of Remote Work Revolution, is optimistic. “Individuals are worried about their career trajectory because the paranoia is, ‘If we don’t go to the office will we get the same opportunities and career mobility if we’re not physically in the office?’ These would be very legitimate worries 13 months ago but less of a concern now.” 

Chung co-authored a report by Birmingham University that found more fathers taking on caring responsibilities and an increase in the “number of couples who indicate that they have shared housework [and] care activities during lockdown”. This might shift couples’ attitudes to splitting work and home duties and alter employers’ stigmatisation of flexible working. 

Prevent an in-crowd 

There are some measures that employers can take to try to prevent office cliques forming. Some workplaces will require teams to come in on the same days so employees get access to their manager, rather than leaving it to individuals to arrange their own office schedules. Though this would mean team members might not get access to senior leaders or form ties with other teams, as they might have done when the office was the default.

Lauren Pasquarella Daley, senior director of women and the future of work at Catalyst, a non-profit that advocates for women at work, says senior executives need to be “intentional about sponsorship and mentoring” rather than letting these relationships form by chance. 

They must also be role models for flexible working. “If employees don’t feel it’s OK to take advantage of remote work then they won’t do so.” This means ensuring meetings are documented. If, for example, one person is working outside the office then everyone needs to act as if they are remote, too. 

Chamorro-Premuzic says managers should work on the assumption that in-office cliques will form. This means organisations need to put in place clearer objectives, performance measures independent of where people are, and the measuring and monitoring of bias, as well as training leaders and managers in how to be inclusive. For example, if you know how often people come to work, you can test whether there is a correlation between being at work and getting a positive performance review, which would suggest bias or adverse impact.

“We may not have tonnes of data on the disparate impact of hybrid policies on underprivileged groups, but it is naive to assume it won’t happen. The big question is how to mitigate it,” he says.

Sunday, 16 May 2021

Islamophobia And Secularism

Nadeem F Paracha in The Dawn

Prime Minister Imran Khan frequently uses the term ‘Islamophobia’ while commenting on the relationship between European governments and their Muslim citizens. Khan often laments the treatment meted out to Muslims in Europe, but has been accused of remaining conspicuously silent about cases of religious discrimination in his own country.

Then there is also the case of Khan not uttering a single word about the Chinese government’s apparently atrocious treatment of the Muslim population of China’s Xinjiang province.

Certain laws in European countries are sweepingly described as being ‘Islamophobic’ by Khan. When European governments retaliate by accusing Pakistan of constitutionally encouraging acts of bigotry against non-Muslim groups, the PM bemoans that Europeans do not understand the complexities of Pakistan’s ‘Islamic’ laws.

Yet, despite the PM repeatedly claiming to know the West like no other Pakistani does, he seems to have no clue about the complexities of European secularism.

Take France, for instance. French secularism, called ‘Laïcité’, is somewhat different from the secularism of various other European countries and the US. According to Charles Taylor, a contemporary scholar of Western secularism, French secularism is required to play a more aggressive role.

In his book, A Secular Age, Taylor demonstrates that even though the source of Western secularism was common — i.e. the emergence of ‘modernity’ and its political, economic and social manifestations — secularism evolved in Europe and the US in varying degrees and of different types. 

Secularism in the US remains largely impersonal towards religion. But in France and in some other European countries, it encourages the state/government to proactively discourage even certain cultural dimensions of faith in the public sphere which, it believes, have the potential of mutating into becoming political expressions.

Nevertheless, to almost all prominent philosophers of Western democracy across the 19th and 20th centuries, the idea of providing freedom to practise religion is inherent in secularism, as long as this freedom is not used for any political purposes.

According to the American sociologist Jacques Berlinerblau in A Call to Arms for Religious Freedom, six types of secularism have evolved. The American researcher Barry Kosmin divides secularism into two categories: ‘soft’ and ‘hard’. Most of Berlinerblau’s types fall in the ‘soft’ category. The hard one is ‘State Sponsored Atheism’ which looks to completely eliminate religion. This type was practised in various former communist countries and is presently exercised in China and North Korea. One can thus place Laïcité between Kosmin’s soft and hard secular types.

The existence of what is called ‘Islamophobia’ in secular Europe and the US has increasingly drawn criticism from various quarters. According to the French author Jean-Loïc Le Quellec, the term is derived from the French word ‘islamophobie’ that was first used in 1910 to describe prejudice against Muslims.

L.P. Sheridan writes in the March 2006 issue of the Journal of Interpersonal Violence that the term did not become widely used until 1991. According to Roland Imhoff and Julia Recker in the Journal of Political Psychology, a wariness had already been building in the West towards Muslims because of the aggressively anti-West ‘Islamic’ Revolution in Iran in 1979, and the violent backlash in some Muslim countries against the publication of the novel The Satanic Verses by the British author Salman Rushdie in 1988.

Islamophobia is one of the many expressions of racism towards ‘the other’. Racisms of varying nature have long been present in Europe and the US. Therefore Imhoff and Recker see Islamophobia as “new wine in an old bottle.” It is a relatively new term, but one that has also been criticised.

Discrimination against race, faith, ethnicity, caste, etc., is present in almost all countries. But its existence gets magnified when it is present in countries that describe themselves as liberal democracies.

Islamophobia is often understood as a phobia of Islam itself, but there are those who find this definition problematic. To the term’s most vehement critics, not only has it overshadowed other aspects of racism, of which there are many, it is also mostly used by ‘radical Muslims’ to curb open debate.

In a study, the University of Northampton’s Paul Jackson writes that the term should be replaced with ‘Muslimphobia’, because the racism in this context is aimed at a people, not at the faith as such. However, he does add that the faith too should remain open to academic debate.

In an essay for the 2016 anthology The Search for Europe, Bichara Khader writes that racism against non-white migrants in Europe intensified in the 1970s because of a severe economic crisis. Khader writes that this racism was not pitched against one’s faith.

According to Khader, whereas this meant that South Asian, Arab, African and Caribbean migrants were treated as an unwanted whole based on the colour of their skin, from the 1980s onwards the Muslims among these migrants began to prominently assert their distinctiveness. As the presence of veiled women and mosques grew, the ‘migration problem’ began to be seen as a ‘Muslim problem’.

The Muslim diaspora in the West began to increasingly consolidate itself as a separate whole. Mainly through dress, Muslim migrants began to shed the identity of their original countries, creating a sort of universality of Muslimness.

But this also separated them from the non-Muslim migrant communities, who were facing racial discrimination as well. Interestingly, this imagined universality of Muslimness was also exported back to the mother countries of Muslim migrants.

Take the example of how, in Pakistan, some recent textbooks have visually depicted the dress choices of Pakistani women. The women are dressed almost exactly as some second- and third-generation Muslim women in the West imagine a woman should dress.

But this depiction drew criticism within Pakistan. The critics maintained that the government was trying to engineer a cultural template of how women ought to dress in a country where, unlike in some other Muslim countries, veiling is neither mandatory nor banned. This has only further highlighted the fact that identity politics of this kind in Pakistan is being influenced by the identity politics flexed by certain Muslim groups in the West.

Either way, being a recent phenomenon, identity politics of this nature is not organic as such, and will continue to cause problems for Muslims both at home and abroad.