Larry Elliott in The Guardian
Three times a week an update on new Covid-19 cases is published by the economics consultancy Pantheon. Vaccination rates are monitored by the Swiss bank UBS. The scientists advising the government are in regular contact with the Bank of England’s monetary policy committee – the body that sets interest rates.
Richard Nixon may or may not have said “we are all Keynesians now” after the US broke its link with gold in 1971 but one thing is for sure: all economists are epidemiologists now. And there’s a downside and an upside to that.
The downside is that economic forecasting is currently even more of a mug’s game than usual because even the real (as opposed to the amateur) epidemiologists don’t really know what is going to happen next. Are there going to be new mutations of the virus? Assuming there are, will they be less susceptible to vaccines? Will Covid-19 go away in the summer only to return again as the days get shorter, as happened last year? Nobody really knows the answers to those questions.
The upside is that the pandemic has forced economists to look beyond their mechanical models and embrace thinking from other disciplines, of which epidemiology is just one.
For a start, it is hard to estimate how people are going to react to the easing of lockdown restrictions without some help from psychologists. It is possible that there will be an explosion of spending as consumers, in the words of Andrew Bailey, “go for it”, but it is also possible that the second wave of infection will make them a lot more cautious than they were last summer, when there was still hope that Covid-19 was a fleeting phenomenon.
An individual’s behaviour is also not entirely driven by their own economic circumstances. It can be strongly affected by what others are doing. If your peer group decides after having the vaccine that it is safe to go to the pub, that will probably affect your decision about whether to join your mates for a drink, even if you are slightly nervous. Sociology has a part to play in economic forecasting.
As does history, if only to a limited extent, because there are not a lot of comparable episodes to draw upon. A century has passed since the last truly global pandemic and there is only so much that can be learned from the outbreak of Spanish flu after the first world war. But when Andy Haldane, the chief economist of the Bank of England, says the economy is like a coiled spring waiting to be unleashed, that’s because he thinks there are lessons to be learned from the rapid recovery seen last summer. Back then, the economy followed a near 19% collapse in the second quarter of 2020 with a 16% jump in the third quarter.
Naturally, economics has a part to play in judging what happens next. Millions of people (mostly the better off) have remained in work on full pay for the past year but have struggled to find anything to spend their money on. Millions of others – those furloughed on 80% of their normal wages or self-employed people who have slipped through the Treasury’s safety net – are less well-off than they were a year ago and may fear for their job prospects.
In an ideal world, the better-off would decide that the amount of money saved during lockdown was far in excess of what they needed and would then go on a spending spree: heading out for meals, taking weekend breaks, buying new cars, having their homes redecorated. That would provide jobs and incomes for the less well-off.
But it might not work out like that. If the better-off leave their accumulated savings (or most of them, at least) in the bank, that means higher unemployment for those working in consumer-facing services jobs – such as hotels and restaurants – and an economy with a dose of long Covid.
There are two conclusions to be drawn from all of this. The first is that precise forecasts of what is going to happen to the economy over the next year, or even the next few months, should be treated with caution. Assuming the vaccination programme continues to go well, assuming that there are no further waves of infection, assuming restrictions are lifted steadily from early March onwards, and assuming that people come out of hibernation rapidly and in numbers, then the economy will start to recover in the second quarter. But there are a heck of a lot of assumptions in there: it might take until the third quarter for the bounce back to begin; the recovery might prove weaker or stronger than the consensus currently expects.
The second conclusion is equally obvious. If, as is clearly the case, the existence of so many imponderables makes precision forecasting more difficult than normal, it makes sense for economic policy makers to act with caution. For the Bank of England, that means no dash to embrace negative interest rates, which won’t be necessary if Haldane’s bullishness proves to be justified; and for the Treasury it means extending financial support and ignoring calls for higher taxes, especially those that might lead businesses to collapse or cut back on investment.
It would appear that Rishi Sunak has reached the same conclusion. There has been far less talk from the chancellor recently about the need to reduce the UK’s budget deficit, a process that has now been delayed until the second budget of 2021 in the autumn. By that stage, it might well once again be Sunak rather than the epidemiologists running the economy. Well, perhaps.
Sunday, 14 February 2021
Distorting Narratives for Nation States
Nadeem F Paracha in The Dawn
In the last few years, my research, in areas such as religious extremism, historical distortions in school textbooks, the culture of conspiracy theories and reactionary attitudes towards science, has produced findings that are a lot more universal than one suspected.
This century’s second decade (2010-2020) saw some startling political and social tendencies in Europe and the US, which mirrored those in developing countries. Before that decade, these tendencies had repeatedly been commented upon in the West as if they were entirely specific to poorer regions. Even though many Western historians, while discussing the presence of religious extremism, superstition or political upheaval in developing countries, agreed that these had existed in developed countries as well, they insisted that in their own countries they had largely been confined to the nations’ teething years.
In 2012, at a conference in London, I heard a British political scientist and an American historian emphasise that the problems that developing countries face — i.e. religiously-motivated violence, a suspect disposition towards science, and continual political disruption — were present at one time in developed countries as well, but had been overcome through an evolutionary process, and by the construction of political and economic systems that were self-correcting in times of crisis. What these two gentlemen were suggesting was that most developing countries were still at a stage that the developed countries had been two hundred years or so ago.
However, eight years after that conference, Europe and the US, it seems, have been flung back two hundred years in the past. Mainstream political structures there have been invaded by firebrand right-wing populists, dogmatic ‘cultural warriors’ from the left and the right are battling it out to define what is ‘good’ and what is ‘evil’ — in the process, wrecking the carefully constructed pillars of the Enlightenment era on which their nations’ whole existential meaning rests — the most outlandish conspiracy theories have migrated from the edges of the lunatic fringe into the mainstream, and science is being perceived as a demonic force out to destroy faith.
Take for instance, the practice of authoring distorted textbooks. Over the years, some excellent research cropped up in Pakistan and India that systematically exposed how historical distortions and religious biases in textbooks have contributed (and still are contributing) to episodes of bigotry in both the countries. During my own research in this area, I began to notice that this problem was not restricted to developing countries alone.
In 1971, a joint study by a group of American and British historians showed that, of the 36 British and American school textbooks they examined, no fewer than 25 contained inaccurate information and ideological bias. In 2007, the American sociologist James W. Loewen surveyed 18 American history texts and found them to be “marred by an embarrassing combination of blind patriotism, sheer misinformation, and outright lies.” He published his findings in the aptly titled book Lies My Teacher Told Me.
In 2020, 181 historians in the UK wrote an open letter demanding changes to the history section of the British Home Office’s citizenship test. The campaign was initiated by the British professor of history and archeology Frank Trentmann. A debate on the issue, through an exchange of letters between Trentmann and Stephen Parkinson, a former Home Office special adviser, was published in the August 23, 2020 issue of The Spectator. Trentmann lamented that the test’s history pages were marred by a combination of errors, omissions and distortions.
Not only is the distortion of history textbooks a universal practice, but the ways in which it is done are equally universal and cut across competing ideologies. In Textbooks as Propaganda, the historian Joanna Wojdon demonstrates the methods that were used by the state in this respect in communist Poland (1944-1989).
The methods of distortion in this case were similar to those used in other former communist dictatorships, such as the Soviet Union and its satellite states in Eastern Europe, and in China. The same methods were also employed by totalitarian regimes in Nazi Germany, and in fascist Italy and Spain.
And if one examines the methods of distorting history textbooks, as examined by Loewen in the US and Trentmann in the UK, one can come across various similarities between how it is done in liberal democracies and how it was done in totalitarian set-ups.
I once shared this observation with an American academic in 2018. He somewhat agreed but argued that, because of the Cold War (1945-1991), many democratic countries were pressed to adopt certain propaganda techniques that were originally devised by communist regimes. I tend to disagree, for if this were the reason, how is one to explain the publication of the book The Menace of Nationalism in Education by Jonathan French Scott in 1926 — almost 20 years before the start of the Cold War?
Scott meticulously examined history textbooks being taught in France, Germany, Britain and the US in the 1920s. It is fascinating to see how the methods used to write textbooks, described by Scott as tools of indoctrination, are quite similar to those applied in communist and fascist dictatorships, and how they are being employed in both developing as well as developed countries.
In a nutshell, no matter what ideological bent is being welded into textbooks in various countries, it has always been about altering history through engineered stories as a means of promoting particular agendas. This is done by concocting events that did not happen, altering those that did take place, or omitting events altogether.
It was Scott who most clearly understood this as a problem that is inherent in the whole idea of the nation state, which is largely constructed by clubbing people together as ‘nations’, not only within physical but also ideological boundaries.
This leaves nation states always feeling vulnerable, fearing that the glue that binds a nation together, largely manufactured ideas of ethnic, religious or racial homogeneity, will wear off. Thus the need is felt to keep it intact through continual historical distortion.
Monday, 8 February 2021
The biggest lesson of GameStop
Rana Foroohar in The FT
Much has been written about whether the GameStop trading fiasco is the result of illegal flash mobs or righteous retail investors storming a rigged financial system. Robinhood’s decision to block its retail customers from purchasing the stock while hedge funds continued trading elsewhere has turned the event into a David and Goliath story.
But that story is predicated on a false idea, which is that markets that have been “democratised” and that people trading on their phones somehow represent a more inclusive capitalism.
They do not. Markets and democracy are not the same thing, although most politicians — Democrats and Republicans — have acted since the 1980s as if they were. That period was marked by market deregulation, greater central bank intervention to smooth out the business cycle via monetary policy following the end of the Bretton Woods exchange rate system, and the rise of shareholder capitalism. Together, these began moving the American economy from one in which prosperity was based on secure employment and income growth to one in which companies and many consumers focused increasingly on ever-rising asset prices as the most important measure of economic health.
Right now, short-term fiscal stimulus aimed at easing the economic pain from Covid-19 is distorting the picture. But putting that aside, the US economy is at a point where capital gains and distributions from individual retirement accounts make up such a large proportion of personal consumption expenditure that it would be difficult for growth to continue if there were a major correction in asset prices.
That is one reason why the GameStop story has so unnerved people. It reminds Americans how incredibly dependent we all are on markets that can be very, very volatile.
The 40-year shift towards what President George W Bush referred to as an “ownership society” came at a time when the nature of the corporation and the compact between business and society was changing, too. The two phenomena are, of course, not unrelated.
The transformation of markets put more short-term pressure on companies, which cut costs by outsourcing, automating, using less union labour and dumping defined-benefit pensions for 401(k) plans, which put responsibility for choosing investments, and the risks of bad outcomes, on individual workers. In 1989, 31 per cent of American families held stock. Today it is nearly half. Now, it seems, we are all day traders. My 14-year-old recently told me I should “buy the dip,” which did nothing to quell my fears that we are in the midst of an epic bubble.
GameStop is the perfect reflection of all of this. The ultimately unsuccessful effort to squeeze short-sellers by pushing up the share price illustrates the risks of the markets. At the same time, the company itself illustrates how the nature of employment has changed. In a 2015 Brookings paper, University of Michigan sociologist and management professor Jerry Davis tracked the job growth linked to every initial public offering from 2000 to 2014 and found that the single largest creator of organic new employment was, amazingly, GameStop. The then fast-growing retail chain had an army of mostly part-time game enthusiasts who generally made just under $8 an hour. They were “the new face of job creation in America,” wrote Davis, whose 2009 book Managed by the Markets is a wonderful history of the rise of the “ownership” society.
I contacted Davis, who is now at Stanford University working on a new book about the changing nature of the corporation, to ask his thoughts about GameStop and the controversy surrounding it. He sums up the big picture about as well as anyone could: “Rescuing an extremely low-wage employer from short-sellers by pumping up its stock is not exactly storming the Bastille.” What’s more, he adds, “Robinhood easing access to stock trading does not democratise the stock market any more than Purdue Pharma democratised opioid addiction. Democracy is about voice, not trading.”
I hope that politicians and regulators keep this core truth in mind during the coming hearings about GameStop and Robinhood. I fully expect Treasury secretary Janet Yellen will, based on her recent pledge to staff to address long-term inequality.
While apps and social media have led more people to trade shares, that has not made our system of market-driven capitalism stronger. Our economy is largely based on consumer spending, and that consumption rests on asset price inflation which can now be brewed up by teenagers in their bedrooms. If current employment trends continue, many of the latter will end up working gig economy jobs without a safety net to catch them when their portfolios collapse.
That is neither sustainable nor supportive of liberal democracy. That is why I applaud Joe Biden’s core economic promise to move the US economy from one that prioritises “wealth” to one that rewards work.
The details of the GameStop debacle should be parsed and any villains punished. But we must not lose sight of the main lesson: an economy in which individual fortunes are so closely tied to the health of the stock market rather than income growth is fragile. Speculation, no matter how widely shared, isn’t democracy.
Sunday, 7 February 2021
The Death of The Intellect
Nadeem Paracha in The Dawn
One point that supporters of Prime Minister Imran Khan really like to assert is that he is “a self-made man.” They insist that the country should be led by people like him and not by those who were ‘born into wealth and power.’
According to the American historian Richard Hofstadter, such views are largely aired by the middle-classes. To Hofstadter, this view also has an element of ‘anti-intellectualism.’ In his 1963 book, Anti-intellectualism in American Life, Hofstadter writes that, as the middle-class manages to attain political influence, it develops a strong dislike for what it sees as a ‘political elite.’ But since this elite has more access to better avenues of education, the middle-class also develops an anti-intellectual attitude, insisting that a self-made man makes a better ruler than a better-educated one.
Khan’s core support comes from Pakistan’s middle-classes. And even though he graduated from the prestigious Oxford University, he is more articulate when speaking about cricket — a sport that once turned him into a star — than about any of the issues he is supposed to be addressing as the country’s prime minister.
But many of his supporters do not have a problem with this, especially in contrast to his equally well-educated opponents, Bilawal Bhutto and Maryam Nawaz, who sound a lot more articulate in matters of politics. To Khan’s supporters, these two are from ‘dynastic elites’ who cannot relate to the sentiments of the ‘common people’ like a self-made man can.
It’s another matter that Khan is not the kind of self-made man that his supporters would like people to believe. He came from a well-to-do family that had roots in the country’s military-bureaucracy establishment. He went to prestigious educational institutions and spent most of his youth as a socialite in London. Indeed, whereas the Bhutto and Sharif offspring were born into wealth and power, which is aiding their climb in politics, Khan’s political ambitions were carefully nurtured by the military establishment.
Nevertheless, perhaps conscious of the fact that his personality is not suited to support an intellectual bent, Khan has positioned himself as a self-made man who appeals to the ways of the ‘common people.’ He doesn’t.
For example, wearing the national dress and using common everyday Urdu lingo does not cut it anymore. It did when the former PM Z.A. Bhutto did the same. But years after his demise in 1979, such ‘populist’ antics have become a worn-out cliché. The difference between the two is that Bhutto was a bona fide intellectual. Even his idea to present himself as a ‘people’s man’ was born of a rigorous intellectual scheme. However, Khan does appeal to that particular middle-class disposition that Hofstadter was writing about.
When he attempts to sound profound, his views usually appear to be a mishmash of theories of certain Islamic and so-called ‘post-colonial’ scholars. The result is rhetoric that actually ends up smacking of anti-intellectualism.
So what is anti-intellectualism? It is understood to be a view that is hostile to intellectuals. According to Walter E. Houghton, in the 1952 edition of the Journal of the History of Ideas, the term’s first known usage dates back to 1881 in England, when science and ideas such as the ‘separation of religion and the state’ and the ‘supremacy of reason’ had gained momentum.
This triggered resentment in certain sections of British society, which began to suspect that intellectuals were formulating these ideas to undermine the importance of theology and long-held traditions.
According to the American historian Robert D. Cross, as populism started to become a major theme in American politics in the early 20th century, some mainstream politicians politicised anti-intellectualism as a way to portray themselves as men of the people. For example, US presidents Theodore Roosevelt (1901-1909) and Woodrow Wilson (1913-1921) insisted that ‘character was more important than intellect.’
Across the 20th century, the politicised strand of anti-intellectualism was active in various regions. Communist regimes in China, the Soviet Union and Cambodia systematically eliminated intellectuals after describing them as remnants of overthrown bourgeois cultures. In Germany, the far-right intelligentsia differentiated between ‘passive intellectuals’ and ‘active intellectuals.’ Apparently, the passive intellectuals were abstract and thus useless whereas the active ones were ‘men of action.’ Hundreds of so-called passive intellectuals were harassed, exiled or killed in Nazi Germany.
In the 1950s, intellectuals in the US began to be suspected by firebrand members of the Republican Party of serving the interests of communist Russia. In former East Pakistan, hundreds of intellectuals were violently targeted for supporting Bengali nationalism.
But whereas these forms of anti-intellectualism were emerging from established political forces from both the left and the right, according to the American historian of science Michael Shermer, a more curious idea of anti-intellectualism began to develop within Western academia.
In the September 1, 2017 issue of Scientific American, Shermer writes that this was because ‘postmodernism’ had begun to ‘hijack’ various academic disciplines in the 1990s.
Postmodernism emerged in the 20th century as a critique of modernism. It derided modernism as a destructive force that had used its ideas of secularism, democracy, economic progress, science and reason as tools of subjugation. Shermer writes that, by the 1990s, postmodernism was positing that there was no objective truth and that science and empirical facts are tools of oppression. This is when even the celebrated leftist intellectual Noam Chomsky began to warn that postmodernism had turned anti-science.
‘Post-colonialism’, or the critique of the remnants of Western colonialism, was very much a product of postmodernism as well. Oliver Lovesey, in his book The Postcolonial Intellectual, and the historian Arif Dirlik, in a 1994 issue of Critical Inquiry, take post-colonialism to task as a discipline now populated by non-white groups of academics who found themselves in positions of privilege in Western universities.
Lovesey quotes the Slovenian philosopher Slavoj Žižek as saying, “Post-colonialism is the invention of some rich guys from India who saw that they could make a good career in top Western universities by playing on the guilt of white liberals.”
Imran Khan is a classic example of how postmodernism and post-colonialism have become cynical anti-intellectual pursuits. Khan often reminds us that social and economic progress should not be undertaken to please the West because that smacks of a colonial mindset.
So, as his regime presides over a nosediving economy and severe political polarisation, the PM was recently reported (in the January 22 issue of The Friday Times) as discussing with his ministers whether he should mandate the wearing of the dupatta by all women TV anchors. Go figure.
Saturday, 6 February 2021
The parable of John Rawls
Janan Ganesh in The FT
In the latest Pixar film, Soul, every human life starts out as a blank slate in a cosmic holding pen. Not until clerks ascribe personalities and vocations does the corporeal world open. As all souls are at their mercy, there is fairness of a kind. There is also chilling caprice. And so Pixar cuts the stakes by ensuring that each endowment is benign. No one ends up with dire impairments or unmarketable talents in the “Great Before”.
Kind as he was (a wry Isaiah Berlin, it is said, likened him to Christ), John Rawls would have deplored the cop-out. This year is the 50th anniversary of the most important tract of political thought in the last century or so. To tweak the old line about Plato, much subsequent work in the field amounts to footnotes to A Theory of Justice. Only some of this has to do with its conclusions. The method that yielded them was nearly as vivid.
Rawls asked us to picture the world we should like to enter if we had no warning of our talents. Nor, either, of our looks, sex, parents or even tastes. Don this “veil of ignorance”, he said, and we would maximise the lot of the worst-off, lest that turned out to be us. As we brave our birth into the unknown, it is not the average outcome that troubles us.
From there, he drew principles. A person’s liberties, which should go as far as is consistent with those of others, can’t be infringed. This is true even if the general welfare demands it. As for material things, inequality is only allowed insofar as it lifts the absolute level of the poorest. Some extra reward for the hyper-productive: yes. Flash-trading or Lionel Messi’s leaked contract: a vast no. Each of these rules puts a floor — civic and economic — under all humans.
True, the phrase-making helped (“the perspective of eternity”). So did the timing: 1971 was the Keynesian Eden, before Opec grew less obliging. But it was the depth and novelty of Rawls’s thought that brought him reluctant stardom.
Even those who denied that he had “won” allowed that he dominated. Utilitarians, once-ascendant in their stress on the general, said he made a God of the individual. The right, sure that they would act differently under the veil, asked if this shy scholar had ever met a gambler. But he was their reference point. And others’ too. A Theory might be the densest book to have sold an alleged 300,000 copies in the US alone. It triumphed.
And it failed. Soon after it was published, the course of the west turned right. The position of the worst-off receded as a test of the good society. Robert Nozick, Rawls’s libertarian Harvard peer, seemed the more relevant theorist. It was a neoliberal world that saw both men out in 2002.
An un-public intellectual, Rawls never let on whether he cared. Revisions to his theory, and their forewords, suggest a man under siege, but from academic quibbles not earthly events. For a reader, the joy of the book is in tracking a first-class mind as it husbands a thought from conception to expression. Presumably that, not averting Reaganism, was the author’s aim too.
And still the arc of his life captures a familiar theme. It is the ubiquity of disappointment — even, or especially, among the highest achievers. Precisely because they are capable of so much, some measure of frustration is their destiny. I think of Tony Blair, thrice-elected and still, post-Brexit, somehow defeated. (Sunset Boulevard, so good on faded actors, should be about ex-politicians.) Or of friends who have made fortunes but sense, and mind, that no one esteems or much cares about business.
The writer Blake Bailey tells an arresting story about Gore Vidal. The Sage of Amalfi was successful across all literary forms save poetry. He was rich enough to command one of the grandest residential views on Earth. If he hadn’t convinced Americans to ditch their empire or elect him to office, these were hardly disgraces. On that Tyrrhenian terrace, though, when a friend asked what more he could want, he said he wanted “200 million people” to “change their minds”. At some level, however mild his soul, so must have Rawls.