Search This Blog

Thursday 27 September 2018

How did Sri Rama's idols suddenly appear in Babri Masjid on 22 December 1949?

Krishna Jha and Dhirendra K Jha in The Wire.In


The night was almost over. Ayodhya was still numb with sleep. Piercing through the quiet, a young sadhu, drenched in sweat, came scampering from Hanumangarhi, a fortress-like Hindu religious establishment housing over five hundred sadhus in Ayodhya. He had been sent to summon Satyendra Das to his guru, Abhiram Das, who seemed to be breathing his last. Those were the early hours of 3 December 1981, and a curtain was coming down over a few forgotten pages of history.

Dharam Das, the other disciple who stayed with Abhiram Das in his one-room tenement, the asan in Hanumangarhi, had asked for him so that they could be with their guru in his last moments. The news did not come as a shock. Satyendra Das had been almost awaiting the moment, since he had known for long that his guru was nearing the end of his journey. He had been at his bedside the whole day and the signs were not encouraging. Even when he had left Abhiram Das’s asan to get a breather after hours of tending to the terminally ill, he had a premonition that his guru – the man who had led a small band of Hindus to surreptitiously plant the idol of Lord Rama in Babri Masjid on yet another December night three decades ago – might not live long. After he had come away from the bedside, unwilling but tired to the bone, Satyendra Das was restless and unable to sleep. He dreaded the moment, yet knew that someone would knock on his door with the news any time, and when it came, he responded fast, wrapped a quilt around himself and ran out along with the young sadhu who had come to fetch him.

It was very cold outside. The winter night was fading into a dense fog that smothered everything in its folds. Nothing was visible. The duo, almost running in total invisibility, knew the nooks and crannies of Ayodhya like the back of their hands. As Satyendra Das arrived at the asan, he saw Abhiram Das lying in the middle of the room on a charpoy, surrounded by a few sadhus from Hanumangarhi. No one spoke; it was very quiet. Only Dharam Das moved close to him and murmured softly that their guru had passed away minutes before he had stepped in. Slowly, as the day began to break, devotees and disciples started pouring into the room. Soon, preparations for the last rites of the deceased were begun with the help of some residents of Hanumangarhi.

The rituals for the final journey of ascetics are not the same as those for non-ascetic Hindu grihasthas, particularly in north India. Sadhus, unlike Hindu grihasthas, are rarely cremated. There are two options: either the body is smeared with salt and buried in a sitting, meditative posture, or it is weighted with a rock or sacks full of sand and immersed in a sacred river. The fact that sadhus who take vows of complete renunciation are not cremated symbolizes their separation from the material world. Cremation is held to be superfluous for sadhus since they have already burnt their attachments through ascetic initiation, opting for a life of austerities and renunciation.

In Ayodhya, the normal ascetic practice has been to immerse the body of a sadhu in the Sarayu – the name given to the river only as long as it touches the shores of the town. Before and after Ayodhya, the river is known as the Ghaghara. The reason for this nomenclatural confusion lies in a particular Hindu belief. As mythology has turned Ayodhya into the birthplace of Lord Rama, the river flowing by it has also assumed the mythical name of Sarayu – the stream that is believed to have flowed through the kingdom of Lord Rama.

Back in Hanumangarhi, by the noon of 3 December 1981, Abhiram Das’s disciples and friends had completed all preparations and were ready to initiate the final rituals for the deceased. Outside the asan, the body of Abhiram Das had been placed on a platform made of bamboo in a seated posture, his face frozen into a mask of self-control, his eyes half-closed as if he were deep in meditation. A saffron piece of cloth that had the name of Lord Rama printed all over – a particular kind of cotton or silk material called ramnami – had been carefully wrapped around his body. A similar cloth covered three sides of the arch made out of split bamboo that rested on the hard bamboo platform holding the corpse. The bamboo structure – euphemistically called viman to symbolize the mythical transporter of souls to the heavenly realm – had been kept uncovered on one side to enable people to have a last glimpse of the deceased.

Slowly, a group of sadhus lifted the viman on their shoulders and climbed up the flight of stairs leading to the temple of Lord Hanuman in the centre of Hanumangarhi. At the temple, the group swelled further and as the viman was taken out of Hanumangarhi, the motley crowd accompanying it chanted, ‘Ramajanmabhoomi Uddharak amar rahen (Long live the saviour of the birth place of Rama).’

Three decades back, on the morning of 23 December 1949, the First Information Report (FIR) registered by Ayodhya Police following the planting of the idol of Lord Rama in Babri Masjid on the night before had named Abhiram Das as the prime accused. He had also been tried for the crime he and his friends had committed that night, but the case had remained inconclusive. In the course of time, many Hindus in Ayodhya had started calling him Ramajanmabhoomi Uddharak.


Krishna Jha and Dhirendra K. Jha, Ayodhya – The Dark Night, HarperCollins

The slogan-shouting grew louder as the viman reached the entrance of Babri Masjid, where it was carefully laid down. The priests of Ramajanmabhoomi, the temple that had operated inside Babri Masjid ever since the idol was planted in it, as well as those of nearby Hindu religious establishments, already knew about the demise of the sadhu, and they came out and garlanded the corpse and paid their homage to the departed soul.

By and large, however, Ayodhya remained unaware of Abhiram Das’s death. Though some residents looked at this funeral procession with curiosity, for the majority it was the demise of yet another old sadhu. After three decades, the historical facts associated with the developments in 1949 had slipped into obscurity. The propaganda of the All India Hindu Mahasabha and the Rashtriya Swayamsevak Sangh (RSS) – that the idol had never been planted and Lord Rama had manifested Himself at His place of birth – had gained ground among devout Hindus by now, largely delinking Abhiram Das from what he had done in the dark hours of that fateful night. Booklets and pamphlets written by Hindu communalists during the intervening period had flooded the shops of Ayodhya and had gone a long way in reinforcing the myth of ‘divine exercise’. For legal reasons, even those who had a role in that surreptitious act found it convenient to let the myth grow and capture popular imagination. The law, after all, could catch human conspiracies, but a ‘divine exercise’ was beyond its reach. Yet, to a small group of Hindus in Ayodhya, Abhiram Das remained till his death Ramajanmabhoomi Uddharak or simply Uddharak Baba.

Whatever the case, the lack of interest among locals could not be missed by many present in the cortège as it wound down the narrow lanes of Ayodhya and moved towards the banks of the Sarayu. On the bank, where the cortège reached at around two that afternoon, those carrying the viman on their shoulders bent down to put their burden on the ground. The sadhu’s body was taken out of it, bathed in the river and, after being smeared with ghee all over, was wrapped in a fresh white cloth. Two sand-filled sacks were tied to the back of the body, one beneath the shoulder and the other under the waist, which was then gently laid out in the boat that sailed off the moment Satyendra Das, Dharam Das and three other sadhus of Hanumangarhi boarded it. Within minutes, the boat reached the centre of the river, where it was no longer shallow and which had traditionally been used for such water burials. Those present on the boat performed the final rites before lifting Abhiram Das’s body and casting it into the cool, calm waters of the Sarayu.

II

The indifferent response that Abhiram Das’s death evoked among the local populace in 1981 was at odds with the atmosphere the town had witnessed three decades ago, during the years following Independence. At that time, many in Ayodhya, as in several other parts of the country, had seen things differently. The communal frenzy which had accompanied the partition of India had intensely brutalized the atmosphere. No less important was the role played by organizations which saw the immediate aftermath of Partition as an opportunity to derail the secular project of independent India. The conspirators associated with these organizations and the conspiracies they hatched had already resulted in major national tragedies.

One such was the gruesome murder of Mahatma Gandhi on 30 January 1948. The hands that pumped bullets into the chest of the Mahatma were those of Nathuram Godse, but, as was proved later, the assassination was part of a conspiracy hatched by top Hindu Mahasabha leaders, led by V.D. Savarkar, whose prime objectives were to snatch the political initiative from the Congress and destabilize all efforts to uphold secularism in India. The conspiracy to kill Gandhi could not remain hidden for long even though the trial, held immediately after the assassination, had failed to uncover its extent.

The surreptitious occupation of the Babri Masjid was an act planned by almost the same set of people about two years later – on the night of 22 December 1949. It was, in many ways, a reflection of the same brutalized atmosphere that saw Gandhi being murdered. Neither the conspirators nor their underlying objectives were different. In both instances, the conspirators belonged to the Hindu Mahasabha leadership – some of the prime movers of the planting of the idol had been the prime accused in the Gandhi murder case – and their objective this time too was to wrest the political centre stage from the Congress by provoking large-scale Hindu mobilization in the name of Lord Rama.

Yet the two incidents differed – as much in the modus operandi used by Hindu communalists as in the manner in which the government and the ruling party, the Congress, responded to them. While the Mahatma was killed in full public view in broad daylight, the Babri Masjid was converted into a temple secretly, in the dead of night. Apparently, the quick and massive government reprisal in the aftermath of Gandhi’s assassination had taught the Hindu Mahasabha leaders several lessons. One was to avoid confrontation with the government so that they could extract maximum political advantage out of their act. Another was to involve a section of the Congress that was sympathetic to their cause. So when, two years later, they set out to execute the Ayodhya project, they remained extremely careful, keeping themselves backstage until the mosque was actually impounded and ensuring a large-scale mobilization of Hindus in the immediate aftermath without wasting any time. Though the political objective they had sought through this act of communal aggression in Ayodhya could not be achieved in the manner they had hoped for, they largely succeeded in keeping the story of the night, and the conspiracy behind it, a secret, for it never came out in its entirety.

Also, while the conspiracy to kill the Mahatma was probed thoroughly by a commission set up by the Government of India, albeit two decades later, no such inquiry was conducted to unmask the plot and the plotters behind the forcible conversion of the Babri Masjid into a temple. As a result, an event that so remarkably changed the political discourse in India continues to be treated as a localized crime committed spontaneously by a handful of local people led, of course, by Abhiram Das, a local sadhu. It was, however, a well-planned conspiracy involving national-, provincial- and local-level leaders of the Hindu Mahasabha, undertaken with the objective of reviving the party’s political fortunes that were lost in the aftermath of the Gandhi assassination.

Time has further pushed the secret story of the Hindu Mahasabha’s Ayodhya strategy into obscurity, leaving only what is most apparent for public debate. The unending process of litigation which it triggered completely shifted the focus away from that fateful night and has now become the basis of communal politics in the country. Incidentally, the most crucial part of the controversy – the hidden one – remains an ignored area of research. For instance, the White Paper on the Babri Masjid–Ramajanmabhoomi dispute of the Government of India dismissed the incident of 1949 – legally the root cause of the dispute – in just one paragraph. Issued in the aftermath of the demolition of the mosque on 6 December 1992, the document does not have more to say on the incident:


The controversy entered a new phase with the placing of idols in the disputed structure in December 1949. The premises were attached under Section 145 of the Code of Criminal Procedure. Civil suits were filed shortly thereafter. Interim orders in these civil suits restrained the parties from removing the idols or interfering with their worship. In effect, therefore, from December 1949 till December 6, 1992 the structure had not been used as a mosque.

It seems strange that so little is known about the night of 22–23 December 1949, since, in a sense, almost the entire dispute over the mosque emanates from the appearance of the idol of Rama inside that structure. Nevertheless, it is true that there has been little research by contemporary or later writers to fill the gap. This missing link of history remained out of focus till the issue was politically revived and strengthened by the Vishwa Hindu Parishad (VHP) in the mid-1980s. And by then the story of the night had been taken over by the politics of communalism and the debate over the proprietorship of the disputed land.

But till Lord Rama ‘manifested’ Himself inside the Babri Masjid, all moves had sought to construct the temple at Ramachabutara, an elevated platform outside the inner courtyard of the mosque. Only after the idols were placed inside did the demand for converting the Muslim place of worship into a temple enter the legal arena. And yet the development of that night did not attract much attention in the media when it actually took place. No major newspaper or journal of the time gave it the kind of serious coverage it deserved even though the import of the development was not at all lost on Congress leaders like Jawaharlal Nehru, Sardar Vallabhbhai Patel, Govind Ballabh Pant and Akshay Brahmachary as well as Hindu Mahasabha president N.B. Khare, its vice-president V.G. Deshpande and its all India general secretary and president of the party’s UP unit Mahant Digvijai Nath.

The only journal that covered the events in detail was a local Hindi weekly in Ayodhya called Virakta. Its editor, Ramgopal Pandey ‘Sharad’, was a known Mahasabhaite. The kind of material that Virakta published had a pronounced Hindu communal bias, and it was hardly expected to carry objective reportage on the developments. If anything, this journal was the first to promote the theory of ‘divine exercise’ – though in bits and pieces – to explain the appearance of the idol of Lord Rama inside the mosque.

Later, Ramgopal Pandey ‘Sharad’ wrote a booklet in Hindi – Shree Ramjanmabhoomi Ka Rakta Ranjit Itihaas (The Blood-soaked History of the Birth Place of Lord Rama). In Ayodhya, this has remained the most popular and perhaps the only available material on the subject ever since. Like Virakta, this booklet, too, explains the developments of that night in terms of divine intervention rather than as a communal tactic conceived and executed by the Mahasabha in collaboration with local communalists. This is what the booklet says:


Twenty-third December 1949 was a glorious day for India. On that day, after a long gap of about four hundred years, the birth place of Lord Rama was redeemed. The way developments happened [on the night before], it can be said that Lord Rama himself redeemed his place of birth.

While this theory was being used by communalists to explain the mystery of those dark hours, no serious attempt was made – by the government, by institutions or by individual researchers – to explore the events of that night objectively. Debunking the theory of ‘divine exercise’ is one thing (and there is no dearth of works in this regard), but unravelling the truth that was sought to be covered up is something else.

Surely, part of the reason why the facts could not come out as and when they occurred – as happened in the case of Mahatma Gandhi’s assassination – was the power politics of the time. From the assassination of Gandhi in 1948 until the death of Sardar Vallabhbhai Patel in 1950, the Congress party was beset with an intense intra-party power struggle. Though it had witnessed factional fights earlier as well, there had always been an element of restraint under the influence of Mahatma Gandhi and the idealism of the freedom struggle. But as soon as these restraints disappeared, the fight between the two power blocs in the Congress – Hindu conservatives led by Patel and secularists led by Nehru – came out in the open.

The United Provinces, in particular, emerged as one of the main battlegrounds for these power blocs in the Congress, merely months after Gandhi’s assassination. Govind Ballabh Pant, the chief minister of the province (called prime minister before adoption of the Constitution on 26 January 1950), was a staunch loyalist of Patel. His desperation to remove all those who appeared to be potential challengers to his authority in the state Congress led him to align with Hindu revivalists in Ayodhya – a move that, apart from paying him dividends, greatly emboldened Mahasabhaites and set the ground for the eventual appearance of the idols at the Babri Masjid.

With the Hindu conservative faction of the Congress, in a bid to neutralize Nehru, openly trying to draw political strength from communal elements outside the party, and the latter endeavouring to arrest this political drift and salvage its own position, there was hardly much time, or determination, to probe the misdeeds of the Mahasabhaites. This was even more so in the United Provinces, where the government appeared to be more interested in protecting the Hindu communalists than in bringing them to book.

By the time this battle was won by Nehru in late 1950, the incidents of the night of 22 December 1949 had got lost in legal thickets, and the mood of the nation had changed, with the secular fabric seemingly no longer threatened by Hindu revivalists. As the focus shifted following the promulgation of the Constitution of India on 26 January 1950, almost all the players of the Hindu Mahasabha’s Ayodhya strategy either lost their relevance or, in cases where some of them managed to remain in currency, their ability to break the secular equilibrium got severely restricted and their link with the night became part of this missing link of modern India’s history.

Trump has a point about globalisation

Larry Elliott in The Guardian


The president’s belief that the nation state can cure economic ills is not without merit


‘The stupendous growth posted by China over the past four decades has been the result of doing the opposite of what the globalisation textbooks recommend.’ Photograph: AFP/Getty Images


Once every three years the International Monetary Fund and the World Bank hold their annual meetings out of town. Instead of schlepping over to Washington, the gathering of finance ministers and central bank governors is hosted by a member state. Ever since the 2000 meeting in Prague was besieged by anti-globalisation rioters, the away fixtures have tended to be held in places that are hard to get to or where the regime tends to take a dim view of protest: Singapore, Turkey, Peru.

This year’s meeting will take place in a couple of weeks on the Indonesian island of Bali, where the IMF and the World Bank can be reasonably confident that the meetings will not be disrupted. At least not from the outside. The real threat no longer comes from balaclava-wearing anarchists throwing Molotov cocktails but from within. Donald Trump is now the one throwing the petrol bombs and for multilateral organisations like the IMF and World Bank, that poses a much bigger threat.

The US president put it this way in his speech to the United Nations on Tuesday: “We reject the ideology of globalism and we embrace the doctrine of patriotism.” For decades, the message from the IMF has been that breaking down the barriers to trade, allowing capital to move unhindered across borders and constraining the ability of governments to regulate multinational corporations was the way to prosperity. Now the most powerful man on the planet is saying something different: that the only way to remedy the economic and social ills caused by globalisation is through the nation state. Trump’s speech was mocked by fellow world leaders, but the truth is that he’s not a lone voice.

The world’s other big economic superpower – China – has never given up on the nation state. Xi Jinping likes to use the language of globalisation to make a contrast with Trump’s protectionism, but the stupendous growth posted by China over the past four decades has been the result of doing the opposite of what the globalisation textbooks recommend. The measures traditionally frowned upon by the IMF – state-run industries, subsidies, capital controls – have been central to Beijing’s managed capitalism. China has certainly not closed itself off from the global economy but has engaged on its own terms. When the communist regime wanted to move people out of the fields and into factories it did so through the mechanism of an undervalued currency, which made Chinese exports highly competitive. When the party decided that it wanted to move into more sophisticated, higher-tech manufacturing, it insisted that foreign companies wishing to invest in China share their intellectual property.

This sort of approach isn’t new. It was the way most western countries operated in the decades after the second world war, when capital controls, managed immigration and a cautious approach to removing trade barriers were seen as necessary if governments were to meet public demands for full employment and rising living standards. The US and the EU now say that China is not playing fair because it has been prospering with an economic strategy that is supposed not to work. There is some irony in this.

The idea that the nation state would wither away was based on three separate arguments. The first was that the barriers to the global free movement of goods, services, people and money were economically inefficient and that removing them would lead to higher levels of growth. This has not been the case. Growth has been weaker and less evenly shared.

The second was that governments couldn’t resist globalisation even if they wanted to. This was broadly the view once adopted by Bill Clinton and Tony Blair, and now kept alive by Emmanuel Macron. The message to displaced workers was that the power of the market was – rather like a hurricane or a blizzard – an irresistible force of nature. This has always been a dubious argument because there is no such thing as a pure free market. Globalisation has been shaped by political decisions, which for the past four decades have favoured the interests of capital over labour.
Finally, it was argued that the trans-national nature of modern capitalism made the nation state obsolete. Put simply, if economics was increasingly global then politics had to go global, too. There is clearly something in this because financial markets impose constraints on individual governments and it would be preferable for there to be a form of global governance pushing for stability and prosperity for all. The problem is that to the extent such an institutional mechanism exists, it has been captured by the globalists. That is as true of the EU as it is of the IMF.

So while the nation state is far from perfect, it is where an alternative to the current failed model will inevitably begin. Increasingly, voters are looking to the one form of government where they do have a say to provide economic security. And if the mainstream parties are not prepared to offer what these voters want – a decently paid job, properly funded public services and controls on immigration – then they will look elsewhere for parties or movements that will. This has proved to be a particular problem for the parties of the centre left – the Democrats in the US, New Labour in Britain, the SPD in Germany – that signed up to the idea that globalisation was an unstoppable force.

Jeremy Corbyn certainly does not accept the idea that the state is obsolete as an economic actor. The plan is to build a different sort of economy from the bottom up – locally and nationally. That’s not going to be easy, but it beats the current, failed, top-down approach.

Tuesday 25 September 2018

Why western philosophy can only teach us so much

Julian Baggini in The Guardian

One of the great unexplained wonders of human history is that written philosophy first flowered entirely separately in different parts of the globe at more or less the same time. The origins of Indian, Chinese and ancient Greek philosophy, as well as Buddhism, can all be traced back to a period of roughly 300 years, beginning in the 8th century BC.

These early philosophies have shaped the different ways people worship, live and think about the big questions that concern us all. Most people do not consciously articulate the philosophical assumptions they have absorbed and are often not even aware that they have any, but assumptions about the nature of self, ethics, sources of knowledge and the goals of life are deeply embedded in our cultures and frame our thinking without our being aware of them.

Yet, for all the varied and rich philosophical traditions across the world, the western philosophy I have studied for more than 30 years – based entirely on canonical western texts – is presented as the universal philosophy, the ultimate inquiry into human understanding. Comparative philosophy – study in two or more philosophical traditions – is left almost entirely to people working in anthropology or cultural studies. This abdication of interest assumes that comparative philosophy might help us to understand the intellectual cultures of India, China or the Muslim world, but not the human condition.

This has become something of an embarrassment for me. Until a few years ago, I knew virtually nothing about anything other than western philosophy, a tradition that stretches from the ancient Greeks to the great universities of Europe and the US. Yet, if you look at my PhD certificate or the names of the university departments where I studied, there is only one, unqualified, word: philosophy. Recently and belatedly, I have been exploring the great classical philosophies of the rest of the world, travelling across continents to encounter them first-hand. It has been the most rewarding intellectual journey of my life.

My philosophical journey has convinced me that we cannot understand ourselves if we do not understand others. Getting to know others requires avoiding the twin dangers of overestimating either how much we have in common or how much divides us. Our shared humanity and the perennial problems of life mean that we can always learn from and identify with the thoughts and practices of others, no matter how alien they might at first appear. At the same time, differences in ways of thinking can be both deep and subtle. If we assume too readily that we can see things from others’ points of view, we end up seeing them from merely a variation of our own.

To travel around the world’s philosophies is an opportunity to challenge the beliefs and ways of thinking we take for granted. By gaining greater knowledge of how others think, we can become less certain of the knowledge we think we have, which is always the first step to greater understanding.

Take the example of time. Around the world today, time is linear, ordered into past, present and future. Our days are organised by the progression of the clock, in the short to medium term by calendars and diaries, history by timelines stretching back over millennia. All cultures have a sense of past, present and future, but for much of human history this has been underpinned by a more fundamental sense of time as cyclical. The past is also the future, the future is also the past, the beginning also the end.

The dominance of linear time fits in with an eschatological worldview in which all of human history is building up to a final judgment. This is perhaps why, over time, it became the common-sense way of viewing time in the largely Christian west. When God created the world, he began a story with a beginning, a middle and an end. As Revelation puts it, while prophesying the end times, Jesus is this epic’s “Alpha and Omega, the beginning and the end, the first and the last”.

But there are other ways of thinking about time. Many schools of thought believe that the beginning and the end are and have always been the same because time is essentially cyclical. This is the most intuitively plausible way of thinking about eternity. When we imagine time as a line, we end up baffled: what happened before time began? How can a line go on without end? A circle allows us to visualise going backwards or forwards for ever, at no point coming up against an ultimate beginning or end.

Thinking of time cyclically especially made sense in premodern societies, where there were few innovations across generations and people lived very similar lives to those of their grandparents, their great-grandparents and going back many generations. Without change, progress was unimaginable. Meaning could therefore only be found in embracing the cycle of life and death and playing your part in it as best you could.


Confucius (551-479 BC). Photograph: Getty

Perhaps this is why cyclical time appears to have been the human default. The Mayans, Incans and Hopi all viewed time in this way. Many non-western traditions contain elements of cyclical thinking about time, perhaps most evident in classical Indian philosophy. The Indian philosopher and statesman Sarvepalli Radhakrishnan wrote: “All the [orthodox] systems accept the view of the great world rhythm. Vast periods of creation, maintenance and dissolution follow each other in endless succession.” For example, a passage in the Rig Veda addressing Dyaus and Prithvi (heaven and earth) reads: “Which was the former, which of them the latter? How born? O sages, who discerns? They bear themselves all that has existence. Day and night revolve as on a wheel.”

East Asian philosophy is deeply rooted in the cycle of the seasons, part of a larger cycle of existence. This is particularly evident in Taoism, and is vividly illustrated by the surprising cheerfulness of the 4th century BC Taoist philosopher Zhuangzi when everyone thought he should have been mourning for his wife. At first, he explained, he was as miserable as anyone else. Then he thought back beyond her to the beginning of time itself: “In all the mixed-up bustle and confusion, something changed and there was qi. The qi changed and there was form. The form changed and she had life. Today there was another change and she died. It’s just like the round of four seasons: spring, summer, autumn and winter.”

In Chinese thought, wisdom and truth are timeless, and we do not need to go forward to learn, only to hold on to what we already have. As the 19th-century Scottish sinologist James Legge put it, Confucius did not think his purpose was “to announce any new truths, or to initiate any new economy. It was to prevent what had previously been known from being lost.” Mencius, similarly, criticised the princes of his day because “they do not put into practice the ways of the ancient kings”. Mencius also says, in the penultimate chapter of the eponymous collection of his conversations, close to the book’s conclusion: “The superior man seeks simply to bring back the unchanging standard, and, that being correct, the masses are roused to virtue.” The very last chapter charts the ages between the great kings and sages.

A hybrid of cyclical and linear time operates in strands of Islamic thought. “The Islamic conception of time is based essentially on the cyclic rejuvenation of human history through the appearance of various prophets,” says Seyyed Hossein Nasr, professor emeritus of Islamic studies at George Washington University. Each cycle, however, also moves humanity forward, with each revelation building on the former – the dictation of the Qur’an to Muhammad being the last, complete testimony of God – until ultimately the series of cycles ends with the appearance of the Mahdi, who rules for 40 years before the final judgment.

The distinction between linear and cyclical time is therefore not always neat. The assumption of an either/or leads many to assume that oral philosophical traditions have straightforwardly cyclical conceptions of time. The reality is more complicated. Take Indigenous Australian philosophies. There is no single Australian first people with a shared culture, but there are enough similarities across the country for some tentative generalisations to be made about ideas that are common or dominant. The late anthropologist David Maybury-Lewis suggested that time in Indigenous Australian culture is neither cyclical nor linear; instead, it resembles the space-time of modern physics. Time is intimately linked to place in what he calls the “dreamtime” of “past, present, future all present in this place”.

“One lives in a place more than in a time,” is how Stephen Muecke puts it in his book Ancient and Modern: Time, Culture and Indigenous Philosophy. More important than the distinction between linear or cyclical time is whether time is separated from or intimately connected to place. Take, for example, how we conceive of death. In the contemporary west, death is primarily seen as the expiration of the individual, with the body as the locus, and the location of that body irrelevant. In contrast, Muecke says: “Many indigenous accounts of the death of an individual are not so much about bodily death as about a return of energy to the place of emanation with which it re-identifies.”

Such a way of thinking is especially alien to the modern west, where a pursuit of objectivity systematically downplays the particular, the specifically located. In a provocative and evocative sentence, Muecke says: “Let me suggest that longsightedness is a European form of philosophical myopia and that other versions of philosophy, indigenous perhaps, have a more lived-in and intimate association with societies of people and the way they talk about themselves.”

Muecke cites the Australian academic Tony Swain’s view that the concept of linear time is a kind of fall from place. “I’ve got a hunch that modern physics separated out those dimensions and worked on them, and so we produced time as we know it through a whole lot of experimental and theoretical activities,” Muecke told me. “If you’re not conceptually and experimentally separating those dimensions, then they would tend to flow together.” His indigenous friends talk less of time or place independently, but more of located events. The key temporal question is not “When did this happen?” but “How is this related to other events?”

That word related is important. Time and space have become theoretical abstractions in modern physics, but in human culture they are concrete realities. Nothing exists purely as a point on a map or a moment in time: everything stands in relation to everything else. So to understand time and space in oral philosophical traditions, we have to see them less as abstract concepts in metaphysical theories and more as living conceptions, part and parcel of a broader way of understanding the world, one that is rooted in relatedness. Hirini Kaa, a lecturer at the University of Auckland, says that “the key underpinning of Maori thought is kinship, the connectedness between humanity, between one another, between the natural environment”. He sees this as a form of spirituality. “The ocean wasn’t just water, it wasn’t something for us to be afraid of or to utilise as a commodity, but became an ancestor deity, Tangaroa. Every living thing has a life force.”

David Mowaljarlai, who was a senior lawman of the Ngarinyin people of Western Australia, once called this principle of connectivity “pattern thinking”. Pattern thinking suffuses the natural and the social worlds, which are, after all, in this way of thinking, part of one thing. As Muecke puts it: “The concept of connectedness is, of course, the basis of all kinship systems [...] Getting married, in this case, is not just pairing off, it is, in a way, sharing each other.”

The emphasis on connectedness and place leads to a way of thinking that runs counter to the abstract universalism developed to a certain extent in all the great written traditions of philosophy. Muecke describes as one of the “enduring [Indigenous Australian] principles” that “a way of being will be specific to the resources and needs of a time and place and that one’s conduct will be informed by responsibility specific to that place”. This is not an “anything goes” relativism, but a recognition that rights, duties and values exist only in actual human cultures, and their exact shape and form will depend on the nature of those situations.

A Mayan calendar. Photograph: Alamy Stock Photo

This should be clear enough. But the tradition of western philosophy, in particular, has striven for a universality that glosses over differences of time and place. The word “university”, for example, even shares the same etymological root as “universal”. In such institutions, “the pursuit of truth recognises no national boundaries”, as one commentator observed. Place is so unimportant in western philosophy that, when I discovered it was the theme of the quinquennial East-West Philosophers’ Conference in 2016, I wondered if there was anything I could bring to the party at all. (I decided that the absence of place in western philosophy itself merited consideration.)

The universalist thrust has many merits. The refusal to accept any and every practice as a legitimate custom has bred a very good form of intolerance for the barbaric and unjust traditional practices of the west itself. Without this intolerance, we would still have slavery, torture, fewer rights for women and homosexuals, feudal lords and unelected parliaments. The universalist aspiration has, at its best, helped the west to transcend its own prejudices. At the same time, it has also legitimised some prejudices by confusing them with universal truths. The philosopher Kwame Anthony Appiah argues that the complaints of anti-universalists are not generally about universalism at all, but pseudo-universalism, “Eurocentric hegemony posing as universalism”. When this happens, intolerance for the indefensible becomes intolerance for anything that is different. The aspiration for the universal becomes a crude insistence on the uniform. Sensitivity is lost to the very different needs of different cultures at different times and places.

This “posing as universalism” is widespread and often implicit, with western concepts being taken as universal but Indian ones remaining Indian, Chinese remaining Chinese, and so on. To end this pretence, Jay L Garfield and Bryan W Van Norden propose that those departments of philosophy that refuse to teach anything from non-western traditions at least have the decency to call themselves departments of western philosophy.

The “pattern thinking” of Maori and Indigenous Australian philosophies could provide a corrective to the assumption that our values are the universal ones and that others are aberrations. It makes credible and comprehensible the idea that philosophy is never placeless and that thinking that is uprooted from any land soon withers and dies.

Mistrust of the universalist aspiration, however, can go too far. At the very least, there is a contradiction in saying there are no universal truths, since that is itself a universal claim about the nature of truth. The right view probably lies somewhere between the claims of naive universalists and those of defiant localists. There seems to be a sense in which even the universalist aspiration has to be rooted in something more particular. TS Eliot is supposed to have said: “Although it is only too easy for a writer to be local without being universal, I doubt whether a poet or novelist can be universal without being local, too.” To be purely universal is to inhabit an abstract universe too detached from the real world. But just as a novelist can touch on universals of the human condition through the particulars of a couple of characters and a specific story, so our different, regional philosophical traditions can shed light on more universal philosophical truths even though they approach them from their own specific angles.

We should not be afraid to ground ourselves in our own traditions, but we should not be bound by them. Gandhi put this poetically when he wrote: “I do not want my house to be walled in on all sides and my windows to be stuffed. I want the cultures of all lands to be blown about my house as freely as possible. But I refuse to be blown off my feet by any. I refuse to live in other people’s houses as an interloper, a beggar or a slave.”

In the west, the predominance of linear time is associated with the idea of progress that reached its apotheosis in the Enlightenment. Before this, argues the philosopher Anthony Kenny, “people looking for ideals had looked backwards in time, whether to the primitive church, or to classical antiquity, or to some mythical prelapsarian era. It was a key doctrine of the Enlightenment that the human race, so far from falling from some earlier eminence, was moving forward to a happier future.”

Kenny is expressing a popular view, but many see the roots of belief in progress deeper in the Christian eschatological religious worldview. “Belief in progress is a relic of the Christian view of history as a universal narrative,” claims John Gray. Secular thinkers, he says, “reject the idea of providence, but they continue to think humankind is moving towards a universal goal”, even though “the idea of progress in history is a myth created by the need for meaning”.

Whether faith in progress is an invention or an adaptation of the Enlightenment, the image of secular humanists naively believing humanity is on an irreversible, linear path of advancement seems to me a caricature of their more modest hope, based in history, that progress has occurred and that more is possible. As the historian Jonathan Israel says, Enlightenment ideas of progress “were usually tempered by a strong streak of pessimism, a sense of the dangers and challenges to which the human condition is subject”. He dismisses the idea that “Enlightenment thinkers nurtured a naive belief in man’s perfectibility” as a “complete myth conjured up by early 20th-century scholars unsympathetic to its claims”.

Nevertheless, Gray is right to point out that linear progress is a kind of default way of thinking about history in the modern west and that this risks blinding us to the ways in which gains can be lost, advances reversed. It also fosters a sense of the superiority of the present age over earlier, supposedly “less advanced” times. Finally, it occludes the extent to which history doesn’t repeat itself but does rhyme.

The different ways in which philosophical traditions have conceived time turn out to be far from mere metaphysical curiosities. They shape the way we think about both our temporal place in history and our relation to the physical places in which we live. Time provides one of the easiest and clearest examples of how borrowing another way of thinking can bring a fresh perspective to our world. Sometimes, simply by changing the frame, the whole picture can look very different.

Monday 24 September 2018

Labour’s just declared class war. Has anybody noticed?

Aditya Chakrabortty in The Guardian

You’d expect a declaration of class war by the main opposition party to merit at least a mention in the country’s tabloids. After all, on Sunday night Labour announced a SIX BILLION POUND RAID ON BUSINESS – the kind of thing one might reasonably hope to be screamed in huge font across the front pages and condemned in fist-shaking, bloodcurdling editorials. But nothing. Barely a squeak.

Just why that should be I’ll discuss in a moment, but first there is the policy itself – and a big, bold thing it is. At Labour conference on Monday, John McDonnell declared that he plans to force all companies with more than 250 staff to put 10% of their equity into a fund for their workers. Each employee will then be entitled to company share dividends worth up to £500 a year. Any extra will go back into public services.

The sums involved are massive: Labour calculates that 10.7 million workers covered by the scheme will get about £4bn a year in share dividends by the end of Jeremy Corbyn’s first term in government, while the public sector will receive an annual £2bn.
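For readers who want the mechanics, here is a minimal sketch of how the £500 cap and the public-services overflow described above would interact at firm level. The firm’s figures are invented for illustration; the article specifies only the 10% fund, the per-worker cap and the destination of the excess.

```python
CAP_PER_WORKER = 500  # pounds per year, the cap described in the article

def split_dividends(total_dividends, n_workers):
    """Return (total paid to workers, amount diverted to public services)."""
    per_worker = total_dividends / n_workers
    to_each_worker = min(per_worker, CAP_PER_WORKER)  # cap each worker's payout
    to_workers = to_each_worker * n_workers
    to_public_services = total_dividends - to_workers  # the excess goes to the state
    return to_workers, to_public_services

# Hypothetical firm: its 10% fund earns 8m pounds in dividends across 10,000 staff.
workers, public = split_dividends(8_000_000, 10_000)
print(workers, public)  # 5,000,000.0 to staff, 3,000,000.0 to public services
```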

This also represents a big shift in Labour’s thinking. A few days ago, I met a senior aide to the previous party leader Ed Miliband who talked for a while about how, for all the rhetoric deployed by the new team, little of substance had changed in policy. “Apart from this stuff about a worker fund,” he mused. “Now that is big.” 

Big indeed. This isn’t just about giving employees more money; it’s handing them a stake and a voice in the enterprises on which they spend most of their waking hours.

To do so, McDonnell will use the stick rather than the carrot, compulsion rather than encouragement over china teacups. His argument is that shareholders are not the only ones entitled to company profits: the employees and the rest of society (which pays for the infrastructure used by businesses and allows them the great privilege of limited liability) also have a claim. No wonder business lobby groups are furious, with Confederation of British Industry director general Carolyn Fairbairn decrying a “diktat” that will have investors “packing their bags”.

For decades, the British have practised a carelessness that lets the people wielding the biggest chequebooks buy whichever assets they like and do whatever they want. That attitude has allowed Philip Green to strip BHS to the bones, Kraft to run Cadbury into the ground, and Thames Water to be picked over by a consortium of international investors.

The rewards from all this carnage have flowed to one group: shareholders. In 2015, Bank of England chief economist Andy Haldane charted what has happened to workers’ share of national income over the long run. He found that labour had been getting smaller and smaller slices of the pie: from 70% in the 1970s to 55% now. By his reckoning, employees get proportionately less now than they did at the very outset of the Industrial Revolution in the 1770s.

Had workers’ wages kept track with the rise in their productivity since 1990, the average employee would today be 20% better off. Or they could have three-day weekends all year long and still get paid the same.
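A quick back-of-envelope check (our arithmetic, not Haldane’s, and assuming a simple five-day week) shows why the 20% figure and the three-day weekend are roughly the same claim:

```python
# If pay had tracked productivity since 1990, wages would be ~20% higher.
# Equivalently, workers could keep today's pay while working 1/1.2 of
# their current hours.
productivity_gap = 1.20                # counterfactual wage / actual wage
hours_fraction = 1 / productivity_gap  # ~0.83 of current hours for the same pay
print(round(5 * hours_fraction, 2))    # ~4.17 days of a 5-day week,
                                       # i.e. close to a three-day weekend
```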

To secure a real rise in wages will require more than waiting for the economy to recover from its decade-long slump and the labour market to return to “normal”. Hence today’s announcement. This is not to say I think it’s perfect. It won’t touch the likes of Google and Facebook, because they’re listed abroad. It’s not clear to me how it will affect Amazon, which has relatively few direct employees but warehouses full of agency workers. Labour says it has suggestions – I’m not sure Jeff Bezos will be listening. The opposition’s key challenge remains largely unaddressed, which is how to get private sector businesses to behave as if they’re part of the society in which they operate.

Like Labour’s tax on second homes, you can see what the party is getting at even while thinking that the proposals as they stand are likely to be gamed.

At the same time, it’s especially difficult for Theresa May to oppose. Don’t the Conservatives boast of being the party of shareholder democracy (even though share ownership has become less widespread since Margaret Thatcher came to power)? Didn’t David Cameron commission a report into companies owned by their employees, the first line of which read: “Employee ownership is a great idea”? And doesn’t all the evidence show that companies owned by their workers are more productive and stick around for longer?

The Tories’ uneasiness over how to respond accounts for part of newspapers’ silence on this Labour proposal. Add to that the agonies over Brexit that will be played out over the next two weeks of conference. But the closer Labour edge to power, the more scrutiny ideas like this will receive.

As far as I know, McDonnell’s policy has been tried in one other comparable situation. In the early 80s, Sweden’s Social Democrats promised to give 20% of company shares to workers. Named after its architect, trade union economist Rudolf Meidner, the policy was popular with the party faithful.

But in this polite and outwardly cohesive country, it caused outright war, writes Robin Blackburn in his classic history Banking on Death: “Business leaders were intensely alarmed and spent five times more money attacking the plan than the cash laid out by all the parties on the 1982 election. The privately-owned press ran a sustained and vigorous campaign … under assault, support for the scheme ebbed and the Social Democrat leaders believed that it was prudent greatly to dilute the scheme…” By the mid-90s, the policy was dead.

A warning there for all those gathering in Liverpool this week – and for anyone who believes workers should receive a greater share of the stuff we produce: get ready for the onslaught.

Thursday 20 September 2018

Let’s face it. Our university factory has failed to deliver on its promises

Aditya Chakrabortty in The Guardian

In any other area it would be called mis-selling. Given the sheer numbers of those duped, a scandal would erupt and the guilty parties would be forced to make amends. In this case, they’d include some of the most eminent politicians in Britain.

But we don’t call it mis-selling. We refer to it instead as “going to uni”. Over the next few days, about half a million people will start as full-time undergraduates. Perhaps your child will be among them, bearing matching Ikea crockery and a fleeting resolve to call home every week.

They are making one of the biggest purchases of their lives, shelling out more on tuition fees and living expenses than one might on a sleek new Mercedes, or a deposit on a London flat. Many will emerge with a costly degree that fulfils few of the promises made in those glossy prospectuses. If mis-selling is the flogging of a pricey product with not a jot of concern about its suitability for the buyer, then that is how the establishment in politics and in higher education now treat university degrees. The result is that tens of thousands of young graduates begin their careers having already been swindled as soundly as the millions whose credit card companies foisted useless payment protection insurance on them.

Rather than jumping through hoop after hoop of exams and qualifications, they’d have been better off with parents owning a home in London. That way, they’d have had somewhere to stay during internships and then a source of equity with which to buy their first home – because ours is an era that preaches social mobility, even while practising a historic concentration of wealth. Our new graduates will learn that the hard way.

To say as much amounts to whistling in the wind. With an annual income of £33bn, universities in the UK are big business, and a large lobby group. They are perhaps the only industry whose growth has been explicitly mandated by prime ministers of all stripes, from Tony Blair to Theresa May. It was Blair who fed the university sector its first steroids, by pledging that half of all young Britons would go into higher education. That sweeping target was set with little regard for the individual needs of teenagers – how could it be? Sub-prime brokers in Florida were more exacting over their clients’ circumstances. It was based instead on two promises that have turned out to be hollow.

Promise number one was that degrees mean inevitably bigger salaries. This was a way of selling tuition fees to voters. Blair’s education secretary, David Blunkett, asked: “Why should it be the woman getting up at 5 o’clock to do a cleaning job who pays for the privileges of those earning a higher income while they make no contribution towards it?” When David Cameron’s lot wanted to jack up fees, they claimed a degree was a “phenomenal investment”.

Both parties have marketed higher education as if it were some tat on a television shopping channel. Across Europe, from Germany to Greece, including Scotland, university education is considered a public good and is either free or cheap to students. Graduates in England, however, are lumbered with some of the highest student debt in the world.

Yet shove more and more students through university and into the workforce and – hey presto! – the wage premium they command will inevitably drop. Research shows that male graduates of 23 universities still earn less on average than non-graduates a whole decade after going into the workforce.

Britain manufactures graduates by the tonne, but it doesn’t produce nearly enough graduate-level jobs. Nearly half of all graduates languish in jobs that don’t require graduate skills, according to the Chartered Institute of Personnel and Development. In 1979, only 3.5% of new bank and post office clerks had a degree; today it is 35% – to do a job that often pays little more than the minimum wage.

Promise number two was that expanding higher education would break down class barriers. Wrong again. At the top universities that serve as gatekeepers to the top jobs – Oxbridge, Durham, Imperial and others – private school pupils comprise anywhere up to 40% of the intake. Yet only 7% of children go to private school. Factor in part-time and mature students, and the numbers from disadvantaged backgrounds are actually dropping. Nor does university close the class gap: Institute for Fiscal Studies research shows that even among those doing the same subject at the same university, rich students go on to earn an average of 10% more each year, every year, than those from poor families.

Far from providing opportunity for all, higher education is itself becoming a test lab for Britain’s new inequality. Consider today’s degree factory: a place where students pay dearly to be taught by some lecturer paid by the hour, commuting between three campuses, yet whose annual earnings may not amount to £9,000 a year – while a cadre of university management rake in astronomical sums.

Thus is the template set for the world of work. Can’t find an internship in politics or the media in London that pays a wage? That will cost you more than £1,000 a month in travel and rent. Want to buy your first home? In the mid-80s, 62% of adults under 35 living in the south-east owned their own home. That has now fallen to 32%. Needless to say, the best way to own your own home is to have parents rich enough to help you out.

Over the past four decades, British governments have relentlessly pushed the virtues of skilling up and getting on. Yet today wealth in Britain is so concentrated that the head of the Institute for Fiscal Studies, Paul Johnson, believes “inheritance is probably the most crucial factor in determining a person’s overall wealth since Victorian times”.

Margaret Thatcher’s acolytes promised to create a classless society, and they were quite right: Britain is instead becoming a caste society, one in which where you were born determines ever more where you end up.

For two decades, Westminster has used universities as its magic answer for social mobility. Ministers did so with the connivance of highly paid vice-chancellors, and in the process they have trashed much of what was good about British higher education. What should be sites for speculative inquiry and critical thinking have instead turned into businesses that speculate on property deals, criticise academics who aren’t publishing in the right journals – and fail spectacularly to engage with the serious social and economic problems that confront the UK right now. As for the graduates, they largely wind up taking the same place in the queue as their parents – only this time with an expensive certificate detailing their newfound expertise.

For everyone’s sake, let us declare this experiment a failure. It is high time that higher education was treated again as a public good, as Jeremy Corbyn recognises with his pledge to scrap tuition fees. But Labour also needs to expand vocational education. And if it really wants to increase social mobility and reduce unfairness, it will need to come up with tax policies fit for the age of inheritance.

Monday 17 September 2018

The limits of using GDP

Keya Acharya in The Wire.In


Most countries swear by it. It is cited by newspapers, banks and businesses. Almost all prominent world political leaders have used gross domestic product (GDP) to showcase their countries’ well-being. Prime Minister Narendra Modi and finance minister Arun Jaitley repeatedly use India’s apparently rising GDP to point to the country’s progress and as a defence against criticism.

GDP measures the monetary value of goods and services produced by a country, mostly for sale in markets. Though the concept had earlier antecedents, the first systematic accounts of national income and national product were developed for the US Department of Commerce in 1934 by Simon Kuznets (later a Nobel laureate), prompted by the information gaps that had left policymakers blind during the Great Depression.

By the 1940s, wartime planning led John Maynard Keynes of the British Treasury and Henry Morgenthau Jr. of the US Treasury to go further and develop the measure we now know as GDP.
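
For concreteness – the article itself does not spell this out – the most common way of computing the metric is the expenditure approach, a standard national-accounts identity that sums household consumption, investment, government spending and net exports:

```latex
% Expenditure approach to GDP: a standard national-accounts identity
% (the symbols are the conventional ones, not taken from the article).
%   C = consumption, I = investment, G = government spending,
%   X = exports, M = imports
\[
  \mathrm{GDP} = C + I + G + (X - M)
\]
```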

The question now is: is the concept still relevant today? For decades, prominent economists and academics have criticised GDP as inadequate for measuring development – not least Nobel laureate Joseph Stiglitz, together with Amartya Sen and Jean-Paul Fitoussi, in their 2010 report Mismeasuring Our Lives: Why GDP Doesn’t Add Up.

Stiglitz, Sen and Fitoussi say that the statistical concepts behind GDP may be correct, but the system is fundamentally flawed in that it does not measure a country’s income distribution or the well-being of its citizens. They take the case of traffic jams (page 3 of their book’s summary) as an example: GDP may rise because of increased sales of cars and gasoline, but it does not take into account the impact of their overuse on the quality of life.
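
To see the point in miniature, consider a toy calculation – my own illustration, with invented numbers, not drawn from their report – in which a traffic jam raises measured GDP through extra fuel sales even as welfare falls, because the hours lost and the pollution are never counted:

```python
# Toy illustration of the Stiglitz-Sen-Fitoussi traffic-jam critique.
# All numbers are invented for the example.

fuel_price = 100            # rupees per litre (assumed)
extra_fuel = 1_000_000      # litres burned idling in jams (assumed)

# GDP counts the extra fuel as marketed output...
gdp_change = fuel_price * extra_fuel

# ...but a welfare measure would also subtract the uncounted costs.
hours_lost = 2_000_000      # commuter hours stuck in traffic (assumed)
value_of_time = 150         # rupees per hour (assumed)
health_costs = 40_000_000   # pollution-related costs in rupees (assumed)

welfare_change = gdp_change - hours_lost * value_of_time - health_costs

print(f"Measured GDP change: {gdp_change:+,} rupees")      # +100,000,000
print(f"Welfare change:      {welfare_change:+,} rupees")  # -240,000,000
```

The same event registers as growth on one metric and as a loss on the other, which is precisely the distortion the authors highlight.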

The case of Delhi’s air pollution, much of it linked to the use of diesel, could well be an example for us. Six years ago, a World Bank report put India’s costs of air pollution and environmental destruction at $80 billion per year; the costs could well have increased in the intervening years. Stiglitz and Sen themselves have said that statistical measures which ignore air pollution will give an inaccurate estimate of citizens’ well-being.

Indeed, even Simon Kuznets, the metric’s original architect, said over fifty years ago that to assess a nation’s welfare, economists need to ask not how much the economy is growing, but what is growing and for whom, points out Canadian political scientist Ronald Colman (co-architect of Bhutan’s Gross National Happiness index).

Robert Costanza of the Australian National University says GDP ignores social costs, environmental degradation and income inequality – a point with which even François Lequiller, head of national accounts at the OECD (Organisation for Economic Co-operation and Development), concurs.

The World Economic Forum (WEF) has a newer measure, the Inclusive Development Index, to gauge a country’s progress. In January 2018, India ranked 62nd out of 74 emerging economies on the index, behind Sri Lanka, Nepal and Pakistan in its own region.

Colman outlines the enormous failure of GDP to account for the accelerating trends of resource depletion, species extinctions and rising greenhouse gas emissions. The last 12 years have been the hottest in millennia; sea levels are projected to rise by a metre by 2100; forests have been decimated and their wildlife overhunted, with forest cover disappearing at 1% per year, while 40% of the world’s tropical forests have already vanished, he says. None of these mounting threats is reflected in GDP.

And yet, despite this wide array of criticism from noted scholars, no alternative index of economic and overall well-being has become mainstream. Stiglitz and Sen’s critique was commissioned by French president Nicolas Sarkozy in 2008 and delivered in 2009; yet the 2015 Paris Agreement, signed in France and deemed a milestone in the global effort on climate change mitigation by 195 countries, includes nothing that offers an alternative to the GDP system.

At an international gathering of journalists in Italy in late November 2017, where a panel of economic experts from around the world discussed alternatives to GDP, I asked the physicist Fritjof Capra, director of the Centre for Ecoliteracy in Berkeley, US, why alternative measures were so conspicuously absent from the Paris Agreement. Capra believed that the lack of civil society participation in this particular field was a major reason. Costanza said the habit was simply hard to kick, likening the GDP system to an ‘addiction’.

Colman believes the fundamental reason an alternative measurement system has not found its rightful place is that it ‘threatens the short-term economic base’: “This is unpalatable in the political arena; who is willing to challenge this?” he asks. He agrees, though, that civil society needs to be far more engaged if GDP is to be displaced as the fundamental measure of a country’s progress.

Costanza has looked at the UN’s 17 Sustainable Development Goals (SDGs) as an alternative system. The SDGs, however, are not binding policy, merely a persuasion for nations to follow. They are also complex in their interrelatedness, which makes them all the more difficult to present as a binding guideline. And integrating only some of these development measures into the current GDP system is not possible, says Colman.

The complexity is indeed enormous, which is one reason why economists have not united behind what should be a crucial system for gauging development.

Obviously, then, we need to make ecological and development economics a compulsory system for nations to follow. Some have already done so (New Zealand, Bhutan and the UK; China has restarted its green growth research). It needs political will and push.

Governments might well find their own interests served in moving to an alternative GDP and striking out on a new path.

What is your brand of atheism?

Arvind Sharma in The Wire.In

The modern world is nothing if not plural in the number of possible world views it offers in terms of religions, creeds and ideologies. The profusion can be quite perplexing, even bewildering. And atheism too is an important component of the cocktail.

The book under review – John Gray’s Seven Types of Atheism – acts like a ‘guide to the perplexed’ of the modern Western world, bestowing the same kind of critical attention on atheism as theologians do on theism and historians of religion do on the world religions.

In doing so, it identifies seven types of atheism: (1) new atheism, an atheism simply interested in discrediting religion; (2) ‘secular atheism’, better described as secular humanism, which seeks salvation of the world within the world through progress; (3) ‘scientific atheism’, which turns science into a religion – a category in which the author includes ‘evolutionary humanism, Mesmerism, dialectical materialism, and contemporary transhumanism’; (4) ‘political atheism’, a category comprising what the author considers modern political religions, such as Jacobinism, Communism, Nazism and contemporary evangelical liberalism; (5) ‘antitheistic atheism’ or misotheism, the kind of atheism characterized by hatred of God and exemplified by the Marquis de Sade, Dostoevsky’s character Ivan Karamazov (in The Brothers Karamazov) and William Empson; (6) ‘non-humanistic atheism’, of the kind associated with the positions of George Santayana and Joseph Conrad, who rejected the idea of a creator God but did not go on to cultivate the benevolence towards humanity so characteristic of secular atheism; and (7) mystical atheism, associated with the names of Schopenhauer, Spinoza and the Russian thinker Leo Shestov. The author states his position on these seven types candidly: he is repelled (his word) by the first five but feels drawn to the last two.

The seminal insight of the book, in the Western context, is that “contemporary atheism is a continuation of monotheism by other means”. The author returns to this point again and again, and it enables us to examine religion and atheism in tandem. It is thus an admirable book on atheism in the Western world, strewn with nuggets such as:

“Scientific inquiry answers a demand for explanation. The practice of religion expresses a need for meaning…”;

“The human mind is programmed for survival, not truth”;

“Science can never close the gap between fact and value”;

“The fundamental conflict in ethics is not between self-interest and general welfare but between general welfare and desires of the moment”;

“It is not only the assertion that ‘moral’ values must take precedence over all others that has been inherited from Christianity. So has the belief that all human beings must live by the same morality”;

“… beliefs that have depended on falsehood need not themselves be false”;

“Some values may be humanly universal – being tortured or persecuted is bad for all human beings. But universal values do not make a universal morality, for these values often conflict with each other”;

“Liberal societies are not templates of a universal political order but instances of a particular form of life. Yet liberals persist in imagining that only ignorance prevents their gospel from being accepted by all of humankind – a vision inherited from Christianity”;

“Causing others to suffer could produce an excitement far beyond any achieved through mere debauchery”;

“Prayer is no less natural than sex, virtue as much as vice”;

“Continuing progress is possible only in technology and the mechanical arts. Progress in this sense may well accelerate as the quality of civilisation declines”;

“Any prospect of a worthwhile life without illusions might itself be an illusion”;

“If Nietzsche shouted the death of God from the rooftops, Arthur Schopenhauer gave the Deity a quiet burial”;

“The liberated individual entered into a realm where the will is silent”;

“Human life… is purposeless striving… But from another point of view this aimless world is pure play”;

“If the human mind mirrors the cosmos, it may be because they are both fundamentally chaotic”; and so on. 

Its provocative ideas and brilliant summaries notwithstanding, the book has one limitation: its scope is confined to the West. The author does touch on Buddhism and even Sankhya, but only as they have implications for the West; he does not cover Asian ideas of atheism alongside the Western. Neither Confucianism nor Daoism is hung up on a creator god, and both would thus seem to demand attention if atheism is defined as “the idea of the absence of a creator-god”. Similarly, in Hindu theism, the relation between the universe and the ultimate reality is posited as ontological rather than cosmological.

The concept of atheism also needs to be refined further in relation to Indian religions. In this context it is best to speak of the nontheism of Buddhism (which denies a creator god but not gods as such) and the transtheism of Advaita Vedanta (which accepts a God-like reality but denies it the status of the ultimate reality). In fact, the discussion in this book is perhaps better understood if we invoke some other categories related to the idea of God, such as transcendence and immanence. God is understood as transcendent in the Abrahamic traditions: God creates the universe and also transcends it. In the Hindu traditions, God is considered both transcendent and immanent – God ‘creates’ the universe and transcends it but also pervades it, just as the number seven transcends the number five but also contains it.

The many atheisms described in the book are really cases of denying the transcendence of god as the ultimate reality and identifying ultimate reality with something immanent in the universe. This enables one to see the atheisms of the West in an even broader light than when described as crypto-monotheisms.

One may conclude the discussion of such a heavy topic on a lighter note. Could one not think of something best called ‘devout agnosticism’ as a solution to rampant atheism in the West, if atheism is perceived as a problem? Such would be the situation if one prayed to a God whose existence one had bracketed.

Crying to such a God for help in an emergency is like shouting for help in a less dire situation without knowing whether there is anyone within earshot. Even the communists in Kerala might have found this possibility useful when the torrential rains filled them with the ‘fear of God’.

Saturday 15 September 2018

The myth of freedom

Yuval Noah Harari in The Guardian


Should scholars serve the truth, even at the cost of social harmony? Should you expose a fiction even if that fiction sustains the social order? In writing my latest book, 21 Lessons for the 21st Century, I had to struggle with this dilemma with regard to liberalism.

On the one hand, I believe that the liberal story is flawed, that it does not tell the truth about humanity, and that in order to survive and flourish in the 21st century we need to go beyond it. On the other hand, at present the liberal story is still fundamental to the functioning of the global order. What’s more, liberalism is now attacked by religious and nationalist fanatics who believe in nostalgic fantasies that are far more dangerous and harmful. 

So should I speak my mind openly, risking that my words could be taken out of context and used by demagogues and autocrats to further attack the liberal order? Or should I censor myself? It is a mark of illiberal regimes that they make free speech more difficult even outside their borders. Due to the spread of such regimes, it is becoming increasingly dangerous to think critically about the future of our species.

I eventually chose free discussion over self-censorship, thanks to my belief both in the strength of liberal democracy and in the necessity to revamp it. Liberalism’s great advantage over other ideologies is that it is flexible and undogmatic. It can sustain criticism better than any other social order. Indeed, it is the only social order that allows people to question even its own foundations. Liberalism has already survived three big crises – the first world war, the fascist challenge in the 1930s, and the communist challenge in the 1950s-70s. If you think liberalism is in trouble now, just remember how much worse things were in 1918, 1938 or 1968.


In 1968, liberal democracies seemed to be an endangered species, and even within their own borders they were rocked by riots, assassinations, terrorist attacks and fierce ideological battles. If you happened to be amid the riots in Washington on the day after Martin Luther King was assassinated, or in Paris in May 1968, or at the Democratic party’s convention in Chicago in August 1968, you might well have thought that the end was near. While Washington, Paris and Chicago were descending into chaos, Moscow and Leningrad were tranquil, and the Soviet system seemed destined to endure for ever. Yet 20 years later it was the Soviet system that collapsed. The clashes of the 1960s strengthened liberal democracy, while the stifling climate in the Soviet bloc presaged its demise.

So we hope liberalism can reinvent itself yet again. But the main challenge it faces today comes not from fascism or communism, and not even from the demagogues and autocrats that are spreading everywhere like frogs after the rains. This time the main challenge emerges from the laboratories.

Liberalism is founded on the belief in human liberty. Unlike rats and monkeys, human beings are supposed to have “free will”. This is what makes human feelings and human choices the ultimate moral and political authority in the world. Liberalism tells us that the voter knows best, that the customer is always right, and that we should think for ourselves and follow our hearts.



Unfortunately, “free will” isn’t a scientific reality. It is a myth inherited from Christian theology. Theologians developed the idea of “free will” to explain why God is right to punish sinners for their bad choices and reward saints for their good choices. If our choices aren’t made freely, why should God punish or reward us for them? According to the theologians, it is reasonable for God to do so, because our choices reflect the free will of our eternal souls, which are independent of all physical and biological constraints.

This myth has little to do with what science now teaches us about Homo sapiens and other animals. Humans certainly have a will – but it isn’t free. You cannot decide what desires you have. You don’t decide to be introvert or extrovert, easy-going or anxious, gay or straight. Humans make choices – but they are never independent choices. Every choice depends on a lot of biological, social and personal conditions that you cannot determine for yourself. I can choose what to eat, whom to marry and whom to vote for, but these choices are determined in part by my genes, my biochemistry, my gender, my family background, my national culture, etc – and I didn’t choose which genes or family to have.

 
Biometric sensors could allow corporations direct access to your inner world.

This is not abstract theory. You can witness this easily. Just observe the next thought that pops up in your mind. Where did it come from? Did you freely choose to think it? Obviously not. If you carefully observe your own mind, you come to realise that you have little control of what’s going on there, and you are not choosing freely what to think, what to feel, and what to want.

Though “free will” was always a myth, in previous centuries it was a helpful one. It emboldened people who had to fight against the Inquisition, the divine right of kings, the KGB and the KKK. The myth also carried few costs. In 1776 or 1945 there was relatively little harm in believing that your feelings and choices were the product of some “free will” rather than the result of biochemistry and neurology.

But now the belief in “free will” suddenly becomes dangerous. If governments and corporations succeed in hacking the human animal, the easiest people to manipulate will be those who believe in free will.

In order to successfully hack humans, you need two things: a good understanding of biology, and a lot of computing power. The Inquisition and the KGB lacked this knowledge and power. But soon, corporations and governments might have both, and once they can hack you, they can not only predict your choices, but also reengineer your feelings. To do so, corporations and governments will not need to know you perfectly. That is impossible. They will just have to know you a little better than you know yourself. And that is not impossible, because most people don’t know themselves very well.

If you believe in the traditional liberal story, you will be tempted simply to dismiss this challenge. “No, it will never happen. Nobody will ever manage to hack the human spirit, because there is something there that goes far beyond genes, neurons and algorithms. Nobody could successfully predict and manipulate my choices, because my choices reflect my free will.” Unfortunately, dismissing the challenge won’t make it go away. It will just make you more vulnerable to it.

It starts with simple things. As you surf the internet, a headline catches your eye: “Immigrant gang rapes local women”. You click on it. At exactly the same moment, your neighbour is surfing the internet too, and a different headline catches her eye: “Trump prepares nuclear strike on Iran”. She clicks on it. Both headlines are fake news stories, generated perhaps by Russian trolls, or by a website keen on increasing traffic to boost its ad revenues. Both you and your neighbour feel that you clicked on these headlines out of your free will. But in fact you have been hacked.


Propaganda and manipulation are nothing new, of course. But whereas in the past they worked like carpet bombing, now they are becoming precision-guided munitions. When Hitler gave a speech on the radio, he aimed at the lowest common denominator, because he couldn’t tailor his message to the unique weaknesses of individual brains. Now it has become possible to do exactly that. An algorithm can tell that you already have a bias against immigrants, while your neighbour already dislikes Trump, which is why you see one headline while your neighbour sees an altogether different one. In recent years some of the smartest people in the world have worked on hacking the human brain in order to make you click on ads and sell you stuff. Now these methods are being used to sell you politicians and ideologies, too.

And this is just the beginning. At present, the hackers rely on analysing signals and actions in the outside world: the products you buy, the places you visit, the words you search for online. Yet within a few years biometric sensors could give hackers direct access to your inner world, and they could observe what’s going on inside your heart. Not the metaphorical heart beloved by liberal fantasies, but rather the muscular pump that regulates your blood pressure and much of your brain activity. The hackers could then correlate your heart rate with your credit card data, and your blood pressure with your search history. What would the Inquisition and the KGB have done with biometric bracelets that constantly monitor your moods and affections? Stay tuned.

Liberalism has developed an impressive arsenal of arguments and institutions to defend individual freedoms against external attacks from oppressive governments and bigoted religions, but it is unprepared for a situation when individual freedom is subverted from within, and when the very concepts of “individual” and “freedom” no longer make much sense. In order to survive and prosper in the 21st century, we need to leave behind the naive view of humans as free individuals – a view inherited from Christian theology as much as from the modern Enlightenment – and come to terms with what humans really are: hackable animals. We need to know ourselves better. 

Of course, this is hardly new advice. From ancient times, sages and saints repeatedly advised people to “know thyself”. Yet in the days of Socrates, the Buddha and Confucius, you didn’t have real competition. If you neglected to know yourself, you were still a black box to the rest of humanity. In contrast, you now have competition. As you read these lines, governments and corporations are striving to hack you. If they get to know you better than you know yourself, they can then sell you anything they want – be it a product or a politician.

It is particularly important to get to know your weaknesses. They are the main tools of those who try to hack you. Computers are hacked through pre-existing faulty code lines. Humans are hacked through pre-existing fears, hatreds, biases and cravings. Hackers cannot create fear or hatred out of nothing. But when they discover what people already fear and hate it is easy to push the relevant emotional buttons and provoke even greater fury.

If people cannot get to know themselves by their own efforts, perhaps the same technology the hackers use can be turned around and serve to protect us. Just as your computer has an antivirus program that screens for malware, maybe we need an antivirus for the brain. Your AI sidekick will learn by experience that you have a particular weakness – whether for funny cat videos or for infuriating Trump stories – and would block them on your behalf.
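
A minimal sketch of how such a ‘mental antivirus’ might work, assuming a crude keyword filter – everything here, including the WEAKNESS_PATTERNS list and the screen_feed function, is my own hypothetical illustration, and a real sidekick would learn the triggers from behaviour rather than hard-code them:

```python
import re

# Hypothetical per-user weakness profile. In Harari's scenario an AI
# sidekick would learn these triggers from your clicking behaviour;
# here they are hard-coded purely for illustration.
WEAKNESS_PATTERNS = [
    r"\bimmigrant\b.*\bgang\b",   # a known outrage trigger
    r"\btrump\b.*\bnuclear\b",    # another known trigger
    r"\bcat videos?\b",           # a gentler compulsion
]

def screen_feed(headlines):
    """Return only the headlines that do not exploit a known weakness."""
    safe = []
    for headline in headlines:
        if any(re.search(p, headline.lower()) for p in WEAKNESS_PATTERNS):
            continue  # quarantine it, as antivirus software does malware
        safe.append(headline)
    return safe

feed = [
    "Immigrant gang rapes local women",
    "Trump prepares nuclear strike on Iran",
    "City council approves new bus routes",
]
print(screen_feed(feed))  # -> ['City council approves new bus routes']
```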



But all this is really just a side issue. If humans are hackable animals, and if our choices and opinions don’t reflect our free will, what should the point of politics be? For 300 years, liberal ideals inspired a political project that aimed to give as many individuals as possible the ability to pursue their dreams and fulfil their desires. We are now closer than ever to realising this aim – but we are also closer than ever to realising that this has all been based on an illusion. The very same technologies that we have invented to help individuals pursue their dreams also make it possible to re-engineer those dreams. So how can I trust any of my dreams?

From one perspective, this discovery gives humans an entirely new kind of freedom. Previously, we identified very strongly with our desires, and sought the freedom to realise them. Whenever any thought appeared in the mind, we rushed to do its bidding. We spent our days running around like crazy, carried by a furious rollercoaster of thoughts, feelings and desires, which we mistakenly believed represented our free will. What happens if we stop identifying with this rollercoaster? What happens when we carefully observe the next thought that pops up in our mind and ask: “Where did that come from?”

For starters, realising that our thoughts and desires don’t reflect our free will can help us become less obsessive about them. If I see myself as an entirely free agent, choosing my desires in complete independence from the world, it creates a barrier between me and all other entities. I don’t really need any of those other entities – I am independent. It simultaneously bestows enormous importance on my every whim – after all, I chose this particular desire out of all possible desires in the universe. Once we give so much importance to our desires, we naturally try to control and shape the whole world according to them. We wage wars, cut down forests and unbalance the entire ecosystem in pursuit of our whims. But if we understood that our desires are not the outcome of free choice, we would hopefully be less preoccupied with them, and would also feel more connected to the rest of the world.


People sometimes imagine that if we renounce our belief in “free will”, we will become completely apathetic, and just curl up in some corner and starve to death. In fact, renouncing this illusion can have two opposite effects: first, it can create a far stronger link with the rest of the world, and make you more attentive to your environment and to the needs and wishes of others. It is like when you have a conversation with someone. If you focus on what you want to say, you hardly really listen. You just wait for the opportunity to give the other person a piece of your mind. But when you put your own thoughts aside, you can suddenly hear other people.

Second, renouncing the myth of free will can kindle a profound curiosity. If you strongly identify with the thoughts and desires that emerge in your mind, you don’t need to make much effort to get to know yourself. You think you already know exactly who you are. But once you realise “Hi, this isn’t me. This is just some changing biochemical phenomenon!” then you also realise you have no idea who – or what – you actually are. This can be the beginning of the most exciting journey of discovery any human can undertake.



There is nothing new about doubting free will or about exploring the true nature of humanity. We humans have had this discussion a thousand times before. But we never had the technology before. And the technology changes everything. Ancient problems of philosophy are now becoming practical problems of engineering and politics. And while philosophers are very patient people – they can argue about something inconclusively for 3,000 years – engineers are far less patient. Politicians are the least patient of all.

How does liberal democracy function in an era when governments and corporations can hack humans? What’s left of the beliefs that “the voter knows best” and “the customer is always right”? How do you live when you realise that you are a hackable animal, that your heart might be a government agent, that your amygdala might be working for Putin, and that the next thought that emerges in your mind might well be the result of some algorithm that knows you better than you know yourself? These are the most interesting questions humanity now faces.

Unfortunately, these are not the questions most humans ask. Instead of exploring what awaits us beyond the illusion of “free will”, people all over the world are now retreating to find shelter with even older illusions. Instead of confronting the challenge of AI and bioengineering, many are turning to religious and nationalist fantasies that are even less in touch with the scientific realities of our time than liberalism. Instead of fresh political models, what’s on offer are repackaged leftovers from the 20th century or even the middle ages.

When you try to engage with these nostalgic fantasies, you find yourself debating such things as the veracity of the Bible and the sanctity of the nation (especially if you happen, like me, to live in a place like Israel). As a scholar, I find this a disappointment. Arguing about the Bible was hot stuff in the age of Voltaire, and debating the merits of nationalism was cutting-edge philosophy a century ago – but in 2018 it seems a terrible waste of time. AI and bioengineering are about to change the course of evolution itself, and we have just a few decades to figure out what to do with them. I don’t know where the answers will come from, but they are definitely not coming from a collection of stories written thousands of years ago.

So what to do? We need to fight on two fronts simultaneously. We should defend liberal democracy, not only because it has proved to be a more benign form of government than any of its alternatives, but also because it places the fewest limitations on debating the future of humanity. At the same time, we need to question the traditional assumptions of liberalism, and develop a new political project that is better in line with the scientific realities and technological powers of the 21st century.

Greek mythology tells that Zeus and Poseidon, two of the greatest gods, competed for the hand of the goddess Thetis. But when they heard the prophecy that Thetis would bear a son more powerful than his father, both withdrew in alarm. Since gods plan on sticking around for ever, they don’t want a more powerful offspring to compete with them. So Thetis married a mortal, King Peleus, and gave birth to Achilles. Mortals do like their children to outshine them. This myth might teach us something important. Autocrats who plan to rule in perpetuity don’t like to encourage the birth of ideas that might displace them. But liberal democracies inspire the creation of new visions, even at the price of questioning their own foundations.