
Showing posts with label complexity. Show all posts

Friday 21 January 2022

Pakistan: Towards a modern Riyasat-e-Madina

Nadeem F Paracha in The Friday Times

On January 17, an article written by PM Imran Khan appeared in some English and Urdu dailies. In it, the Pakistani prime minister shared his thoughts on ‘Riyasat-e-Madinah,’ or the first ‘Islamic state’ that came into being in early 7th-century Arabia. PM Khan wrote that Pakistan would need to adopt the moral and spiritual tenor of that state if the country was to thrive.

Even before he came to power in 2018, Khan had been promising to turn Pakistan into a modern-day Riyasat-e-Madinah. He first began to formulate this as a political message in 2011. However, this idea is not a new one. It has been posited previously as well by some politicians, and especially, by certain Islamist ideologues. Neither is there any newness in the process used by Khan to arrive at this idea. Khan took the same route as ZA Bhutto’s Pakistan People’s Party (PPP) took decades ago. In 1967, when the PPP was formed, its ‘foundation documents’ — authored by Bhutto and the Marxist intellectual J.A. Rahim — described the party as a socialist entity. To neutralise the expected criticism from Islamist groups, the documents declared that democracy was the party’s policy, socialism was its economy, and Islam was its faith.

The documents then added that by “socialism” the party meant the kind of democratic-socialism practiced in Scandinavian countries such as Norway, Sweden, Denmark and Finland, and through which these countries had constructed robust welfare states. But this did not impress Islamist outfits, especially the Jamat-i-Islami (JI), which declared the PPP a party of ‘atheists.’ In 1969, JI’s chief, Abul Ala Maududi, authored a fatwa declaring socialism an atheistic idea. The next year, when the PPP drafted its first ever manifesto, the party explained that its aim to strive for democracy, a “classless society,” economic equality and social justice “flows from the political and economic ethics of Islam.”

After coming to power in December 1971, the PPP began using the term “Musawat-e-Muhammadi” (social and economic equality preached and practiced by Islam’s holy Prophet [PBUH]). In 1973, a prominent member of the PPP, Sheikh Ahmed Rashid, declared that the economic system that Islam advocated and the one that was implemented in the earliest state of Islam was socialist. When a parliament member belonging to an Islamist party demanded that Islamic rituals be made compulsory by law “because Pakistan was made in the name of Islam,” Rashid responded by saying that the country was not made to implement rituals, but to adopt an “Islamic economy” which was “inherently socialist.”

Now let us see just how close all this is to the route that PM Khan and his Pakistan Tehreek-i-Insaf (PTI) took in formulating their concept of Riyasat-i-Madinah. In 2011, PTI and Khan suddenly rose to prominence as a party of urban middle-classes and the youth. In his speeches between 2011 and 2015, Khan was quite vocal in his appreciation of the Scandinavian welfare states. But, often, this appreciation was immediately followed by Khan declaring that the non-Muslim Scandinavians had uncannily followed Islamic ideals of social justice and economic equality better than the Muslims had (or do). Of course, he did not mention that Scandinavian countries are some of the most secular nation-states in the world, and that a strong secular-humanist disposition of their polities and politics played a major role in the construction of the welfare states that Khan was in such awe of.

As the 2018 elections drew near, Khan began to explain the concept of the European welfare state as a modern-day reflection of the 7th-century state that was formed in the city of Madinah. This notion was close to Bhutto’s Musawat-e-Muhammadi. But Bhutto and his PPP had claimed that the Islamic state in Madinah had a socialist economy, and that this alone should be adopted by Pakistan, because it was still relevant in the 20th century. This position had given the PPP enough space to remain secular in most other areas. But to Khan, if the Scandinavian model of the welfare state is adopted, and then supplemented by Islam’s moral, spiritual and political ethos in all fields and areas, this would result in the modern-day re-enactment of a 7th-century ‘Islamic state.’ Khan’s idea in this context is thus more theocratic in nature.

Khan’s concept seemed to be emerging from how Pakistan was imagined by some pro-Jinnah ulema during the 1946 elections in British India. To Mr. Jinnah’s party, the All India Muslim League (AIML), the culture of Indian Muslims largely mirrored the culture of Muslims outside South Asia, particularly in Arabia and even Persia. But the politics and economics of India’s Muslims were grounded in India and/or in the territory that they had settled in 500 years ago. Therefore, the Muslim-majority state that the League was looking to create was to be established in this territory. The League’s Muslim nationalism was thus territorial. It was not to be a universal caliphate or a theocracy with imperial and expansionist aims. It was to be a sovereign political enclave in South Asia where the Muslim minority of India would become a majority, thus benefiting from the economic advantages of majoritarianism.

However, whereas this narrative – more or less – worked in attracting the votes of the Muslims of Bengal and Sindh during the 1946 polls, the League found itself struggling in Punjab, which was a bastion of the multicultural Unionist Party. The Congress, too, was strong here. Various radical Islamist groups were also headquartered in Punjab. They had rejected the League’s call for a separate country. They believed that it would turn the remaining Muslims in India into an even more vulnerable minority. The Islamists viewed the League as a secular outfit with westernised notions of nationalism and an impious leadership.

This is when some ulema switched sides and decided to support the League in Punjab. This is also when overt Islamist rhetoric was, for the first time, used by the League through these ulema, mainly in Punjab’s rural areas. The ulema began to portray Jinnah as a ‘holy figure,’ even though very few rural Punjabis had actually seen him. The well-known Islamic scholar Shabbir Ahmed Usmani, who left the anti-Jinnah Jamiat Ulema Islam Hind (JUIH) to support the League, began to explain the yet-to-be-born Pakistan as a “naya Madinah,” or new Madinah.

By this, Usmani meant the creation of a state that would be based on the model of the 7th century state in Madina. But, much to the disappointment of the pro-League ulema, the model adopted by Pakistan was largely secular and the Islam that the state espoused was carved from the ideas of ‘Muslim modernists’ such as the reformer Sir Syed Ahmad Khan (d.1898) and the poet-philosopher Muhammad Iqbal (d.1938), who urged Muslims to look forward with the aid of an evolved and rational understanding of Islam, instead of looking backwards to a romanticised past.

***

 
Khan often spoke about Riyasat-i-Madinah before he became PM. But the frequency of him doing so has increased in the last year and a half – or since his government began to truly unravel. Today, it is in deep crisis and expected to either be eased out by the Parliament, or knocked out in a ruder manner before it completes its term in 2023. The economy is in shambles, inflation and unemployment rates are climbing, and so are tensions between the government and its erstwhile patrons, the military establishment.

Amidst the growing crises, Khan has spoken more about morality, ‘westernisation,’ and Islamophobia than on how his government is planning to address the mounting economic problems that the country is facing, and the consequential political quagmire that his government has plunged into. Yet, he still somehow found a reason – or for that matter, the audacity – to lecture the polity on the moral and spiritual principles of Riyasat-i-Madinah, or the kind of morally upright and pious state and society that he is dreaming of constructing. One wonders if he is planning to do this with the large IMF loans that his government has had to acquire to keep the country from going bankrupt.

Khan’s article on Riyasat-i-Madinah was censured by the opposition parties. They saw it as a political ploy by him to distract the people from the failures of his government. There is evidence that a PR company hired by Khan has been advising him to raise the frequency of his Islamic rhetoric. The purpose behind this could be what the opposition is claiming. It might also be about something personal. But for men such as Khan, the personal often becomes the political.

According to political scientist David O’Connell, it is crisis, not political convenience, that more often brings out religion in politicians. In his book God Wills It, O’Connell argues that when public opinion of political leaders begins to dwindle, or when a head of state or government is threatened, that is when one sees religious rhetoric appear.

In 1976, when the Bhutto regime was struggling to address economic problems caused by an international oil crisis that had pushed up inflation, and due to the regime’s own mismanagement of important economic sectors that it had nationalised, Bhutto decided to organise a grand ‘Seerat Conference’ in Karachi. The conference was organised to discuss and highlight the life and deeds of Islam’s Prophet (PBUH) and how these could be adopted to regenerate the lost glory of the Islamic civilisation. Khan did exactly the same, late last year.

Bhutto’s intended audience in this respect was the Islamists, whom he had uncannily emboldened by agreeing to their demand of constitutionally ousting the Ahmadiyya community from the fold of Islam. He believed that this would neutralise the threat that the Islamists were posing to his ‘socialist’ government. The demand had arisen when Islamist groups in the Parliament asked the government to add a provision to the constitution defining what or who was a Muslim. In 1973, the government had refused to add any such provision in the constitution. But the very next year, in 1974, when a clash between a group of Ahmadiyya youth and cadres of the student-wing of JI caused outrage amongst Islamist parties, they tabled a bill in the National Assembly which sought to constitutionally declare the Ahmadiyya a non-Muslim community.

Bhutto threatened to unleash the military against anti-Ahmadiyya agitators who had besieged various cities of Punjab. According to Rafi Raza, who, at the time, was a special assistant to the prime minister, Bhutto insisted that the Parliament was no place to discuss theological matters. In his book ZA Bhutto and Pakistan, Raza wrote that the Islamist parties retorted by reminding the PM that in 1973 the constitution had declared Pakistan an ‘Islamic republic’ – and therefore, parliamentarians in an Islamic republic had every right to discuss religious matters.

After much violence in Punjab and commotion in the National Assembly, Bhutto capitulated and allowed the bill to be passed. This also meant that parliamentarians now had the constitutional prerogative to define who was or wasn’t a Muslim. This would eventually lead to the 1985 amendments in Articles 62 and 63 of the constitution, proclaiming that only ‘pious’ Muslims can be members of the Parliament and heads of state and government. The man who had initiated this, the dictator Zia-ul-Haq, had already declared himself ‘Sadiq and Amin’ (honest and faithful).

In 1976, Bhutto’s Islamist opponents were deriding him as a ‘bad Muslim,’ because he had ‘loose morals’ and was an ‘alcoholic,’ and because his government was as bad at fixing the economy as it was at curbing the “rising trend of obscenity and immorality in the society.” So, with the Seerat Conference, Bhutto set out to exhibit his Islamic credentials and, perhaps, to also demonstrate that his regime may be struggling to fix the economy, but, at least, it was being headed by a ‘true believer.’ But this didn’t save him from being toppled in a military coup that was triggered by his opponents who, in 1977, had poured out to agitate and demand a government based on Shariah laws.

Khan is on a similar path. He had been ‘reforming’ himself ever since he retired in 1992 as a cricketing star, a darling of the tabloid press, and a ‘playboy.’ From a lifestyle liberal who had spent much of his time stationed in the UK, playing cricket and hobnobbing with European and American socialites, he gradually began to refigure his image. After retirement from cricket at age 40, he was mostly seen with prominent military men such as General Hamid Gul, who had once been extremely close to the dictator Zia-ul-Haq.

Khan also began to have one-on-one meetings with certain ulema and Islamic evangelical groups. Khan’s aim was to bury his colourful past and re-emerge as an incorruptible born-again Muslim. But his past was not that easy to get rid of. It kept being brought up by the tabloids and also by Nawaz Sharif’s centre-right PML-N, which began to see him as a threat because Khan was trying to appeal to Sharif’s constituency. Sharif was a conservative and a protégé of Zia. In 1998, his second regime was struggling to fix an economy that had begun to spiral down after Pakistan conducted its first nuclear tests. This triggered economic sanctions against Pakistan, imposed by its major donors and trading partners, the US and European countries.

The crisis saw Nawaz formulate his own ‘Islamic’ shenanigans. His crusades against obscenity were coupled with his desire to be declared ‘the commander of the faithful’ (amir-ul-mominin). Instead, he was brought down by a coup in 1999. Unlike the coup against the Bhutto regime, which was planned by a reactionary general, the one against Nawaz (by General Musharraf) was apparently staged to fix the economy and roll back the influence that the Islamists had enjoyed, especially during the Zia and Nawaz regimes.

Khan’s party initially supported the coup against Nawaz. But it pulled back its support when the PTI was routed in the 2002 elections. Khan began criticising Musharraf as an “American stooge” and “fake liberal.” Musharraf responded by claiming that Khan had asked to be made prime minister. Musharraf then added that Khan’s ideas were “like those of a mullah.” One wonders whether this statement annoyed Khan or delighted him. Because remember, he was trying his best to bury his glitzy past and convince everyone that he was now a pious gentleman who wanted to employ ‘true Islamic principles’ in the country’s politics and polity.

---

But weren’t many of these ‘principles’ already made part of the country’s constitution and penal code by the likes of ZA Bhutto, Zia and Nawaz?

From 1974 onwards, Pakistan started to become what the Canadian political scientist Ran Hirschl described as a “constitutional theocracy.” The phrase was initially coined by the French political scientist Olivier Roy for Iran’s post-revolution constitution. Hirschl expanded it in a 2010 book in which he studied the increasing Islamisation of constitutions in certain Muslim countries, and the problems these constitutions were facing in coming to terms with various contemporary political, legislative and social challenges.

Constitutional theocracies empower the Islamists even if they are a minority in the Parliament. This is quite apparent in Pakistan. According to Syed Adnan Hussain, Associate Professor of Religious Studies at Saint Mary’s University, even though most Islamists scoff at democracy, there were also some prominent Islamist ideologues who posited that constitutions, judicial review, legal codes and a form of democratic election could be integrated into an Islamic state. Abul Ala Maududi and Maulana Taqi Usmani were two such ideologues. They agreed to use whatever means were available to turn Pakistan into an Islamic state. And these included democratic institutions, processes and the constitution.

Involvement of the ulema in drafting the 1956 constitution was nominal, even though the constitution did declare the country an Islamic republic. Their contribution in drafting the 1962 constitution was extremely minimal. And even though there were just 18 members of various Islamist parties in the National Assembly which came into being after 1971, their input increased during the drafting of the 1973 constitution. Their influence continued to grow. By 1991, the constitution had been greatly Islamised.

Therefore, even the more electorally strong non-Islamist parties have had to add various Islamist ideas to their armoury, because as the American author Shadi Hamid wrote: “Private religious devotion (to Islamists) is inseparable from political action. Islam is to be applied in daily life, including in the public realm. And to fail to do so is to shirk one’s obligations towards God. Faith, or at least their faith, gives (the Islamists) a built-in political advantage.” It is this advantage that the non-Islamist politicians want to usurp. They frequently find themselves pressed to continue positioning themselves as equally pious champions of Islam. Khan is doing exactly that. Bhutto did so in the second half of his rule, and Nawaz during his second stint as PM. But, of course, this does not come naturally to non-Islamists. And the Islamists are never convinced. In fact, they see it as a way for non-Islamists to neutralise the political influence of the Islamists. Yet, in times of crisis, many non-Islamist heads of government in the country have curiously leaned towards religion, believing that by adopting an ‘Islamic’ demeanour, they would be able to pacify public anger towards their failing regimes.

This can be a desperate and last-ditch ploy to survive a fall. But on occasion, it can also be about a personal existential crisis – which makes it even worse. I believe Khan is a case in point. Indeed, there is an element of political amorality in his increasingly fervent moral rhetoric and religious exhibitionism. According to the British journalist and documentary filmmaker Adam Curtis, as the world continues to become more complicated than ever, political leaders are increasingly struggling to comprehend today’s complexities and, thus, failing to formulate and provide a coherent vision of the future. They are attempting to define the complexity of today’s realities in an overtly simple manner.

Driven by a demand to simplify modern-day complexities, the leaderships, instead of trying to figure out new ways forward, have begun to look backwards, promising to bring forth a past that was apparently better and less complicated. But the recollection of such pasts is often not very accurate, because it involves a nostalgia which is referred to as ‘Anemoia’, or a nostalgia for a time one has never known. A past that is not a lived experience. A past that is largely imagined.

Khan likes to talk about the 7th-century state in Madinah. But as the anthropologist Irfan Ahmad and the historian Patricia Crone have demonstrated, there was no clear concept of a state anywhere in pre-modern times, east or west. The idea of the state as we know it today began to emerge after the 17th century and matured from the 19th century onwards. It is a European concept. What is more, according to Ahmad, the idea of an Islamic state is a 20th century construct. It is derived from an imagined memory. Pre-modern states were vastly different from what they became from the 19th century onwards. States in pre-modern times had extremely limited capacity or resources to regulate every aspect of life.

They were impersonal and mostly erected to collect taxes from the subjects so that landed elites and monarchs could sustain standing armies, mount their wars, and retain power. A majority of the subjects were left to their own devices, as long as they did not rebel. Conquered areas were mostly put in the hands of local leaders on the condition that they would remain loyal to the conquerors. Ancient states in Muslim regions and in the regions that the Muslims conquered were no different. But 20th-century Islamic ideologues began to speak of creating Islamic states. According to Ahmad, the idea of an Islamic state was the result of how the concept of the modern state had begun to fascinate ideologues and politicians in India.

The Congress began to talk about an Indian state, the League began to work towards a Muslim-majority state, the socialists towards a socialist state, and Islamists like Maududi began musing about an Islamic state. Shabbir Usmani and Maududi projected the idea and reality of an all-encompassing modern state as a way to explain the functions of the 7th-century Madinah state, as if it had functioned like a modern state, regulating the lives of its subjects with coded laws, interventions, constitutions and through other established state institutions. This was not the case. What is more, there was little or no scholarship in the premodern Muslim world on political ideas or philosophy. These would only begin to appear in the 14th century, in the works of the Arab scholar Ibn Khaldun.

PM Khan is thus dealing in anemoia. He, like 20th-century critics of modernity, is raging against its supposedly cold and mechanical disposition. But instead of offering something new, he is investing more effort in trying to revive romanticised pasts which did not exist in the shape in which they are often remembered. Khan’s failure to address the mounting problems of the here and now, and his insistence on creating a theocratic potpourri of schemes already exhausted by Islamist ideologues – and by heads of state and government such as Bhutto, Zia and Nawaz – may as well be the last nail in the coffin of a much-exploited idea that is almost entirely based on a politically motivated and largely imagined memory.

Tuesday 19 May 2015

Why I choose to have less choice

Shopping around is the mantra of the modern era. But who really benefits from our befuddlement?

Tim Lott in The Guardian

Once, when I was suffering a fit of depression, I walked into a supermarket to buy a packet of washing powder. Confronted by a shelf full of different possibilities, I stood there for 15 minutes staring at them, then walked out without buying any washing powder at all.
I still feel echoes of that sensation of helplessness. If I just want to buy one item but discover that if I buy three of the items I will save myself half the item price, I find myself assailed by choice paralysis.
I hate making consumer choices at the best of times, because I have this uncomfortable suspicion that big companies are trying to gull me out of as much money as possible, using sophisticated techniques designed by people who are smarter than I am.
For instance, when I buy an insurance product, how can I decide whether I should just buy the cheapest, or the best? The best is the one most likely to pay out without penalty or fuss, but that information is much harder to find out than factors such as cost, extent of cover, etc. It’s complicated. So I often try not to make choices – by just putting my payments for insurance with my usual insurers on direct debit, for example, which means I don’t have to think about shopping around.
This issue of choice and complexity lies at the heart of the experience of being modern. It penetrates commerce, politics and our personal lives. It may even be connected to the fact that there are higher levels of depression in society than ever before.
This idea was suggested by Barry Schwartz in his book The Paradox of Choice. Choice oppresses us. Why? Because there are too many choices and they are often too complex for us to be confident that we are making the right one.
When you might have 200 potential choices to make of a particular style of camera, it is difficult to feel sure you have chosen the right one – even if you spend an inordinate amount of time trying to make a rational decision. Or you may see the same model two weeks after you’ve bought it being sold more cheaply. When there was less choice and fewer types of camera, this kind of experience was rare. Our capacity for hindsight has become a means of punishing ourselves.
Complexity is not entirely accidental. Late capitalism solves the dilemma of competition (for the producer) through complexity. To try to choose a mortgage, or a pension, or a computer, requires a tremendous amount of application, so we become relatively easy to gull. Whether it is a power company or a loan company, we struggle to understand tariffs, terms and the small print. Exhausted, we just take a stab and hope for the best, or we succumb to inertia; choose what we have always chosen. Consumers are thrown back on simple cues that are advantageous to the producers, such as brand recognition.
Complexity also impacts on politics. Once it was pretty clear who to vote for – your class position, on the whole, made it a simple matter of self-interest for most voters. Now we have become closer to what is ironically the democratic ideal – ie choice-making actors – voting is more of a challenge than it once was. Do you really have a good enough grasp of economic theory to judge whether it is best to spend or save in a recession? Do you understand the complexities of private provision in the NHS enough to rule it out? Do you know enough about international affairs to support a reduction in defence spending, or a retreat from the EU? Most people don’t – so, again, they make snap judgments based on loyalty and sentiment.
This problem of choice and complexity is ubiquitous. It applies in medicine. If I am ill and asked to make a choice about treatment, I would often rather leave the choice to the doctor, if only because if the wrong choice is made, I am not going to feel nearly so bad about it. I had a prostate cancer scare recently, and I just wanted to be told what to do – not decide whether, say, I should choose an operation that would guarantee impotency in order to stave off a 5% chance of cancer. The burden of choice was too big.
In the field of education a similar dilemma applies. Once your child went to the local primary or secondary. Now you have to decide from a bewildering number of types of school. In the personal realm, once, you stayed married for life. Now, if you are in an unhappy marriage you have to decide whether to stay or not. These may be all positive developments, but they come at a cost – the potential for regret.
So how should one react to complexity? Schwartz suggests we should limit choice, not extend it. If you are shopping for food, go to supermarkets that are priced simply with a limited range, such as Aldi and Lidl. Recognise and accept complexity – which means accepting that you can never be sure that you’ve made the right choice.
Above all, don’t fall for the old trope of only wanting “the best”. Schwartz calls such people “maximisers” – people who are never happy, because they have expectations that can never be met, since in a world of complexity and unlimited choice there is always a better option. Be a “satisficer” instead – people who are happy to say “that’s good enough”, or “it’ll do”.
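Schwartz’s distinction maps neatly onto two search strategies familiar from decision theory (the terms go back to Herbert Simon’s work on bounded rationality). A minimal, illustrative sketch – the option list, scores and threshold are invented for the example, not taken from the article – of the difference:

```python
def maximise(options, score):
    """The maximiser's strategy: examine every option, return the single best."""
    return max(options, key=score)

def satisfice(options, score, good_enough):
    """The satisficer's strategy: return the first option that clears the bar."""
    for option in options:
        if score(option) >= good_enough:
            return option  # "it'll do" - stop searching here
    return None  # nothing met the threshold

# Hypothetical cameras with review scores out of 10.
cameras = [("A", 6), ("B", 8), ("C", 9), ("D", 7)]
score = lambda camera: camera[1]

print(maximise(cameras, score))      # scans all four options
print(satisfice(cameras, score, 7))  # stops at the first acceptable one
```

The maximiser must inspect every option (and, as Schwartz notes, still second-guesses the result), while the satisficer stops as soon as the threshold is met – which is precisely why the cost of searching stays bounded no matter how many choices the shelf holds.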
This may not work in politics – saying the Conservatives “will do” when you wanted the Green party is not very satisfactory – but as a consumer, and in life generally, it’s a pretty good formula. It’ll do, anyway.

Wednesday 10 July 2013

In today's corporations the buck never stops. Welcome to the age of irresponsibility


Our largest companies have become so complex that no one's expected to fully know what's going on. Yet the rewards are bigger than ever
Hon Hai's Foxconn plant in Shenzhen, China, in 2010. That year there were 12 suicides in the 300,000-strong workforce. 'The top managers of Apple escaped blame because these deaths happened in ­factories in another country (China) owned by a company from yet another country (Hon Hai, the Taiwanese ­multinational).' Photograph: Qilai Shen

George Osborne confirmed on Monday that he would accept the recommendation of Britain's parliamentary commission on banking standards and add to his banking reform bill a new offence of "reckless misconduct in the management of a bank".
That is a bit of a setback for the managerial class, but it still does not sufficiently change the overall picture that it is a great time to be a top manager in the corporate world, especially in the US and Britain.
Not only do they give you a good salary and handsome bonus, but they are really understanding when you fail to live up to expectations. If they want to show you the door in the middle of your term, they will give you millions of dollars, even tens of millions, in "termination payment". Even if you have totally screwed up, the worst that can happen is that they take away your knighthood or make you give up, say, a third of your multimillion-pound pension pot.
Even better, the buck never stops at your desk. It usually stops at the lowest guy in the food chain – a rogue trader or some owner of a two-bit factory in Bangladesh. Occasionally you may have to blame your main supplier, but rarely your own company, and never yourself.
Welcome to the age of irresponsibility.
The largest companies today are so complex that top managers are not even expected to know fully what is really going on in them. These companies have also increasingly outsourced activities to multiple layers of subcontractors in supply chains crisscrossing the globe.
Increasing complexity not only lowers the quality of decisions, as it creates an information overload, but makes it more difficult to pin down responsibilities. A number of recent scandals have brought home this reality.
The multiple suicides of workers in Foxconn factories in China have revealed Victorian labour conditions down the supply chains for the most futuristic Apple products. But the top managers of Apple escaped blame because these deaths happened in factories in another country (China) owned by a company from yet another country (Hon Hai, the Taiwanese multinational).
No one at the top of the big supermarkets took serious responsibility in the horsemeat scandal because, it was accepted, they could not be expected to police supply chains running from Romania through the Netherlands, Cyprus and Luxembourg to France (and that is only one of several chains involved).
The problem is even more serious in the financial sector, which these days deals in assets that involve households (in the case of mortgages), companies and governments all over the world. On top of that these financial assets are combined, sliced and diced many times over, to produce highly complicated "derivative" products. The result is an exponential increase in complexity.
Andy Haldane, executive director of financial stability at the Bank of England, once pointed out that in order to fully understand a CDO² – one of the more complicated financial derivatives (but not the most complicated) – a prospective investor needs to absorb more than a billion pages of information. I have come across bankers who confessed that they had derivative contracts running to a few hundred pages, which they naturally didn't have time to read.
Given this level of complexity, financial companies have come to rely heavily on countless others – stock analysts, financial journalists, credit-rating agencies, you name it – for information and, more importantly, making judgments. This means that when something goes wrong, they can always blame others: poor people in Florida who bought houses they cannot afford; "irresponsible" foreign governments; misleading foreign stock analysts; and, yes, incompetent credit-rating agencies.
The result is an economic system in which no one in "responsible" positions takes any serious responsibility. Unless radical action is taken, we will see many more financial crises and corporate scandals in the years to come.
The first thing we need is to modernise our sense of crime and punishment. Most of us still instinctively subscribe to the primeval notion of crime as a direct physical act – killing someone, stealing silver. But in the modern economy, with a complex division of labour, indirect non-physical acts can also seriously harm people. If misbehaving financiers and incompetent regulators cause an economic crisis, they can indirectly kill people by subjecting them to unemployment-related stress and by reducing public health expenditure, as shown by books like The Body Economic. We need to accept the seriousness of these "long-distance crimes" and strengthen punishments for them.
More importantly, we need to simplify our economic system so that responsibilities are easier to determine. This is not to say we have to go back to the days of small workshops owned by a single capitalist: increased complexity is inevitable if we are to increase productivity. However, much of the recent rise in complexity has been designed to make money for certain people, at the cost of social productivity. Such socially unproductive complexity needs to be reduced.
Financial derivatives are the most obvious examples. Given their potential to exponentially increase the complexity of the financial system – and thus the degree of irresponsibility within it – we should only allow such products when their creators can prove their productivity and safety, similar to how the drug approval process works.
The negative potential of outsourcing in non-financial industries may not be as great as that of financial derivatives, but the buying companies should be made far more accountable for making their subcontractors comply with rules regarding product safety, working conditions and environmental standards.
Without measures to simplify the system and recalibrate our sense of crime and punishment, the age of irresponsibility will destroy us all.

Sunday 12 February 2012

The mathematical equation that caused the banks to crash

Ian Stewart in The Observer, 12 February 2012

It was the holy grail of investors. The Black-Scholes equation, brainchild of economists Fischer Black and Myron Scholes, provided a rational way to price a financial contract when it still had time to run. It was like buying or selling a bet on a horse, halfway through the race. It opened up a new world of ever more complex investments, blossoming into a gigantic global industry. But when the sub-prime mortgage market turned sour, the darling of the financial markets became the Black Hole equation, sucking money out of the universe in an unending stream.

Anyone who has followed the crisis will understand that the real economy of businesses and commodities is being upstaged by complicated financial instruments known as derivatives. These are not money or goods. They are investments in investments, bets about bets. Derivatives created a booming global economy, but they also led to turbulent markets, the credit crunch, the near collapse of the banking system and the economic slump. And it was the Black-Scholes equation that opened up the world of derivatives.

The equation itself wasn't the real problem. It was useful, it was precise, and its limitations were clearly stated. It provided an industry-standard method to assess the likely value of a financial derivative. So derivatives could be traded before they matured. The formula was fine if you used it sensibly and abandoned it when market conditions weren't appropriate. The trouble was its potential for abuse. It allowed derivatives to become commodities that could be traded in their own right. The financial sector called it the Midas Formula and saw it as a recipe for making everything turn to gold. But the markets forgot how the story of King Midas ended.

Black-Scholes underpinned massive economic growth. By 2007, the international financial system was trading derivatives valued at one quadrillion dollars per year. This is 10 times the total worth, adjusted for inflation, of all products made by the world's manufacturing industries over the last century. The downside was the invention of ever-more complex financial instruments whose value and risk were increasingly opaque. So companies hired mathematically talented analysts to develop similar formulas, telling them how much those new instruments were worth and how risky they were. Then, disastrously, they forgot to ask how reliable the answers would be if market conditions changed.

Black and Scholes invented their equation in 1973; Robert Merton supplied extra justification soon after. It applies to the simplest and oldest derivatives: options. There are two main kinds. A put option gives its buyer the right to sell a commodity at a specified time for an agreed price. A call option is similar, but it confers the right to buy instead of sell. The equation provides a systematic way to calculate the value of an option before it matures. Then the option can be sold at any time. The equation was so effective that it won Merton and Scholes the 1997 Nobel prize in economics. (Black had died by then, so he was ineligible.)
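The two payoffs at expiry are simple enough to state in code. The sketch below is only an illustration of the definitions just given (the function names are ours, not standard):

```python
def call_payoff(spot, strike):
    """Payoff at expiry of a call: the right to buy at `strike`."""
    return max(spot - strike, 0.0)

def put_payoff(spot, strike):
    """Payoff at expiry of a put: the right to sell at `strike`."""
    return max(strike - spot, 0.0)

# A call struck at 100 is worth something only if the asset ends above 100;
# a put struck at 100 pays off only if the asset ends below it.
print(call_payoff(120.0, 100.0))  # 20.0
print(put_payoff(80.0, 100.0))    # 20.0
```

What Black-Scholes adds is the hard part: a value for the option *before* expiry, while the outcome is still uncertain.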

If everyone knows the correct value of a derivative and they all agree, how can anyone make money? The formula requires the user to estimate several numerical quantities. But the main way to make money on derivatives is to win your bet – to buy a derivative that can later be sold at a higher price, or matures with a higher value than predicted. The winners get their profit from the losers. In any given year, between 75% and 90% of all options traders lose money. The world's banks lost hundreds of billions when the sub-prime mortgage bubble burst. In the ensuing panic, taxpayers were forced to pick up the bill, but that was politics, not mathematical economics.

The Black-Scholes equation relates the recommended price of the option to four other quantities. Three can be measured directly: time, the price of the asset upon which the option is secured and the risk-free interest rate. This is the theoretical interest that could be earned by an investment with zero risk, such as government bonds. The fourth quantity is the volatility of the asset. This is a measure of how erratically its market value changes. The equation assumes that the asset's volatility remains the same for the lifetime of the option, which need not be correct. Volatility can be estimated by statistical analysis of price movements but it can't be measured in a precise, foolproof way, and estimates may not match reality.
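One common such statistical estimate is the annualised standard deviation of log returns. A minimal sketch, assuming daily closing prices and the 252-trading-day-per-year convention (both are conventions of practice, not part of the equation itself):

```python
import math

def historical_volatility(prices, periods_per_year=252):
    """Annualised volatility from a series of closing prices,
    via the sample standard deviation of log returns."""
    log_returns = [math.log(b / a) for a, b in zip(prices, prices[1:])]
    mean = sum(log_returns) / len(log_returns)
    variance = sum((r - mean) ** 2 for r in log_returns) / (len(log_returns) - 1)
    return math.sqrt(variance * periods_per_year)

# Five (made-up) daily closes; a flat price series has zero volatility.
print(historical_volatility([100, 101, 99, 102, 100]))
```

Note what the estimate quietly assumes: that the past few days' jitter is a good guide to the option's whole lifetime, which is exactly the assumption the article questions.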

The idea behind many financial models goes back to Louis Bachelier in 1900, who suggested that fluctuations of the stock market can be modelled by a random process known as Brownian motion. At each instant, the price of a stock either increases or decreases, and the model assumes fixed probabilities for these events. They may be equally likely, or one may be more probable than the other. It's like someone standing on a street and repeatedly tossing a coin to decide whether to move a small step forwards or backwards, so they zigzag back and forth erratically. Their position corresponds to the price of the stock, moving up or down at random.

The most important statistical features of Brownian motion are its mean and its standard deviation. The mean is the short-term average price, which typically drifts in a specific direction, up or down depending on where the market thinks the stock is going. The standard deviation can be thought of as the average amount by which the price differs from the mean, calculated using a standard statistical formula. For stock prices this is called volatility, and it measures how erratically the price fluctuates. On a graph of price against time, volatility corresponds to how jagged the zigzag movements look.
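Bachelier's coin-tossing picture is easy to simulate directly. In the sketch below every parameter (step size, probability, seed) is illustrative only:

```python
import random

def random_walk(steps=250, step=1.0, p_up=0.5, start=100.0, seed=42):
    """Bachelier's coin-toss price: at each tick, move up by `step`
    with probability p_up, otherwise move down by `step`."""
    rng = random.Random(seed)
    path = [start]
    for _ in range(steps):
        path.append(path[-1] + (step if rng.random() < p_up else -step))
    return path

path = random_walk()
drift = path[-1] - path[0]  # where the zigzag ended up relative to the start
print(len(path), drift)
```

Plotting `path` against time gives exactly the jagged zigzag described above; a larger `step` (higher volatility) makes the graph more jagged.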

Black-Scholes implements Bachelier's vision. It does not give the value of the option (the price at which it should be sold or bought) directly. It is what mathematicians call a partial differential equation, expressing the rate of change of the price in terms of the rates at which various other quantities are changing. Fortunately, the equation can be solved to provide a specific formula for the value of a put option, with a similar formula for call options.
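The equation itself reads ∂V/∂t + ½σ²S² ∂²V/∂S² + rS ∂V/∂S − rV = 0, where V is the option value, S the asset price, σ the volatility and r the risk-free rate. Solving it for European options gives the well-known closed-form prices, which fit in a few lines of standard-library Python (a sketch, with no dividends or transaction costs, per the model's assumptions):

```python
import math

def norm_cdf(x):
    """Cumulative distribution function of the standard normal."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def black_scholes(spot, strike, t, rate, vol, kind="call"):
    """Closed-form Black-Scholes price of a European option.
    spot: current asset price; strike: agreed price; t: years to expiry;
    rate: risk-free interest rate; vol: annualised volatility."""
    d1 = (math.log(spot / strike) + (rate + 0.5 * vol ** 2) * t) / (vol * math.sqrt(t))
    d2 = d1 - vol * math.sqrt(t)
    if kind == "call":
        return spot * norm_cdf(d1) - strike * math.exp(-rate * t) * norm_cdf(d2)
    return strike * math.exp(-rate * t) * norm_cdf(-d2) - spot * norm_cdf(-d1)

# An at-the-money one-year call at 5% interest and 20% volatility:
print(round(black_scholes(100, 100, 1.0, 0.05, 0.20), 2))  # 10.45
```

Every input is observable or estimable except the volatility, which, as noted above, is the weak point of the whole construction.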

The early success of Black-Scholes encouraged the financial sector to develop a host of related equations aimed at different financial instruments. Conventional banks could use these equations to justify loans and trades and assess the likely profits, always keeping an eye open for potential trouble. But less conventional businesses weren't so cautious. Soon, the banks followed them into increasingly speculative ventures.

Any mathematical model of reality relies on simplifications and assumptions. The Black-Scholes equation was based on arbitrage pricing theory, in which both drift and volatility are constant. This assumption is common in financial theory, but it is often false for real markets. The equation also assumes that there are no transaction costs, no limits on short-selling and that money can always be lent and borrowed at a known, fixed, risk-free interest rate. Again, reality is often very different.

When these assumptions are valid, risk is usually low, because large stock market fluctuations should be extremely rare. But on 19 October 1987, Black Monday, the world's stock markets lost more than 20% of their value within a few hours. An event this extreme is virtually impossible under the model's assumptions. In his bestseller The Black Swan, Nassim Nicholas Taleb, an expert in mathematical finance, calls extreme events of this kind black swans. In ancient times, all known swans were white, and "black swan" was used much as we might now speak of a flying pig. But in 1697, the Dutch explorer Willem de Vlamingh found masses of black swans on what became known as the Swan River in Australia. So the phrase now refers to an assumption that appears to be grounded in fact, but might at any moment turn out to be wildly mistaken.

Large fluctuations in the stock market are far more common than Brownian motion predicts. The reason is unrealistic assumptions – ignoring potential black swans. But usually the model performed very well, so as time passed and confidence grew, many bankers and traders forgot the model had limitations. They used the equation as a kind of talisman, a bit of mathematical magic to protect them against criticism if anything went wrong.

Banks, hedge funds, and other speculators were soon trading complicated derivatives such as credit default swaps – likened to insuring your neighbour's house against fire – in eye-watering quantities. They were priced and considered to be assets in their own right. That meant they could be used as security for other purchases. As everything got more complicated, the models used to assess value and risk deviated ever further from reality. Somewhere underneath it all was real property, and the markets assumed that property values would keep rising for ever, making these investments risk-free.

The Black-Scholes equation has its roots in mathematical physics, where quantities are infinitely divisible, time flows continuously and variables change smoothly. Such models may not be appropriate to the world of finance. Traditional mathematical economics doesn't always match reality, either, and when it fails, it fails badly. Physicists, mathematicians and economists are therefore looking for better models.

At the forefront of these efforts is complexity science, a new branch of mathematics that models the market as a collection of individuals interacting according to specified rules. These models reveal the damaging effects of the herd instinct: market traders copy other market traders. Virtually every financial crisis in the last century has been pushed over the edge by the herd instinct. It makes everything go belly-up at the same time. If engineers took that attitude, and one bridge in the world fell down, so would all the others.
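A toy model in this spirit: each trader holds a buy or sell position and, most of the time, simply copies another trader. All the parameters below are invented for illustration; real complexity-science models are far richer:

```python
import random

def herd_sim(n_traders=100, steps=500, follow_prob=0.8, seed=7):
    """Toy herd model: at each step one trader either copies a randomly
    chosen trader's position (+1 buy / -1 sell) with probability
    follow_prob, or picks a fresh position at random."""
    rng = random.Random(seed)
    positions = [rng.choice([-1, 1]) for _ in range(n_traders)]
    imbalance = []  # net buy/sell imbalance over time
    for _ in range(steps):
        i = rng.randrange(n_traders)
        if rng.random() < follow_prob:
            positions[i] = positions[rng.randrange(n_traders)]
        else:
            positions[i] = rng.choice([-1, 1])
        imbalance.append(sum(positions))
    return imbalance

print(herd_sim()[-1])  # net imbalance at the end of the run
```

With a high follow probability the imbalance tends to make long one-sided excursions rather than hovering near zero: the copying itself, not any outside news, generates the herd.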

Studies of ecological systems suggest that instability is common in economic models too, mainly because of the poor design of the financial system. The facility to transfer billions at the click of a mouse may allow ever-quicker profits, but it also makes shocks propagate faster.

Was an equation to blame for the financial crash, then? Yes and no. Black-Scholes may have contributed to the crash, but only because it was abused. In any case, the equation was just one ingredient in a rich stew of financial irresponsibility, political ineptitude, perverse incentives and lax regulation.

Despite its supposed expertise, the financial sector performs no better than random guesswork. The stock market has spent 20 years going nowhere. The system is too complex to be run on error-strewn hunches and gut feelings, but current mathematical models don't represent reality adequately. The entire system is poorly understood and dangerously unstable. The world economy desperately needs a radical overhaul and that requires more mathematics, not less. It may be rocket science, but magic it's not.

Ian Stewart is emeritus professor of mathematics at the University of Warwick.

Friday 9 September 2011

We can all learn from Gwyneth Paltrow

Terence Blacker in The Independent Friday, 9 September 2011

The scavengers who live off the scraps of celebrity scandal will be paying particular attention to the marriage of Gwyneth Paltrow and Chris Martin over the next few weeks. Not only are the couple blessed with talent, looks and success – provocation in itself – but she has just made a statement which has caused considerable outrage in some quarters.

Prepare to be shocked. "The more I live my life, the more I learn not to judge people for what they do," Gwyneth has said, quite openly and without apology. "I think we're all trying to do our best but life is complicated." As if that were not controversial enough, she added: "I know people I respect and admire and look up to who have had extra-marital affairs."

The response to these dangerous and reckless remarks has been predictable. There has been clammy speculation about the state of Paltrow's marriage. Seven years of married life, and suddenly she's relaxed about infidelity. What could be going on there? Then, naturally, there have been gusts of moral disapproval from the media. Famous people who express sane, reasonable views are instinctively mistrusted – we expect our celebrities to be out of touch and entertaining – and sneering reference has been made to Paltrow's "latest flirtation with controversy", not to mention her "superhuman compassion".

Paltrow is, though, making a worthwhile point. It is a very contemporary habit, the need to stand in judgment over every little muddle in which a public figure finds herself or himself and draw sorrowful conclusions from them, as a vicar does in a sermon. Scandal and misadventure have always been part of the media, but the busy drawing of moral lessons, the pious scolding from the sidelines, is something new. Disapproval is to the 21st century what primness was to the Victorians.

Both, to a large extent, are propelled by sexual frustration. Prurience, today as over a century ago, tends to disguise itself as moral concern – we need to know every excitingly shocking little detail, in order to condemn it. When a photograph was published recently of a uniformed American cop having sex with a woman on the hood of her car, it was published around the world – it was funny and played to some well-worn erotic fantasies. The story accompanying the picture, though, was the scandal, the abuse of power, the controversy. The randy cop was quickly fired.

It is worth remembering who is doing the judging on these occasions. Unlike the Victorians, today's wet-lipped moralists are not eminent politicians or churchmen. They are journalists.

Such is the hypocrisy when it comes to public misbehaviour that it is now taken for granted. When MPs were being pilloried for taking liberties with their allowances, the moral outrage was orchestrated by those who belonged to a profession in which, over the previous decades, the large-scale fiddling of expenses was a matter of competitive pride. Nor, for all their shrill moralising when a celebrity is caught in the wrong bed, are those who work in the media famous for their high standards of sobriety or sexual fidelity.

Here, perhaps, is the truth behind the new need to judge. Commentators scold, and readers allow themselves to be whipped into a state of excited disapproval, for reasons of guilt about their own less-than-perfect lives.

As Gwyneth Paltrow says, life can indeed be complicated. Taking the high moral ground is only worthwhile when something truly bad has been done. It is fine to be interested and excited by scandal, but why do we have to condemn quite so much? Since when have we all become so sanctimonious?

Few lives would bear the closest scrutiny day by day. Indeed, in a world more full of temptation and dodgy role-models than ever before, a bit of complication along the way may simply be a sign that you are living it to the full.

Tuesday 14 June 2011

Am I A Product Of The Institutions I Attended?

Amitabha Bagchi

I have been thinking for a while about how the institutions we affiliate ourselves to—or maybe our parents "admit" us to, or social pressures force us into—as students affect us, form us, shape us, turn our lives decisively down one of the many roads available to us. This question—Is what I am a product of the institutions I attended?—falls in the family of questions engendered by the basic question: What makes me who I am? This question, often asked before the perhaps more fundamental question—Who am I?—is not so easily answered. After all, our lives are produced by a complex interplay of factors, some determined in advance—race, class, gender, geography, personality, biology—and some random and contingent. The lens of science fails in the face of this complexity.

But the novelist, unlike the scientist, has a different relationship to questions. His job is not to answer them. His job is to put them into play. The unanswerable question is one of the basic tools of the storyteller's trade. Let me give you an example: Should Ram have made Sita take an agni pariksha because of what the washerman said? This question, so simple to state, is a vortex that begins spinning slowly, but then it widens and becomes stronger and stronger. As we argue and debate, it sucks in ship after ship of the fleet of human experience. What portion of a man's life is subject to his duty? How far does the power of love extend? What constitutes fidelity in a marriage? What is the nature of trust? Keep answering these questions, and like the asura Raktabija, who had a boon that every time a drop of his blood fell to the ground a new Raktabija would be born, a new set of questions emerges with each answer. The novelist's job, then, is to set questions into play, ornament them and lead them through the lives of people, and watch as they draw those lives into their fold.

And so as a novelist, I find myself asking this question—Am I a product of the institutions I attended?—in an attempt to open out a field of questions, in an attempt to add to the form of human knowledge that is full of errors and poetry, that form of human knowledge that is most intimate and personal.

Having used the P word—personal—let me start by saying that in the years since I left school I never thought that I would get an opportunity to thank NCERT for the impact it has had on my life. I could probably find a number of things to say in thanks, but let me just focus on one. In all my English textbooks since class nine I always found at least one story or play by a writer called William Saroyan. His stories of a young Armenian boy's life somewhere in the central part of California made a deep impression on me. In the years since, I have derived many things from those few stories I read. I learned that there is a deep sadness that lies right at the heart of the immigrant experience—something that the now fashionable generation of immigrant writers has never fully captured. I learned that a gentle kind of realism is the best way to describe the lives of people trying to live a dignified life in the face of hardship. I learned—and this is the one realization on which my brief writing career so far has rested, and, I suspect, whatever I write in future will also rest—that the strength of weak people is the stuff of literature. But it was only when I moved to California in 2002 that I learned that Saroyan is all but forgotten in his home country. That's when I really thanked the people who decided to put him into an NCERT textbook for almost every year since class nine.

Class nine was also my first year at a prominent school in South Delhi. Those of us who live in Delhi think of it as flat but every here and there we do come across small hills and this school is located on one such hill. So it happens that when I think back to this school and my days there I often find myself thinking of walking up an incline towards the large metal gates, manned by a chowkidar. I had been to other schools before that one, whose topography was as flat as the rest of the city's, but somehow when I think of school, I think of walking up a gentle slope, I think of a mass of grey boxy buildings sitting on a hill. Perhaps the fact that it is harder to walk up a hill than it is to walk on flat ground has something to do with it. When you reached those gates, there was an invisible membrane you passed through, like a scene from Star Trek where you stepped through a portal and you reached another dimension. Those gates were a valve, easily entered but hard to exit through. Those gates separated the world within the school from the world outside. Inside those gates we were safe from things we did not even know existed outside them. Within them lay a world of classrooms and corridors, playing field and Principal's office, labs and the library. And in each of these spaces there was a protocol, an acceptable way of carrying yourself, and an unacceptable way.

So school then is the place in which we learn what decorum is, and that each space has its own notion of decorum. But we learn this in what is to my mind the wrong way. We learn that decorum is linked to policing. That we should not be walking down a school corridor without an excuse during class time because a teacher may accost us. We learn that we should not talk too loudly in an unattended classroom, because someone may come in and drag us off to the Principal's office. And this structure of learning engenders another learning. We find those distant corners of the football field where cigarettes may be smoked. We figure out which shadows under which staircase are best suited for stealing kisses with our new love. We share stories of rules broken without consequence, we aspire to create narratives of ourselves as clever lawbreakers. We begin to value duplicity and deceit. Perhaps this process could redeem itself if it helped us lose our fear of authority. I have always believed that fear of authority causes psychic damage that diminishes human society, and that the social control we get in return does not justify what we lose. But the problem is that plotting and scheming to undermine authority because it is a subcultural imperative—as it becomes in these situations—does not rob us of our fear of authority. We remain fearful. And we become sly.

School was not only a spatial category, it was also a temporal one. School was the world of 7:40 am to 1:30 pm. It was a division of the first part of the day into neatly ordered chunks of time, never shorter than 20 minutes, never longer than 45. I have sometimes wondered about the daily routines, and their fixed nature. At first, rather unfairly, I used to think that social control was best enforced by controlling a person's time. Marx, in his own take on this matter, wrote about the centrality of the working day to the capitalist project. Not as theoretically developed as Marx's but I too had—and still have—a rebellious schoolboy's approach to the regimentation of time. But then I also began to think of it in another way. Is unplanned time as threatening as unmapped space? School, the place where space was made safe for us, was also a place where our time was organized for us: the day was chopped into a sequence of intervals, each interval to be used in a particular way.

I was one of those people who stayed on the straight and narrow, but in my school bus there were two older boys who revelled in informing students like me of their escapades. These escapades involved getting off the school bus just like the rest of us, but walking off in the other direction, through the government houses that neighboured our school, onwards to a South Indian restaurant on Rao Tula Ram Marg. They had their breakfast there, it took about half an hour, and then walked leisurely past Moti Bagh to the Sarojini Nagar railway station, reaching there around a quarter to nine. Then they boarded the Ring Railway that took about two hours to take them around the city and bring them back to where they began. Getting off the train they would head towards the now demolished Chanakya cinema, reaching in good time for the eleven o'clock show. That would last till around one pm, a convenient time to take a bus back to school, getting there just before the school bus left for home. It took me a while to realize that although these not-so-orderly schoolboys had rejected the school's way of organizing the morning hours, they had not rejected the notion that the morning hours needed to be organized.

Those two boys fell neatly into one category of the taxonomy we informally maintained in my academically oriented school. They were what were called bad students. After that category came good students and then brilliant students. There were other classifications too: some students were there to improve the school's results, some to fill its coffers and some to ensure that Delhi's political class looked upon our school favourably. But the various categories that we had in my school in Delhi—it was one of what we still call the "good" schools of Delhi—were to prove wholly inadequate when I graduated and found myself at college in IIT.

When I entered IIT Delhi in the early 90s, I happened to be assigned the same hostel that my cousin, who had entered IIT in the middle of the eighties, had lived in. When given a choice between attending class and spending his time in the hostel's music room, I was told by some of my seniors who had known him, he preferred the latter. In this music room, he told me when I asked him, used to live a large collection of cassettes on which generation after generation of hostel residents had painstakingly recorded, from whatever source available, a fund of music that comprehensively represented the popular musical production of the American sixties and seventies. Rock musicians who were long forgotten in the US lived in recordings that were revered in our hostel at IIT. That music room formed the person he was, and the person he continues to be today. But, oddly enough, of the trove of music the music room had housed there remained but three tapes when I got there. I used to go there to study sometimes, because no one else seemed to have any use for that space. Outside that room, in the rest of the hostel, instead of long discussions over the superiority of Deep Purple over Led Zeppelin, now arguments raged between those who worshipped Madhuri Dixit and those whose hearts beat for Urmila Matondkar. In the common room next door, the newly installed cable TV was firmly tuned to the one or two channels that had discovered a business model built around twenty-four hours of Chitrahaar. Something had changed between the time my cousin left and the time I entered.

Today, when Hindi soap operas command literally 20 times the viewership of English programming, we know well enough the shape of the change. But at that time this churning was just beginning—obfuscated by pointless debates on the impact of cable television on "Indian culture". Each discipline—Economics, Sociology, Anthropology, Political Science—has its own explanations for this change. I myself think of it as the era in which the spread of coaching classes made it possible for people outside the metropolitan centres to succeed at the IIT entrance exam. At IIT we complain about the influence of the coaching class culture on the quality of our intake. But anecdotal evidence makes it amply clear that the rise of the coaching class culture meant the end of the dominance of English speaking elites from urban centres at IIT. The end of the dominance of people like me.

If someone were to look at the grade sheets from my first year they would conclude that I didn't learn much that year, but the truth of the matter is that I learned a lot. I learned, for example, that I loved carrom board and I was really good at it. I spent hours and hours playing carrom. In the process I made friendships with other people who spent hours and hours playing carrom. One day I was partnering a boy who was one year my senior, and we were playing against two others from his year. One of them, Gaurav, from a "good" school in Chandigarh, pointed to my partner and asked: Do you know what his name is? An odd question, I thought at that time. Of course I knew what his name was, I saw him every other day at the carrom room. His given name was Sumer Lal and his surname was one that I had learned by that time was shared by other people who got into IIT on the Scheduled Caste quota. "I know his name," I said. Gaurav, who hadn't a trace of any negative sentiment in his voice, said: "I didn't find out his name till the end of my first year." Gaurav, who probably became friends with the Rohits and Amits and Viveks within days of reaching the hostel, spent almost 12 months there before he learned Sumer Lal's name.

One of the interesting things we were all made to do during ragging was to read certain texts in Hindi written by a person whose name was always Mast Ram. The technical term for this literature was uttejak sahitya. We all had to read it, especially those of us who found it objectionable. I didn't find it objectionable, but for me a different task was assigned: I was made to translate it. Me and those few others who, the assigner of the task knew, would have trouble translating it. I knew the dirty words, that was not a problem, but I still struggled with the translation, stumbling over the heavily idiomatic language, the richly textured euphemisms that seemed to come so naturally to Mast Ram. It was probably the first time it struck me that my school Hindi textbooks had done me a disservice, and that the Hindi Cell style signage that I saw around the city was a total misrepresentation of a living breathing language. In those early days in the hostel, when I was keen to offer friendship to whoever IIT had arbitrarily chosen to put along with me in the hostel, I struggled to cross a barrier of language that my education in Delhi had created for me. But the people on the other side appreciated the fact that I did struggle, at least I think they did. And even if they didn't, several years later when I picked up and read end to end my first Hindi novel—Shrilal Shukla's Raag Darbari—I had them to thank for showing me that Hindi had a colloquial richness, a richness that would serve as a magnet for a person who loves language. And that magnetic attraction could take me to places I would not have otherwise chosen to go, shown me things about the country of my birth that I would not have otherwise chosen to see.

When I was in school my mother would sometimes go shopping at one of the prominent fresh produce markets of Delhi. On occasion we would stop at a South Indian dhaba that sat at the mouth of this market. Much to my astonishment some time into my stay at IIT I found that the dhaba was owned by the family of one of my closest friends at IIT—he is now a leading computer scientist in a prominent research lab in the US. I cannot forget the day he came to me, some time in our third year, and asked: "Bagchi, tu dose banaa letaa hai?" Before I could answer this question in the affirmative or negative he told me that his father was thinking of locking out the "labour" at the dhaba. "Ek do din maalik logon ko hi kaam karna padega." I nodded my agreement at the kind of prospect that I, the son of a civil servant father and schoolteacher mother, had never contemplated in my brief life. The thought of crossing the counter that I had sat on the customer side of sent a thrill up my spine. Unfortunately, or fortunately, the labour came around by that evening and I never did get to make dosas on the large tavas the dhaba had, but for a brief moment there I teetered at the edge of it, and I had to project out of my own world into another world where shop owners and labour squabbled while dosas waited to be made.

I cannot claim that the life I live now is fundamentally different in its everyday rhythms from the lives of the other English speaking students I went to school with. I cannot claim that what I learned in the years I was thrown into close contact with people who I had only seen from a distance before transformed me, because I have no way of knowing what I would have been like if I had not had that experience. But I do know that while I treasured what my teachers taught me at IIT—and treasured it enough to have joined their ranks today—I treasure equally, if not more, what I learned in the hostel's carrom room, in the canteen, in the corridors.

It is not my contention that we all learned to get along. Please do not think that I am trying to portray IIT as some happy melting pot of India's diversity. It was not that. It was riven with casteism, communalism, classism, sexism and all the other ugly isms that our society nurtures. How could it not be? But by pretending that these things didn't matter, that exams and grades and job interviews were more important than all these things, it gave an opportunity to those who were willing to learn to get along with people who weren't like themselves. It gave a quixotic notion of an India populated by Indians a chance. Indians who were consumerist, over-ambitious, self-important technocrats perhaps, but who were, nonetheless, more Indian than anything else. And the fact is that this learning was not part of any of the curricula at IIT. But, as all of us who have been teachers for even a short while know, all we can do is give people an opportunity to learn. And if they don't learn, we can give them another opportunity, and another. Because the truth is that in a class of 100, there will only be four or five who get it the first time, only ten or fifteen who understand it in outline, and the remaining will take it in one ear and let it out of the other. I know people who still use the word "shadda" to refer to people who got into IIT through the SC/ST quotas, despite having played hard-fought games of volleyball in the same team as some of them, despite having stayed up long bleary-eyed hours preparing for exams along with them, despite having drunk too much and thrown up with them. Some people never learn. That is the teacher's frustration. But some people do learn and that is the teacher's reward. And, a priori, we teachers never know which is which.

It's a complex and random process, this interaction with young people that we teachers enter into for a living. It has many sides. Like so many other teachers I spend a lot of time thinking about my students, and, also like many other teachers, I don't spend enough time thinking about what they think of me. But when I do, I am forced to remember how I saw my teachers. Physically I saw them through a forest of dark-haired heads—I always preferred to sit near the back of the class. I saw them standing up on the raised platform at the front of the class, on which the short looked tall and the tall looked taller. I took their careful grooming for granted—not realizing that if one of them turned up looking slovenly I would probably have been as upset or offended as the school's principal. I associated a certain amount of self-possession with them. And I thought of them as older. A small anecdote here: In class nine I entered a CBSE school and took Sanskrit instead of Hindi. My mother was concerned that I wouldn't be able to cope so she went to meet my teacher. Afterwards I asked her how the meeting went and she said: "Your Sanskrit teacher is a very sweet girl." I realized that my mother was probably fifteen or twenty years older than my Sanskrit teacher, and senior in the same profession, but still the idea that my teacher could be thought of, by anyone, as a "girl" was very difficult to comprehend. So difficult that I still remember that statement, long long after, I'm guessing, my mother forgot all about it.

So there you are, you poor teacher, frozen in eternal adulthood, even on those days when you wish you could just curl into a foetal position and suck your thumb instead of having to stand up and talk for an hour to a room full of young people who are looking at you, or at least should be looking at you. Sometimes in the nitty-gritty of the syllabus, the announcements about exams and homework, the clearing of the last class's doubts, you forget about the current that emerges from your body and flows out into the class. You forget what you mean to them.

I was lucky to have some excellent teachers at IIT Delhi, and I am not just saying that because some of them are my colleagues now. Let me explain with a story why I thought well of them. In my second year I had a class in computer architecture. Before the first semester exam, being somewhat lazy I didn't memorise certain assembly language keywords and their meanings. When the exam paper came there was one big question that involved explaining what a fragment of assembly language code did. It was impossible to answer without knowing the meaning of those keywords. One of my friends from the hostel who knew I hadn't memorised the keywords looked at me and snickered. Stung by this I decided to take a risk. I raised my hand and called the professor. "I don't know what these keywords mean," I said. He looked down at the paper, thought for a moment, then went to the board and wrote out the meanings of all the keywords. Right there, on the spot, he decided that this question was not a test of memory, it was a test of understanding. Not only did I snicker back at the friend who had laughed at me, I also never forgot the lesson. I apply it in my classes even today.

I knew from around the age of 19 that I wanted to be a professor. I was 30 when I actually became one. In those 11 years, especially towards the end of that period, I often used to daydream about the time when I would stand in front of my first class. When I dreamt about it I always saw myself standing in a particular lecture room at IIT Delhi, Block VI, Room 301, where most of my lectures in the latter part of my stay at IIT had been held. I would see myself standing up on the platform of VI 301 about to say my first words to my first class, and I knew I would be feeling something. I just didn't know what it was. As it turned out, my first teaching job was at IIT Delhi and when I got the room assignment for that first semester I found out that the class I was teaching would meet in VI 301. I walked up the one floor from my office, my stomach fluttering. I turned into that familiar door, carrying the attendance sheets, the sign of my authority, in my right hand, and walked onto the podium. I put the attendance sheets down on the table and turned towards the class. I looked up at them, seventy something of them, sitting in those long desks where I had so often sat and would never again sit. I looked at their faces and suddenly I ached at the pain they would feel in their lives. They sat there looking up at me, innocent to the suffering their future would bring them, and it came running through me, unexpectedly, this thought: There is so much you all will go through in your lives. Sometimes when I feel I am forgetting what my students mean to me and what I mean to them, I remind myself of that moment when I stood in front of my first class, that hot July day when I learned something about who I was and about the life I had chosen for myself.

Tuesday 8 March 2011

Spinoza, part 1: Philosophy as a way of life

For this 17th century outsider, philosophy is like a spiritual practice, whose goal is happiness and liberation

Clare Carlisle, guardian.co.uk, Monday 7 February 2011 09.30 GMT

Spinoza memorial at the New Church in The Hague. Photograph: Dan Chung for the Guardian

Although Baruch Spinoza is one of the great thinkers of the European philosophical tradition, he was not a professional scholar – he earned his modest living as a lens grinder. So, unlike many thinkers of his time, he was unconstrained by allegiance to a church, university or royal court. He was free to be faithful to the pursuit of truth. This gives his philosophy a remarkable originality and intellectual purity – and it also led to controversy and charges of heresy. In the 19th century, and perhaps even more recently, "Spinozist" was still a term of abuse among intellectuals.

In a sense, Spinoza was always an outsider – and this independence is precisely what enabled him to see through the confusions, prejudices and superstitions that prevailed in the 17th century, and to gain a fresh and radical perspective on various philosophical and religious issues. He was born, in 1632, to Jewish Portuguese parents who had fled to Amsterdam to escape persecution, so from the very beginning he was never quite a native, never completely at home. Although Spinoza was an excellent student in the Jewish schools he attended, he came to be regarded by the leaders of his community as a dangerous influence. At the age of 24 he was excluded from the Amsterdam synagogue for his "intolerable" views and practices.

Spinoza's most famous and provocative idea is that God is not the creator of the world, but that the world is part of God. This is often identified as pantheism, the doctrine that God and the world are the same thing – which conflicts with both Jewish and Christian teachings. Pantheism can be traced back to ancient Greek thought: it was probably advocated by some pre-Socratic philosophers, as well as by the Stoics. But although Spinoza – who admired many aspects of Stoicism – is regarded as the chief source of modern pantheism, he does, in fact, want to maintain the distinction between God and the world.

His originality lies in the nature of this distinction. God and the world are not two different entities, he argues, but two different aspects of a single reality. Over the next few weeks we will examine this view in more detail and consider its implications for human life. Since Spinoza presents a radical alternative to the Cartesian philosophy that has shaped our intellectual and cultural heritage, exploring his ideas may lead us to question some of our deepest assumptions.

One of the most important and distinctive features of Spinoza's philosophy is that it is practical through and through. His ideas are never merely intellectual constructions, but lead directly to a certain way of life. This is evidenced by the fact that his greatest work, which combines metaphysics, theology, epistemology, and human psychology, is called Ethics. In this book, Spinoza argues that the way to "blessedness" or "salvation" for each person involves an expansion of the mind towards an intuitive understanding of God, of the whole of nature and its laws. In other words, philosophy for Spinoza is like a spiritual practice, whose goal is happiness and liberation.

The ethical orientation of Spinoza's thought is also reflected in his own nature and conduct. Unlike most of the great philosophers, Spinoza has a reputation for living an exemplary, almost saintly life, characterised by modesty, gentleness, integrity, intellectual courage, disregard for wealth and a lack of worldly ambition. According to Bertrand Russell, Spinoza was "the noblest and most lovable of the great philosophers". Although his ideas were despised by many of his contemporaries, he attracted a number of devoted followers who gathered regularly at his home in Amsterdam to discuss his philosophy. These friends made sure that Spinoza's Ethics was published soon after his death in 1677.

Spinoza, part 2: Miracles and God's will

Spinoza's belief that miracles were unexplained acts of nature, not proof of God, proved dangerous and controversial

Clare Carlisle, guardian.co.uk, Monday 14 February 2011 09.00 GMT

At the heart of Baruch Spinoza's philosophy is a challenge to the traditional Judeo-Christian view of the relationship between God and the world. While the Hebrew Bible and the Christian scriptures share a conception of God as the creator of the natural world and the director of human history, Spinoza argues that everything that exists is an aspect of God that expresses something of the divine nature. This idea that God is not separate from the world is expounded systematically in the Ethics, Spinoza's magnum opus. However, a more accessible introduction to Spinoza's view of the relationship between God and nature can be found in his discussion of miracles in an earlier text, the Theologico-Political Treatise. This book presents an innovative interpretation of the bible that undermines its authority as a source of truth, and questions the traditional understanding of prophecy, miracles and the divine law.

In chapter six of the Theologico-Political Treatise, Spinoza addresses the "confused ideas of the vulgar" on the subject of miracles. Ordinary people tend to regard apparently miraculous events – phenomena that seem to interrupt and conflict with the usual order of nature – as evidence of God's presence and activity. In fact, it is not just "the vulgar" who hold this view: throughout history, theologians have appealed to miracles to justify religious belief, and some continue to do so today.

For Spinoza, however, talk of miracles is evidence not of divine power, but of human ignorance. An event that appears to contravene the laws of nature is, he argues, simply a natural event whose cause is not yet understood. Underlying this view is the idea that God is not a transcendent being who can suspend nature's laws and intervene in its normal operations. On the contrary, "divine providence is identical with the course of nature". Spinoza argues that nature has a fixed and eternal order that cannot be contravened. What is usually, with a misguided anthropomorphism, called the will of God is in fact nothing other than this unchanging natural order.

From this it follows that God's presence and character is revealed not through apparently miraculous, supernatural events, but through nature itself. As Spinoza puts it: "God's nature and existence, and consequently His providence, cannot be known from miracles, but can all be much better perceived from the fixed and immutable order of nature."

Of course, this view has serious consequences for the interpretation of scripture, since both the Old and New Testaments include many descriptions of miraculous events. Spinoza does not simply dismiss these biblical narratives, but he argues that educated modern readers must distinguish between the opinions and customs of those who witnessed and recorded miracles, and what actually happened. Challenging the literal interpretation of scripture that prevailed in his times, Spinoza insists that "many things are narrated in Scripture as real, and were believed to be real, which were in fact only symbolic and imaginary".

This may seem reasonable enough to many contemporary religious believers, but Spinoza's attitude to the Bible was far ahead of its time. Today we take for granted a certain degree of cultural relativism, and most of us are ready to accept that ancient peoples understood the world differently from us, and therefore had different ideas about natural and divine causation. When it was first published in 1670, however, the Theologico-Political Treatise provoked widespread protest and condemnation. In fact, it was this reaction that made Spinoza decide to delay publication of the Ethics until after his death, to avoid more trouble.

But what are we to make of Spinoza's claim that God's will and natural law are one and the same thing? There are different ways to interpret this idea, some more conducive to religious belief than others. On the one hand, if God and nature are identical then perhaps the concept of God becomes dispensable. Why not simply abandon the idea of God altogether, and focus on improving our understanding of nature through scientific enquiry? On the other hand, Spinoza seems to be suggesting that God's role in our everyday lives is more constant, immediate and direct than for those who rely on miraculous, out-of-the-ordinary events as signs of divine activity.

And of course, the idea that the order of nature reveals the existence and essence of God leads straight to the view that nature is divine, and should be valued and even revered as such. In this way, Spinoza was an important influence on the 19th-century Romantic poets. Indeed, Spinoza's philosophy seems to bring together the Romantic and scientific worldviews, since it gives us reason both to love the natural world, and to improve our understanding of its laws.

Spinoza, part 3: What God is not

In his Ethics, Spinoza wanted to liberate readers from the dangers of ascribing human traits to God

Clare Carlisle, guardian.co.uk, Monday 21 February 2011 08.30 GMT

Spinoza's Ethics is divided into five books, and the first of these presents an idiosyncratic philosophical argument about the existence and nature of God. We'll examine this in detail next week, but first we need to look more closely at how the Ethics challenges traditional Judeo-Christian belief in God.

The view that Spinoza wants to reject can be summed up in one word: anthropomorphism. This means attributing human characteristics to something non-human – typically, to plants or animals, or to God. There are several important implications of Spinoza's denial of anthropomorphism. First, he argues that it is wrong to think of God as possessing an intellect and a will. In fact, Spinoza's God is an entirely impersonal power, and this means that he cannot respond to human beings' requests, needs and demands. Such a God neither rewards nor punishes – and this insight rids religious belief of fear and moralism.

Second, God does not act according to reasons or purposes. In refusing this teleological conception of God, Spinoza challenged a fundamental tenet of western thought. The idea that a given phenomenon can be explained and understood with reference to a goal or purpose is a cornerstone of Aristotle's philosophy, and medieval theologians found this fitted very neatly with the biblical narrative of God's creation of the world. Aristotle's teleological account of nature was, then, adapted to the Christian doctrine of a God who made the world according to a certain plan, analogous to a human craftsman who makes artefacts to fulfil certain purposes. Typically, human values and aspirations played a prominent role in these interpretations of divine activity.

Spinoza concludes book one of the Ethics by dismissing this world view as mere "prejudice" and "superstition". Human beings, he suggests, "consider all natural things as means to their own advantage", and because of this they believe in "a ruler of nature, endowed with human freedom, who had taken care of all things for them, and made all things for their use". Moreover, people ascribe to this divine ruler their own characters and mental states, conceiving God as angry or loving, merciful or vengeful. "So it has happened that each person has thought up from his own temperament different ways of worshiping God, so that God might love him above all others, and direct the whole of nature according to the needs of his blind desire and insatiable greed," writes Spinoza.

It is interesting to compare this critique of religious "superstition" with the views of the 18th-century Scottish philosopher David Hume. In his Dialogues Concerning Natural Religion, Hume challenges the popular belief in a creator God – and he also, elsewhere, undermines appeals to miracles as evidence of divine activity. Although Hume seems to echo Spinoza on these points, there is a crucial difference between the two philosophers. Hume thinks that many aspects of Christian belief are silly and incoherent, but his alternative to such "superstition" is a healthy scepticism, which recognises that religious doctrines cannot be justified by reason or by experience. His own position is rather ambiguous, but it involves a modest and pragmatic attitude to truth and seems to lead to agnosticism.

Spinoza, on the other hand, thinks that there is a true conception of God which is accessible to human intelligence. He argues that misguided religious beliefs are dangerous precisely because they obscure this truth, and thus prevent human beings from attaining genuine happiness, or "blessedness". There is, therefore, more at stake in Spinoza's critique of popular superstition than in Hume's. For Hume, religious believers are probably wrong, but the existential consequences of their foolishness might not be particularly serious. Spinoza, by contrast, wants to liberate his readers from their ignorance in order to bring them closer to salvation.

So Spinoza is not simply an atheist and a critic of religion, nor a sceptical agnostic. On the contrary, he places a certain conception of God at the heart of his philosophy, and he describes the ideal human life as one devoted to love of this God. Moreover, while Spinoza is critical of superstition, he is sympathetic to some aspects of Jewish and Christian teaching. In particular, he argues that Jesus had a singularly direct and immediate understanding of God, and that it is therefore right to see him as the embodiment of truth, and a role model for all human beings.

Spinoza, part 4: All there is, is God

Being infinite and eternal, God has no boundaries, argues Spinoza, and everything in the world must exist within this God

Clare Carlisle, guardian.co.uk, Monday 28 February 2011 10.00 GMT

So far in this series I've focused on Spinoza's critique of the religious and philosophical world view of his time. But what does he propose in place of anthropomorphic, anthropocentric belief in a transcendent creator God?

Spinoza begins his Ethics by defining some basic philosophical terms: substance, attribute, and mode. In offering these definitions, he is actually attempting a radical revision of the philosophical vocabulary used by Descartes, the leading thinker of his time, to conceptualise reality. When we understand these terms properly, argues Spinoza, we have to conclude that there exists only one substance – and that this is God.

Substance is a logical category that signifies independent existence: as Spinoza puts it, "by substance I understand what is conceived through itself". By contrast, attributes and modes are properties of a substance, and are therefore logically dependent on this substance. For example, we might regard a particular body as a substance, and this body is not conceptually dependent on anything else. But the body's properties, such as its weight and its colour and its shape, are qualities that cannot be conceived to exist in isolation: they must be the weight, colour and shape of a certain body.

Descartes's world view draws on Aristotelian metaphysics and scholastic theology in conceiving individual entities as distinct substances. Human beings, for example, are finite substances, while God is a special substance which is infinite and eternal. In fact, Descartes thought that each human being was composed of two substances: a mind, which has the principal attribute of thought; and a body, which has the principal attribute of extension, or physicality. This view famously leads to the difficult question of how these different substances could interact, known as the "mind-body problem".

The philosophical terminology of substance, attribute and mode makes all this sound rather technical and abstract. But Cartesian metaphysics represents a way of thinking about the world, and also about ourselves, shared by most ordinary people. We see our world as populated by discrete objects, individual things – this person over here, that person over there; this computer on the table; that tree outside, and the squirrel climbing its trunk; and so on. These individual beings have their own characteristics, or properties: size, shape, colour, etc. They might be hot or cold, quiet or noisy, still or in motion, and such qualities can be more or less changeable. This way of conceptualising reality is reflected in the structure of language: nouns say what things are, adjectives describe how they are, and verbs indicate their actions, movements and changing states. The familiar distinction between nouns, adjectives and verbs provides an approximate guide to the philosophical concepts of substance, mode and attribute.

If, as Spinoza argues, there is only one substance – God – which is infinite, then there can be nothing outside or separate from this God. Precisely because God is a limitless, boundless totality, he must be an outsideless whole, and therefore everything else that exists must be within God. Of course, these finite beings can be distinguished from God, and also from one another – just as we can distinguish between a tree and its green colour, and between the colour green and the colour blue. But we are not dealing here with the distinction between separate substances that can be conceived to exist independently from one another.

Again, this is rather abstract. As Aristotle suggested, we cannot think without images, and I find it helpful to use the image of the sea to grasp Spinoza's metaphysics. The ocean stands for God, the sole substance, and individual beings are like waves – which are modes of the sea. Each wave has its own shape that it holds for a certain time, but the wave is not separate from the sea and cannot be conceived to exist independently of it. Of course, this is only a metaphor; unlike an infinite God, an ocean has boundaries, and moreover the image of the sea represents God only in the attribute of extension. But maybe we can also imagine the mind of God – that is to say, the infinite totality of thinking – as like the sea, and the thoughts of finite beings as like waves that arise and then pass away.

Spinoza's world view brings to the fore two features of life: dependence and connectedness. Each wave is dependent on the sea, and because it is part of the sea it is connected to every other wave. The movements of one wave will influence all the rest. Likewise, each being is dependent on God, and as a part of God it is connected to every other being. As we move about and act in the world, we affect others, and we are in turn affected by everything we come into contact with.

This basic insight gives Spinoza's philosophy its religious and ethical character. In traditional religion, dependence and connectedness are often expressed using the metaphor of the family: there is a holy father, and in some cases a holy mother; and members of the community describe themselves as brothers and sisters. This vocabulary is shared by traditions as culturally diverse as Christianity, Buddhism and Islam. For Spinoza, the familial metaphor communicates a truth that can also be conveyed philosophically – through reason rather than through an image.

Spinoza, part 5: On human nature

We are not autonomous individuals but part of a greater whole, says Spinoza, and there is no such thing as human free will

Clare Carlisle, guardian.co.uk, Monday 7 March 2011 09.00 GMT

Last week, we examined Spinoza's metaphysics, looking at how his radical reinterpretation of the philosophical terminology of substance, attribute and mode produces a new vision of reality. According to Spinoza, only God can be called a substance – that is to say, an independently existing being – and everything else is a mode of this single substance. But what does this mean for us?

One of the central questions of philosophy is: what is a human being? And this question can be posed in a more personal way: who am I? As we might by now expect, Spinoza's view of the human being challenges commonsense opinions as well as prevailing philosophical and religious ideas. We are probably inclined to think of ourselves as distinct individuals, separate from other beings. Of course, we know that we have relationships to people and objects in the world, but nevertheless we see ourselves as autonomous – a view that is reflected in the widely held belief that we have free will. This popular understanding of the human condition is reflected in Cartesian philosophy, which conceives human beings as substances. In fact, Descartes thought that human beings are composed of two distinct substances: a mind and a body.

For Spinoza, however, human beings are not substances, but finite modes. (Last week, I suggested that a mode is something like a wave on the sea, being a dependent, transient part of a far greater whole.) This mode has two aspects, or attributes: extension, or physical embodiment; and thought, or thinking. Crucially, Spinoza denies that there can be any causal or logical relationships across these attributes. Instead, he argues that each attribute constitutes a causal and logical order that fully expresses reality in a certain way. So a human body is a physical organism which expresses the essence of that particular being under the attribute of extension. And a human mind is an intellectual whole that expresses this same essence under the attribute of thinking.

But this is not to suggest that the mind and the body are separate entities – for this would be to fall back into the Cartesian view that they are substances. On the contrary, says Spinoza, mind and body are two aspects of a single reality, like two sides of a coin. "The mind and the body are one and the same individual, which is conceived now under the attribute of thought, now under the attribute of extension," he writes in book two of the Ethics. And for this reason, there is an exact correspondence between them: "The order and connection of ideas is the same as the order and connection of things." In fact, each human mind involves awareness of a human body.

This way of thinking has some important consequences. One of the most obvious is that it undermines dualistic and reductionist accounts of the human being. Descartes's mind-body dualism involves the claim that we are, in essence, thinking beings – that the intellectual should be privileged above the physical, reason above the body. Conversely, modern science often regards the human being as primarily a physical entity, and attempts to reduce mental activity to physical processes. In Spinoza's view, however, it is incoherent to attempt to explain the mental in terms of the physical, or vice versa, because thinking and extension are distinct explanatory orders. They offer two alternative ways of describing and understanding our world, and ourselves, which are equally complete and equally legitimate.

Another important consequence of Spinoza's account of the human being is his denial of free will. If we are modes rather than substances, then we cannot be self-determining. The human body is part of a network of physical causality, and the human mind is part of a network of logical relations. In other words, both our bodily movements and our thinking are constrained by certain laws. Just as we cannot defeat the law of gravity, so we cannot think that 2 + 2 = 5, or that a triangle has four sides.

Spinoza's criticism of the popular belief in free will is rather similar to his analysis of belief in miracles in the Theologico-Political Treatise, which we looked at a few weeks ago. There, we may recall, he argued that people regard events as miraculous and supernatural when they are ignorant of their natural causes. Likewise, human actions are attributed to free will when their causes are unknown: "That human freedom which all men boast of possessing … consists solely in this, that men are conscious of their desire and unaware of the causes by which they are determined." For Spinoza, belief in free will is just as much a sign of ignorance and superstition as belief in miracles worked by divine intervention.