'People will forgive you for being wrong, but they will never forgive you for being right - especially if events prove you right while proving them wrong.' Thomas Sowell
Friday, 1 January 2021
What we have learnt about the limits of science
Thiago Carvalho in The FT
Some years ago, on New Year’s Day, my wife and I noticed that our son, not yet two months old, was struggling to breathe — a laboured, wheezing effort was all he could manage — and we decided to face the holiday emergency room crush. After assessing his blood oxygen levels, the pediatrician said: “Pack a bag, you will be here all week. He will get worse. Then he will get better.”
Our son had contracted something called respiratory syncytial virus, and it was replicating in his lungs. In a scenario similar to Covid-19, most healthy adults infected with RSV will experience a mild cold, or no symptoms at all. However, some unfortunate infants who contract RSV may suffer a devastating pulmonary infection. For those kids, there are no drugs available that can reliably stop, or even slow down, RSV’s relentless replication in the lungs.
Instead, according to Mustafa Khokha, a pediatric critical care professor at Yale University, doctors first give oxygen and then if the child does not improve, there follows a series of progressively more aggressive procedures. “That’s all supportive therapy for the body to recover, as opposed to treatment against the virus itself,” says Khokha. Outstanding supportive care was what our son received, and the week unfolded exactly as his pediatrician predicted. (It was still the worst week of my life.)
For all the progress we have seen in 2020, a patient brought to the emergency room with severe Covid-19 will essentially receive the same kind of supportive care our son did — treatment to help the body endure a viral assault, but not effectively targeting the virus itself. The main difference will be the uncertain outcome — there will be no comforting, near-certain “he will get better” from the attending physician.
Contrast that story with a different one. On a Tuesday morning in early December, in the English city of Coventry, Margaret Keenan, just a few days shy of her 91st birthday, became the first person in the world to receive the BioNTech/Pfizer Covid-19 vaccine outside of a clinical trial. The pace of progress was astonishing. It was less than a year since, in the closing moments of 2019, Chinese health authorities alerted the World Health Organization to an outbreak of a pneumonia of unknown cause in Hubei province.
The Covid-19 pandemic has given us an accelerated tutorial on the promise and the limits of science. With vaccines, testing and epidemiological surveillance, we know where we are going, and we have a good idea how to get there. These are essentially challenges of technological development, reliant now on adequate resources, personnel and the tweaking of regulatory frameworks. For other scientific challenges, though, there may be no gas pedal to step on — these include the prickly problems of arresting acute viral infection, or understanding how the virus and the host interact to produce disease. Science, as Nobel Prize-winning immunologist Peter Medawar put it, is the art of the soluble.
In March, when, incredibly, the first human vaccine trials for Covid-19 were kicking off in Seattle, the WHO launched an ambitious clinical trial to try to identify existing pharmaceuticals that could show some benefit against Sars-CoV-2. In October, the WHO declared that all four arms of its Solidarity trial had essentially failed. The search for effective antiviral drugs has not lacked resources or researchers, but in contrast to the vaccine victories, it has yet to produce a single clear success story. The concentrated efforts of many of the world’s most capable scientists, relying on ample public support and private investment, are sometimes not enough to crack a problem.
Perhaps nothing exemplifies this more clearly than what followed Richard Nixon’s signing of the National Cancer Act on December 23 1971. The act was cautiously phrased, but that January’s State of the Union address had already declared an all-out war on cancer: “The time has come in America when the same kind of concentrated effort that split the atom and took man to the moon should be turned toward conquering this dread disease.” The war on cancer would funnel almost $1.6bn to cancer labs over the next three years, and fuel expectations that a cure for the disease would be found before the end of the decade. Curing cancer remains, of course, an elusive target. In 2016, then vice-president Joe Biden presented the report of his own Cancer Moonshot task force.
The success of the Apollo program planted the Moonshot analogy in the science policy lexicon. Some grand challenges in biology could properly be considered “moonshots”. The Human Genome Project was one example. Like the race to the Moon, it had a clear finish line: to produce a draft with the precise sequence of genetic letters in the 23 pairs of human chromosomes. This was, like the propulsion problems solved by Nasa en route to the Moon, a matter of developing and perfecting technology — technology that later would allow us to have a genetic portrait of the cause of Covid-19 in under two weeks.
The cancer context was rather different. In the countdown to the war on cancer, Sol Spiegelman, the director of Columbia University’s Institute of Cancer Research, quipped that “an all-out effort at this time [to find a cure for cancer] would be like trying to land a man on the Moon without knowing Newton’s laws of gravity.” And so it proved.
We now know quite a lot about how the body resists viral infections, certainly much more than we knew about the biology of cancer in 1971. Over 60 years ago, at London’s National Institute for Medical Research, Alick Isaacs and Jean Lindenmann exposed fragments of chicken egg membranes to heat-inactivated influenza A virus. In a matter of hours, the liquid from these cultures acquired the capacity to interfere with the growth of not only influenza A, but other, unrelated viruses, as well. Isaacs and Lindenmann named their factor interferon. Interferons are fleet-footed messengers produced and released by cells almost immediately upon viral infection. These molecules warn other host cells to ready themselves to resist a viral onslaught.
Viruses rely on hijacking the normal cellular machinery to make more copies of themselves, and interferons interfere with almost all stages of the process: from making it more difficult for the virus to enter cells, to slowing down the cellular protein factories required to make the viral capsid, to reducing the export of newly made viral particles. Interferons are now part of our pharmaceutical armoury for diseases as diverse as multiple sclerosis and cancer, as well as hepatitis C and other chronic viral infections.
Multiple interferon-based strategies have been tried in the pandemic, from intravenous administration to nebulising the molecule so that the patient inhales an antiviral mist directly into the lungs. The results have been inconclusive. “A lot of it has to do with the timing,” says Yale immunologist Akiko Iwasaki, “the only stage that recombinant interferon might be effective is pre-exposure or early post-exposure, and it’s really hard to catch it for this virus, because everyone is pretty much asymptomatic at that time.”
This year’s scramble for effective antiviral drugs led to a revival of other failed approaches. In 2016, a team of researchers from the United States Army Medical Research Institute of Infectious Diseases in Frederick, Maryland, and the biotech company Gilead Sciences reported that the molecule GS-5734 protected rhesus monkeys against the Ebola virus. GS-5734, now more familiarly known as remdesivir, unfortunately failed in clinical trials. This was a bona fide antiviral, backed up by demonstrations that the drug efficiently blocked an enzyme used by viruses to copy their genome. Ebola was already remdesivir’s third dead end: Gilead had previously tested GS-5734 against hepatitis C and RSV, and the results were disappointing.
In late April, National Institute of Allergy and Infectious Diseases director Anthony Fauci, a member of the White House coronavirus task force, proclaimed that the US remdesivir trials had established “a new standard of care” for Covid-19 patients. As has happened repeatedly during the Covid-19 crisis, the data backing this claim had not been made public, nor had it, at the time, been peer-reviewed.
Fauci explained that the drug had no significant effect on mortality, but claimed that remdesivir reduced hospitalisation times by about 30 per cent. It was the first piece of good news in a spring marked by global lockdowns. Unfortunately, results from a large-scale trial run by the WHO released in the autumn failed to support even the limited claims of the US study (Gilead has challenged the study’s design), and the WHO currently advises against giving remdesivir to Covid-19 patients.
For those who do not naturally control Sars-CoV-2 infection, or who have not been vaccinated, the failure to repurpose or create effective antiviral agents leaves supportive care. We are only beginning to understand the interplay of this new virus and human hosts. It is also a protean affliction, as sex, age, and pre-existing conditions all affect outcomes. The single clearest way to reduce mortality remains dexamethasone for patients requiring supplemental oxygen, initially reported in the UK Recovery trial. It is not a direct attack on the virus, but a way to ameliorate the effects of infection and the immune response to it on the human body. Dexamethasone is, in a very real sense, supportive care.
So what have we learned about the limits of science? First, we were reminded that spectacular successes are built on a foundation of decades of basic research. Even the novel, first-in-class vaccines are at the end of a long road. It was slow-going to get to warp speed. We learned that there are no shortcuts to deciphering how a new virus makes us sick (and kills us) and that there is no ignoring the importance of human diversity for cracking this code. Diabetes, obesity, hypertension — we are still finding our way through a comorbidity labyrinth. Most of all, we have learned an old lesson again: science is the art of the soluble. No amount of resources and personnel, no Manhattan Project, can ensure that science will solve a problem in the absence of a well-stocked toolbox and a solid, painstakingly built theoretical framework.
South Korea recorded its first Covid-19 case on January 20. Eleven days later, Spain confirmed its first infection: a German tourist in the Canary Islands. Spain and South Korea have similar populations of about 50m people. As of publication of this piece, South Korea has had 879 deaths, while Spain reports over 50,000. The west missed its moment. Efficient testing, tracing and containment of Covid-19 was a soluble technological and organisational problem. Here too, we can hear echoes of the war on cancer. The biggest single reduction in cancer mortality did not come from a miracle drug. It was the drop in lung cancer deaths, due to what we could call the war on tobacco. Perhaps Dr Spiegelman might concede that even if we don’t have a law of gravity, we do have a first law of medicine: always start with prevention.
Covid-19 has pushed science to its limits and, in some cases, sharply outlined its borders. This century’s first pandemic finds humanity, with its transport hubs and supply chains, more vulnerable than ever to a new pathogen. But virology, immunology, critical care medicine and epidemiology, to name a few, have progressed immeasurably since 1918. In a public health emergency, however, the best science must be used to inform the best policies. In the seasonal spirit of charity, let us say that that has not always been the case in our pandemic year.
Thursday, 19 September 2019
For the sake of life on Earth, we must put a limit on wealth
It is not quite true that behind every great fortune lies a great crime. Musicians and novelists, for example, can become extremely rich by giving other people pleasure. But it does appear to be universally true that in front of every great fortune lies a great crime. Immense wealth translates automatically into immense environmental impacts, regardless of the intentions of those who possess it. The very wealthy, almost as a matter of definition, are committing ecocide.
A few weeks ago, I received a letter from a worker at a British private airport. “I see things that really shouldn’t be happening in 2019,” he wrote. Every day he sees Global 7000 jets, Gulfstream G650s and even Boeing 737s take off from the airport carrying a single passenger, mostly flying to Russia and the US. The private Boeing 737s, built to take 174 passengers, are filled at the airport with around 25,000 litres of fuel. That’s as much fossil energy as a small African town might use in a year.
Where are these single passengers going? Perhaps to visit one of their superhomes, constructed and run at vast environmental cost, or to take a trip on their superyacht, which might burn 500 litres of diesel an hour just ticking over, and which is built and furnished with rare materials extracted at the expense of beautiful places. The most expensive yacht in the world, costing £3bn, is a preposterous slab of floating bling called History Supreme. It carries 100 tonnes of gold and platinum wrapped around almost every surface, even the anchor.
Perhaps we shouldn’t be surprised to learn that when Google convened a meeting of the rich and famous at the Verdura resort in Sicily in July to discuss climate breakdown, its delegates arrived in 114 private jets and a fleet of megayachts, and drove around the island in supercars. Even when they mean well, the ultrarich cannot help trashing the living world.
A series of research papers shows that income is by far the most important determinant of environmental impact. It doesn’t matter how green you think you are; if you have surplus money, you spend it. The only form of consumption that’s clearly and positively correlated with good environmental intentions is diet: people who see themselves as green tend to eat less meat and more organic vegetables. But attitudes have little bearing on the amount of transport fuel, home energy and other materials you consume. Money conquers all.
The disastrous effects of spending power are compounded by the psychological impacts of being wealthy. Plenty of studies show that the richer you are, the less you are able to connect with other people. Wealth suppresses empathy. One paper reveals that drivers in expensive cars are less likely to stop for people using pedestrian crossings than drivers in cheap cars. Another reveals that rich people are less able than poorer people to feel compassion towards children with cancer. Though they are disproportionately responsible for our environmental crises, the rich will be hurt least and last by planetary disaster, while the poor are hurt first and worst. The richer people are, the research suggests, the less such knowledge is likely to trouble them.
Another issue is that wealth limits the perspectives of even the best-intentioned people. This week, Bill Gates argued in an interview with the Financial Times that divesting from fossil fuels (ditching their stocks) is a waste of time. It would be better, he claimed, to pour money into disruptive new technologies with lower emissions. Of course we need new technologies. But he has missed the crucial point: in seeking to prevent climate breakdown, what counts is not what you do but what you stop doing. It doesn’t matter how many solar panels you install if you don’t simultaneously shut down coal and gas burners. Unless existing fossil fuel plants are retired before the end of their lives, and all exploration and development of new fossil fuel reserves is cancelled, there is little chance of preventing more than 1.5C of global heating.
But this requires structural change, which involves political intervention as well as technological innovation: anathema to Silicon Valley billionaires. It demands an acknowledgement that money is not a magic wand that makes all the bad stuff go away.
Tomorrow, I’ll be joining the global climate strike, in which adults will stand with the young people whose call to action has resonated around the world. As a freelancer, I’ve been wondering who I’m striking against. Myself? Yes: one aspect of myself, at least. Perhaps the most radical thing we can now do is to limit our material aspirations. The assumption on which governments and economists operate is that everyone strives to maximise their wealth. If we succeed in this task, we inevitably demolish our life support systems. Were the poor to live like the rich, and the rich to live like the oligarchs, we would destroy everything. The continued pursuit of wealth in a world that has enough already (albeit very poorly distributed) is a formula for mass destitution.
A meaningful strike in defence of the living world is, in part, a strike against the desire to raise our incomes and accumulate wealth: a desire shaped, more than we are probably aware, by dominant social and economic narratives. I see myself as striking in support of a radical and disturbing concept: enough. Individually and collectively, it is time to decide what “enough” looks like, and how to know when we’ve achieved it.
There’s a name for this approach, coined by the Belgian philosopher Ingrid Robeyns: limitarianism. Robeyns argues that there should be an upper limit to the amount of income and wealth a person can amass. Just as we recognise a poverty line, below which no one should fall, we should recognise a riches line, above which no one should rise. This call for a levelling down is perhaps the most blasphemous idea in contemporary discourse.
But her arguments are sound. Surplus money allows some people to exercise inordinate power over others: in the workplace; in politics; and above all in the capture, use and destruction of the planet’s natural wealth. If everyone is to flourish, we cannot afford the rich. Nor can we afford our own aspirations, which the culture of wealth maximisation encourages.
The grim truth is that the rich are able to live as they do only because others are poor: there is neither the physical nor ecological space for everyone to pursue private luxury. Instead we should strive for private sufficiency, public luxury. Life on Earth depends on moderation.
Tuesday, 25 September 2018
Why western philosophy can only teach us so much
One of the great unexplained wonders of human history is that written philosophy first flowered entirely separately in different parts of the globe at more or less the same time. The origins of Indian, Chinese and ancient Greek philosophy, as well as Buddhism, can all be traced back to a period of roughly 300 years, beginning in the 8th century BC.
These early philosophies have shaped the different ways people worship, live and think about the big questions that concern us all. Most people do not consciously articulate the philosophical assumptions they have absorbed and are often not even aware that they have any, but assumptions about the nature of self, ethics, sources of knowledge and the goals of life are deeply embedded in our cultures and frame our thinking without our being aware of them.
Yet, for all the varied and rich philosophical traditions across the world, the western philosophy I have studied for more than 30 years – based entirely on canonical western texts – is presented as the universal philosophy, the ultimate inquiry into human understanding. Comparative philosophy – the study of two or more philosophical traditions – is left almost entirely to people working in anthropology or cultural studies. This abdication of interest assumes that comparative philosophy might help us to understand the intellectual cultures of India, China or the Muslim world, but not the human condition.
This has become something of an embarrassment for me. Until a few years ago, I knew virtually nothing about anything other than western philosophy, a tradition that stretches from the ancient Greeks to the great universities of Europe and the US. Yet, if you look at my PhD certificate or the names of the university departments where I studied, there is only one, unqualified, word: philosophy. Recently and belatedly, I have been exploring the great classical philosophies of the rest of the world, travelling across continents to encounter them first-hand. It has been the most rewarding intellectual journey of my life.
My philosophical journey has convinced me that we cannot understand ourselves if we do not understand others. Getting to know others requires avoiding the twin dangers of overestimating either how much we have in common or how much divides us. Our shared humanity and the perennial problems of life mean that we can always learn from and identify with the thoughts and practices of others, no matter how alien they might at first appear. At the same time, differences in ways of thinking can be both deep and subtle. If we assume too readily that we can see things from others’ points of view, we end up seeing them from merely a variation of our own.
To travel around the world’s philosophies is an opportunity to challenge the beliefs and ways of thinking we take for granted. By gaining greater knowledge of how others think, we can become less certain of the knowledge we think we have, which is always the first step to greater understanding.
Take the example of time. Around the world today, time is linear, ordered into past, present and future. Our days are organised by the progression of the clock, in the short to medium term by calendars and diaries, history by timelines stretching back over millennia. All cultures have a sense of past, present and future, but for much of human history this has been underpinned by a more fundamental sense of time as cyclical. The past is also the future, the future is also the past, the beginning also the end.
The dominance of linear time fits in with an eschatological worldview in which all of human history is building up to a final judgment. This is perhaps why, over time, it became the common-sense way of viewing time in the largely Christian west. When God created the world, he began a story with a beginning, a middle and an end. As Revelation puts it, while prophesying the end times, Jesus is this epic’s “Alpha and Omega, the beginning and the end, the first and the last”.
But there are other ways of thinking about time. Many schools of thought believe that the beginning and the end are and have always been the same because time is essentially cyclical. This is the most intuitively plausible way of thinking about eternity. When we imagine time as a line, we end up baffled: what happened before time began? How can a line go on without end? A circle allows us to visualise going backwards or forwards for ever, at no point coming up against an ultimate beginning or end.
Thinking of time cyclically especially made sense in premodern societies, where there were few innovations across generations and people lived very similar lives to those of their grandparents, their great-grandparents and going back many generations. Without change, progress was unimaginable. Meaning could therefore only be found in embracing the cycle of life and death and playing your part in it as best you could.
Perhaps this is why cyclical time appears to have been the human default. The Mayans, Incans and Hopi all viewed time in this way. Many non-western traditions contain elements of cyclical thinking about time, perhaps most evident in classical Indian philosophy. The Indian philosopher and statesman Sarvepalli Radhakrishnan wrote: “All the [orthodox] systems accept the view of the great world rhythm. Vast periods of creation, maintenance and dissolution follow each other in endless succession.” For example, a passage in the Rig Veda addressing Dyaus and Prithvi (heaven and earth) reads: “Which was the former, which of them the latter? How born? O sages, who discerns? They bear themselves all that has existence. Day and night revolve as on a wheel.”
East Asian philosophy is deeply rooted in the cycle of the seasons, part of a larger cycle of existence. This is particularly evident in Taoism, and is vividly illustrated by the surprising cheerfulness of the 4th century BC Taoist philosopher Zhuangzi when everyone thought he should have been mourning for his wife. At first, he explained, he was as miserable as anyone else. Then he thought back beyond her to the beginning of time itself: “In all the mixed-up bustle and confusion, something changed and there was qi. The qi changed and there was form. The form changed and she had life. Today there was another change and she died. It’s just like the round of four seasons: spring, summer, autumn and winter.”
In Chinese thought, wisdom and truth are timeless, and we do not need to go forward to learn, only to hold on to what we already have. As the 19th-century Scottish sinologist James Legge put it, Confucius did not think his purpose was “to announce any new truths, or to initiate any new economy. It was to prevent what had previously been known from being lost.” Mencius, similarly, criticised the princes of his day because “they do not put into practice the ways of the ancient kings”. Mencius also says, in the penultimate chapter of the eponymous collection of his conversations, close to the book’s conclusion: “The superior man seeks simply to bring back the unchanging standard, and, that being correct, the masses are roused to virtue.” The very last chapter charts the ages between the great kings and sages.
A hybrid of cyclical and linear time operates in strands of Islamic thought. “The Islamic conception of time is based essentially on the cyclic rejuvenation of human history through the appearance of various prophets,” says Seyyed Hossein Nasr, professor emeritus of Islamic studies at George Washington University. Each cycle, however, also moves humanity forward, with each revelation building on the former – the dictation of the Qur’an to Muhammad being the last, complete testimony of God – until ultimately the series of cycles ends with the appearance of the Mahdi, who rules for 40 years before the final judgment.
The distinction between linear and cyclical time is therefore not always neat. The assumption of an either/or leads many to assume that oral philosophical traditions have straightforwardly cyclical conceptions of time. The reality is more complicated. Take Indigenous Australian philosophies. There is no single Australian first people with a shared culture, but there are enough similarities across the country for some tentative generalisations to be made about ideas that are common or dominant. The late anthropologist David Maybury-Lewis suggested that time in Indigenous Australian culture is neither cyclical nor linear; instead, it resembles the space-time of modern physics. Time is intimately linked to place in what he calls the “dreamtime” of “past, present, future all present in this place”.
“One lives in a place more than in a time,” is how Stephen Muecke puts it in his book Ancient and Modern: Time, Culture and Indigenous Philosophy. More important than the distinction between linear or cyclical time is whether time is separated from or intimately connected to place. Take, for example, how we conceive of death. In the contemporary west, death is primarily seen as the expiration of the individual, with the body as the locus, and the location of that body irrelevant. In contrast, Muecke says: “Many indigenous accounts of the death of an individual are not so much about bodily death as about a return of energy to the place of emanation with which it re-identifies.”
Such a way of thinking is especially alien to the modern west, where a pursuit of objectivity systematically downplays the particular, the specifically located. In a provocative and evocative sentence, Muecke says: “Let me suggest that longsightedness is a European form of philosophical myopia and that other versions of philosophy, indigenous perhaps, have a more lived-in and intimate association with societies of people and the way they talk about themselves.”
Muecke cites the Australian academic Tony Swain’s view that the concept of linear time is a kind of fall from place. “I’ve got a hunch that modern physics separated out those dimensions and worked on them, and so we produced time as we know it through a whole lot of experimental and theoretical activities,” Muecke told me. “If you’re not conceptually and experimentally separating those dimensions, then they would tend to flow together.” His indigenous friends talk less of time or place independently, but more of located events. The key temporal question is not “When did this happen?” but “How is this related to other events?”
That word related is important. Time and space have become theoretical abstractions in modern physics, but in human culture they are concrete realities. Nothing exists purely as a point on a map or a moment in time: everything stands in relation to everything else. So to understand time and space in oral philosophical traditions, we have to see them less as abstract concepts in metaphysical theories and more as living conceptions, part and parcel of a broader way of understanding the world, one that is rooted in relatedness. Hirini Kaa, a lecturer at the University of Auckland, says that “the key underpinning of Maori thought is kinship, the connectedness between humanity, between one another, between the natural environment”. He sees this as a form of spirituality. “The ocean wasn’t just water, it wasn’t something for us to be afraid of or to utilise as a commodity, but became an ancestor deity, Tangaroa. Every living thing has a life force.”
David Mowaljarlai, who was a senior lawman of the Ngarinyin people of Western Australia, once called this principle of connectivity “pattern thinking”. Pattern thinking suffuses the natural and the social worlds, which are, after all, in this way of thinking, part of one thing. As Muecke puts it: “The concept of connectedness is, of course, the basis of all kinship systems [...] Getting married, in this case, is not just pairing off, it is, in a way, sharing each other.”
The emphasis on connectedness and place leads to a way of thinking that runs counter to the abstract universalism developed to a certain extent in all the great written traditions of philosophy. One of the “enduring [Indigenous Australian] principles”, Muecke writes, is that “a way of being will be specific to the resources and needs of a time and place and that one’s conduct will be informed by responsibility specific to that place”. This is not an “anything goes” relativism, but a recognition that rights, duties and values exist only in actual human cultures, and their exact shape and form will depend on the nature of those cultures.
This should be clear enough. But the tradition of western philosophy, in particular, has striven for a universality that glosses over differences of time and place. The word “university”, for example, even shares the same etymological root as “universal”. In such institutions, “the pursuit of truth recognises no national boundaries”, as one commentator observed. Place is so unimportant in western philosophy that, when I discovered it was the theme of the quinquennial East-West Philosophers’ Conference in 2016, I wondered if there was anything I could bring to the party at all. (I decided that the absence of place in western philosophy itself merited consideration.)
The universalist thrust has many merits. The refusal to accept any and every practice as a legitimate custom has bred a very good form of intolerance for the barbaric and unjust traditional practices of the west itself. Without this intolerance, we would still have slavery, torture, fewer rights for women and homosexuals, feudal lords and unelected parliaments. The universalist aspiration has, at its best, helped the west to transcend its own prejudices. At the same time, it has also legitimised some prejudices by confusing them with universal truths. The philosopher Kwame Anthony Appiah argues that the complaints of anti-universalists are not generally about universalism at all, but pseudo-universalism, “Eurocentric hegemony posing as universalism”. When this happens, intolerance for the indefensible becomes intolerance for anything that is different. The aspiration for the universal becomes a crude insistence on the uniform. Sensitivity is lost to the very different needs of different cultures at different times and places.
This “posing as universalism” is widespread and often implicit, with western concepts being taken as universal but Indian ones remaining Indian, Chinese remaining Chinese, and so on. To end this pretence, Jay L Garfield and Bryan W Van Norden propose that those departments of philosophy that refuse to teach anything from non-western traditions at least have the decency to call themselves departments of western philosophy.
The “pattern thinking” of Maori and Indigenous Australian philosophies could provide a corrective to the assumption that our values are the universal ones and that others are aberrations. It makes credible and comprehensible the idea that philosophy is never placeless and that thinking that is uprooted from any land soon withers and dies.
Mistrust of the universalist aspiration, however, can go too far. At the very least, there is a contradiction in saying there are no universal truths, since that is itself a universal claim about the nature of truth. The right view probably lies somewhere between the claims of naive universalists and those of defiant localists. There seems to be a sense in which even the universalist aspiration has to be rooted in something more particular. TS Eliot is supposed to have said: “Although it is only too easy for a writer to be local without being universal, I doubt whether a poet or novelist can be universal without being local, too.” To be purely universal is to inhabit an abstract universe too detached from the real world. But just as a novelist can touch on universals of the human condition through the particulars of a couple of characters and a specific story, so our different, regional philosophical traditions can shed light on more universal philosophical truths even though they approach them from their own specific angles.
We should not be afraid to ground ourselves in our own traditions, but we should not be bound by them. Gandhi put this poetically when he wrote: “I do not want my house to be walled in on all sides and my windows to be stuffed. I want the cultures of all lands to be blown about my house as freely as possible. But I refuse to be blown off my feet by any. I refuse to live in other people’s houses as an interloper, a beggar or a slave.”
In the west, the predominance of linear time is associated with the idea of progress that reached its apotheosis in the Enlightenment. Before this, argues the philosopher Anthony Kenny, “people looking for ideals had looked backwards in time, whether to the primitive church, or to classical antiquity, or to some mythical prelapsarian era. It was a key doctrine of the Enlightenment that the human race, so far from falling from some earlier eminence, was moving forward to a happier future.”
Kenny is expressing a popular view, but many see the roots of belief in progress deeper in the Christian eschatological religious worldview. “Belief in progress is a relic of the Christian view of history as a universal narrative,” claims John Gray. Secular thinkers, he says, “reject the idea of providence, but they continue to think humankind is moving towards a universal goal”, even though “the idea of progress in history is a myth created by the need for meaning”.
Whether faith in progress is an invention or an adaptation of the Enlightenment, the image of secular humanists naively believing humanity is on an irreversible, linear path of advancement seems to me a caricature of their more modest hope, based in history, that progress has occurred and that more is possible. As the historian Jonathan Israel says, Enlightenment ideas of progress “were usually tempered by a strong streak of pessimism, a sense of the dangers and challenges to which the human condition is subject”. He dismisses the idea that “Enlightenment thinkers nurtured a naive belief in man’s perfectibility” as a “complete myth conjured up by early 20th-century scholars unsympathetic to its claims”.
Nevertheless, Gray is right to point out that linear progress is a kind of default way of thinking about history in the modern west and that this risks blinding us to the ways in which gains can be lost, advances reversed. It also fosters a sense of the superiority of the present age over earlier, supposedly “less advanced” times. Finally, it occludes the extent to which history doesn’t repeat itself but does rhyme.
The different ways in which philosophical traditions have conceived time turn out to be far from mere metaphysical curiosities. They shape the way we think about both our temporal place in history and our relation to the physical places in which we live. Time provides one of the easiest and clearest examples of how borrowing another way of thinking can bring a fresh perspective to our world. Sometimes, simply by changing the frame, the whole picture can look very different.
Sunday, 20 May 2018
To Save Western Capitalism - Look East
Once upon a time, not so long ago, there was a place where peace and prosperity reigned. Let’s call this place the West. These lands had once been ravaged by bloody wars, but their rulers had since solved the puzzle of perpetual progress and discovered a kind of political and economic elixir of life. Big Problems were relegated to either Somewhere Else (the East) or Another Time (History). The Westerners dutifully sent emissaries far and wide to spread the word that the secret of eternal bliss had been found – and were, themselves, to live happily ever after.
So ran, until very recently, the story of how the West was won.
The formula that had been discovered was simple: the recipe for a bright, shiny new brand of global capitalism based on liberal democracy and something called neoclassical economics. But it was different from previous eras – cleansed of Dickensian grime. The period after the two world wars was in many ways a Golden Age: the moment of Bretton Woods (which established the international monetary and financial order) and the Beveridge report (the blueprint for the welfare state), feminism and free love.
It was post-colonial, post-racial, post-gendered. It felt like you could have it all, material abundance and the moral revolutions; a world infinitely vulnerable to invention – but all without picking sides, all based on institutional equality of access. That’s how clever the scheme was – truly a brave new world. Fascism and class, slavery and genocide – no one doubted that, in the main, they had been left behind (or at least that we could all agree on their evils); that the wheels of history had permanently been set in motion to propel us towards a better future. The end of history, Francis Fukuyama called it – the zenith of human civilisation.
While liberal democracy was the part of the programme that got slapped on to the brochure, it was a streamlined paradigm of neoclassical economics that provided the brains behind the enterprise. Neoclassical economics, scarred by war-era ideological acrimony, scrubbed the subject of all the messy stuff: politics, values – all the fluff. To do so it used a new secret weapon: quantitative precision unprecedented in the social sciences.
It didn’t rest on whimsical things like enlightened leadership or invested citizenship or compassionate communities. No, siree. It was pure science: a reliable, universally applicable maximising equation for society (largely stripped of any contextual or, until recently, even cognitive considerations). Its particular magic trick was to be able to do good without requiring anyone to be good.
And, it was limitless in its capacity to turn boundless individual rationality into endless material wellbeing, to cull out of infinite resources (on a global scale) indefinite global growth. It presumed to definitively replace faltering human touch with the infallible “invisible hand” and, so, discourses of exploitation with those of merit.
When I started as a graduate student in the early 2000s, this model was at the peak of its powers: organised into an intellectual and policy assembly line that more or less ran the world. At the heart of this enterprise, in the unipolar post-Cold War order, was what was known as the Chicago school of law and economics. The Chicago school boiled the message of neoclassical economics down to a simpler formula still: the American Dream available for export – just add private property and enforceable contracts. Anointed with a record number of Nobel prizes, its message went straight to the heart of Washington DC, and from there – via its apostles, the International Monetary Fund and the World Bank – radiated out to the rest of the globe.
It was like the social equivalent of the Genome project. Sure the model required the odd tweak, the ironing out of the occasional glitch but, for the most part, the code had been cracked. So, like the ladies who lunch, scholarly attention in the West turned increasingly to good works and the fates of “the other” – spatially and temporally.
One strain led to a thriving industry in development: these were the glory days of tough love and loan conditionalities. The message was clear: if you want Western-style growth, get with the programme. The polite term for it was structural adjustment and good governance: a strict regime of purging what Max Weber had called mysticism and magic, and swapping it for muscular modernisation. Titles like Daron Acemoglu and James Robinson’s Why Nations Fail: The Origins of Power, Prosperity and Poverty and Hernando de Soto’s The Mystery of Capital: Why Capitalism Triumphs in the West and Fails Everywhere Else jostled for space on bookstore shelves and best-seller lists.
Another led to esoteric islands of scholarship devoted to atonement for past sins. On the UK side of the Atlantic, post-colonial scholarship gained a foothold, even if a somewhat limp one. In America, slavery has been, for a while now, the issue à la mode. A group of Harvard scholars has been taking a keen interest in the “history of capitalism”.
Playing intellectual archaeologists, they’ve excavated the road that led to today’s age of plenty – leaving in its tracks a blood-drenched path of genocide, conquest, and slavery. The interest that this has garnered, for instance Sven Beckert’s Empire of Cotton, is heartening, but it has been limited to history (or worse still, “area studies”) departments rather than economics, and focused on the past not the present.
Indeed, from the perspective of Western scholarship, the epistemic approach to the amalgam of these instances has been singularly inspired by Indiana Jones – dismissed either as curious empirical aberrations or distanced by the buffer of history. By no means the stuff of mainstream economic theory.
Some of us weakly cleared our throats and tried to politely intercede that there were, all around us, petri dishes of living, breathing data on not just development – but capitalism itself. Maybe, just maybe – could it be? – that if the template failed to work in a large majority of cases around the globe that there may be a slight design error.
My longtime co-author, Nobel Prize winner in economics and outspoken critic Joseph Stiglitz, chronicled in his 2002 classic Globalization and Its Discontents any number of cases of the leeching-like brutality of structural adjustment. The best-selling author and my colleague at Cambridge, Ha-Joon Chang, in Kicking Away the Ladder, pointed out that it was a possibility that the West was misremembering the trajectory of its own ascent to power – that perhaps it was just a smidgen more fraught than it remembered, that maybe the State had had a somewhat more active part to play before it retired from the stage.
But a bright red line separated the “us” from the “them”, enforcing a system of strict epistemic apartheid. Indeed, as economics retreated further and further into its silo of smugness, economics departments largely stopped teaching economic history or sociology, and development economics clung on by operating firmly within the discipline-approved methodology.
So far, so good – a bit like the last night of merriment on the Titanic. Then the iceberg hit.
Suddenly, the narratives that had comfortably been about the there and then became, for the Western world, a matter of the here and now.
In the last 10 years – what economic historian Robert Skidelsky recently referred to as the “lost decade” for the advanced industrial West – problems that were considered the exclusive preserve of development theory (declining growth, rampant inequality, failing institutions, a fractured political consensus, corruption, mass protests and poverty) started to be experienced on home turf.
The Great Recession, starting in 2008, should really have been the first hint: foreclosures and evictions, bankruptcies and bailouts, crashed stock markets. Then came the eurozone crisis, starting in 2009 (first Greece; then Ireland; then Portugal; then Spain; then Cyprus; oh, and then Greece again). But after an initial scare, it was largely business as usual – written off as an inevitable blip in the boom-bust logic of capitalist cycles. It was 2016, the year the world went mad, that made the writing on the wall impossible to ignore – starting with the shock Brexit vote, and then the Trump election. Not everyone understands what a CDO (collateralised debt obligation) is, but the vulgarity of a leader of the free world who governs by tweet and “grabs pussy” is hard to miss.
So how did it happen, this unexpected epilogue to the end of history?
I hate to say I told you so, but some of us had seen this coming – the twist in the tale, foreshadowed by an eerie background score lurking behind the clinking of champagne glasses. Even at the height of the glory days. In the summer before the fall of Lehman Brothers, a group of us “heterodox economists” had gathered at a summer school in the North of England. We felt like the audience at a horror movie – knowing that the gory climax was moments away while the victim remained blissfully oblivious.
The plot wasn’t just predictable, it was in the script for anyone to see. You just had to look closely at the fine print.
In particular, you needed to have read your Karl Polanyi, the economic sociologist, who predicted this crisis more than 70 years ago. As far back as 1944, The Great Transformation diagnosed the central perversion of the capitalist system, the inversion that makes the person less important than the thing – the economy driving society, rather than the other way around.
Polanyi’s point was simple: if you turned all the things that people hold sacred into grist to the mill of a large impersonal economic machinery (he called this disembedding), there would be a backlash – that a world where monopoly money reigns supreme and human players are reduced to chessmen at its mercy is doomed. The sociologist Fred Block compares this to the stretching of a giant elastic band – either it reverts to a more rooted position, or it snaps.
It is this tail-wagging-the-dog quality that is driving the current crisis of capitalism. It’s a matter of existential alienation. This problem of artificial abstraction runs through the majority of upheavals of our age – from the financial crisis to Facebook. So cold were the nights in this era of enforced neutrality that the torrid affair between liberal democracy and neoclassical economics resulted in the most surprising love child – populism.
The simple fact is that after decades of promises not delivered on, the system had written just one too many cheques that couldn’t be cashed. And people had had just about enough.
The old fault lines of global capitalism, the East versus West dynamics of the World Trade Organisation’s Doha Round, turned out to be red herrings. The axis that counts is the system versus the little people. Indeed, the anatomies of annihilation look remarkably similar across the globe – whether it’s the loss of character of a Vanishing New York or Disappearing London, or threatened communities of farmers in India and fishermen in Greece.
Trump voters in the US, Brexiteers in Britain and Modi supporters in India seek identity – any identity, even a made-up call to arms to “return” to mythical past greatness – in the face of the hollowing-out of meaning over the past 70 years. The rise of populism is, in many ways, the death cry of populations on the verge of extinction – yearning for something to believe in when their gods have died young. It’s a problem of the 1 per cent – poised to control two-thirds of the world economy by 2030 – versus the 99 per cent. But far more pernicious is the Frankenstein’s monster that is the idea of an economic system that is an end in itself.
Not to be too much of a conspiracy theorist about it, but the current system doesn’t work because it wasn’t meant to – it was rigged from the start. Wealth was never actually going to “trickle down”. Thomas Piketty did the maths.
Suddenly, the alarmist calls of the developmentalists objecting to the systemic skews in the process of globalisation don’t seem quite so paranoid.
But this is more than “poco” (what the cool kids call postcolonialism) schadenfreude. My point is a serious one; although I would scarcely have dared articulate it before now. Could Kipling have been wrong, and might the East have something to offer the mighty West? Could the experiences of exotic lands point the way back to the future? Could it be, could it just, that it may even be a source of epistemic wisdom?
Behind the scaffolding of Xi Jinping’s China or Narendra Modi’s India, sites of capitalism under construction, we are offered a glimpse into the system’s true nature. It is not God-given, but the product of highly political choices – just as Jane Jacobs protesting to save Washington Square Park, or Beatrix Potter devoting the bulk of her royalty earnings to conserving the Lake District, were choices. But these cases also show that trust and community are important: witness the incredible resilience of India’s jugaad economy, or the critical role of guanxi in creating the structures of China, for the past decade the world’s fastest-growing nation. A little mysticism and magic may be just the thing.
The narrative that we need is less that of Frankenstein’s man-loses-control-of-monster, and more that of Pinocchio’s toy-becomes-real-boy-by-acquiring-conscience; less technology, and more teleology. The real limit may be our imaginations. Perhaps the challenge is to do for scholarship what Black Panther has done for Hollywood. You never know. Might be a blockbuster.
Thursday, 1 August 2013
Audi Drivers
Girish Menon
1/08/2013
Your speed limit is @ seventy
To protect each road user
But I'll drive my Audi @ one sixty
Coz I am not a loser
The judge banned me for three years
And fined me five one five
Fine, I said, will you take cash
It's time for my next drive
None of your laws will stop me
I've got more cash than you think
I can buy your cops and judges
With my tax haven market riches
Your income tax is @ 45 percent
To protect every lazy Briton
But I will pay @ zero percent
Coz I am not a cretin
The taxman may audit my books
To get cash for the treasury
His mates help me cook the books
And I won't share my luxury
None of your laws will stop me
I've got more cash than you think
I can buy your rulers and judges
With my tax haven market riches