
Monday, 9 March 2015

Invasion of the algorithms: The modern-day equations which can rule our lives

Rhodri Marsden in The Independent

“This is a miracle of modern technology,” says dating-agency proprietor Sid Bliss, played by Sid James, in the 1970 comedy film Carry On Loving. “All we do is feed the information into the computer here, and after a few minutes the lady suitable comes out there,” he continues, pointing to a slot.

There’s the predictable joke about the slot being too small, but Sid’s client is mightily impressed by this nascent display of computer power. He has faith in the process, and is willing to surrender meekly to whatever choices the machine makes. The payoff is that the computer is merely a facade; on the other side of the wall, Sid’s wife (played by Hattie Jacques) is processing the information using her own, very human methods, and bunging a vaguely suitable match back through the slot. The clients, however, don’t know this. They think it’s brilliant.

Technology has come a long way since Sid James delivered filthy laughs into a camera lens, but our capacity to be impressed by computer processes we know next to nothing about remains enormous. All that’s changed is the language: it’s now the word “algorithm” that makes us raise our eyebrows appreciatively and go “oooh”. It’s a guaranteed way of grabbing our attention: generate some findings, attribute them to an algorithm, and watch the media and the public lap them up.

“Apothic Red Wine creates a unique algorithm to reveal the ‘dark side’ of the nation’s personas,” read a typical press release that plopped into hundreds of email inboxes recently; Yahoo, the Daily Mirror, Daily Mail and others pounced upon it and uncritically passed on the findings. The level of scientific rigour behind Apothic’s study was anyone’s guess – but that didn’t matter because the study was powered by an algorithm, so it must be true.

The next time we’re about to be superficially impressed by the unveiling of a “special algorithm”, it’s worth remembering that our lives have been ruled by them since the year dot and we generate plenty ourselves every day. Named after the eminent Persian mathematician Muhammad ibn Musa Al-Khwarizmi, algorithms are merely sets of instructions for how to achieve something; your gran’s chocolate-cake recipe could fall just as much into the algorithm category as any computer program. And while they’re meant to define sequences of operations very precisely and solve problems very efficiently, they come with no guarantees. There are brilliant algorithms and there are appalling algorithms; they could easily be riddled with flawed reasoning and churn out results that raise as many questions as they claim to answer. 
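
To make the recipe analogy concrete, here is a minimal sketch in Python (the quantities and steps are invented for illustration): a recipe written as a function is already an algorithm in this sense, a precise sequence of operations turning an input into an output.

```python
# A minimal sketch: a recipe written as an algorithm.
# The ingredients and steps here are invented purely for illustration.

def chocolate_cake(servings):
    """Scale a (fictional) chocolate-cake recipe and return the steps in order."""
    base = {"flour_g": 200, "cocoa_g": 50, "sugar_g": 150, "eggs": 2}
    scaled = {item: qty * servings / 4 for item, qty in base.items()}  # base recipe serves 4
    steps = [
        f"Mix {scaled['flour_g']:.0f}g flour with {scaled['cocoa_g']:.0f}g cocoa.",
        f"Beat in {scaled['eggs']:.0f} eggs and {scaled['sugar_g']:.0f}g sugar.",
        "Bake at 180C for 35 minutes.",
    ]
    return steps

for step in chocolate_cake(servings=8):
    print(step)
```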

This matters, of course, because we live in an information age. Data is terrifyingly plentiful; it’s piling up at an alarming rate and we have to outsource the handling of that data to algorithms if we want to avoid a descent into chaos. We trust sat-nav applications to pull together information such as length of road, time of day, weight of traffic, speed limits and road blocks to generate an estimate of our arrival time; but their accuracy is only as good as the algorithm. Our romantic lives are, hilariously, often dictated by online-dating algorithms that claim to generate a “percentage match” with other human beings.
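
The sat-nav example boils down to a weighted sum over road segments. A minimal sketch, with segment lengths, speed limits and traffic factors invented for illustration (real routing engines combine live data with shortest-path algorithms, but the arrival-time arithmetic is essentially this):

```python
# A toy ETA estimate in the spirit of a sat-nav algorithm.
# Segment lengths, speed limits and traffic factors are invented for illustration.

segments = [
    # (length_km, speed_limit_kmh, traffic_factor: 1.0 = free-flowing, 2.0 = heavy)
    (12.0, 70, 1.1),
    (3.5, 30, 1.8),    # congested urban stretch
    (25.0, 110, 1.0),  # motorway
]

def eta_minutes(segments):
    """Sum the travel time over each segment, slowed by its traffic factor."""
    total_hours = sum(length / (limit / factor) for length, limit, factor in segments)
    return total_hours * 60

print(f"Estimated arrival in {eta_minutes(segments):.0f} minutes")
```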

Our online purchases of everything from vacuum cleaners to music downloads are affected by algorithms. If you’re reading this piece online, an algorithm will have probably brought it to your attention. We’re marching into a future where our surroundings are increasingly shaped, in real time, by mathematics. Mentally, we’re having to adjust to this; we know that it’s not a human being at Netflix or Apple suggesting films for us to watch, but perhaps the algorithm does a better job. Google’s adverts can seem jarring – trying to flog us products that we have just searched for – precisely because algorithms tailor them to our interests far better than a human ever could.

With data being generated by everything from England’s one-day cricket team to your central heating system, the truth is that algorithms beat us hands down at extrapolating meaning.

“This has been shown to be the case on many occasions,” says data scientist Duncan Ross, “and that’s for obvious reasons. The sad reality is that humans are a basket of biases which we build up over our lives. Some of them are sensible; many of them aren’t. But by using data and learning from it, we can reduce those biases.” 


In the financial markets, where poor human judgement can lead to eye-watering losses, the vast majority of transactions are now outsourced to algorithms which can react within microseconds to the actions of, well, other algorithms. They’ve had a place in the markets ever since Thomas Peterffy made a killing in the 1980s by using them to detect mispriced stock options (a story told in fascinating detail in the book Automate This by Christopher Steiner), but today data science drives trade. Millions of dollars’ worth of stocks change hands, multiple times, before one trader can shout “sell!”.
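
Peterffy’s early edge came from spotting options whose quoted prices drifted away from what a pricing model said they should be. The sketch below is only illustrative: it checks invented quotes against put-call parity, a much cruder test than anything he actually ran.

```python
import math

# A rough sketch of spotting a mispriced option pair via put-call parity:
# C - P should equal S - K * exp(-r * T) for European options on the same strike and expiry.
# The quotes below are invented for illustration; this is not Peterffy's actual model.

def parity_gap(call, put, spot, strike, rate, years):
    """Positive gap: the call looks expensive relative to the put (and vice versa)."""
    return (call - put) - (spot - strike * math.exp(-rate * years))

quotes = [
    # (call_price, put_price, spot, strike, risk_free_rate, years_to_expiry)
    (6.10, 2.00, 100.0, 96.0, 0.05, 0.25),
    (6.10, 4.90, 100.0, 100.0, 0.05, 0.25),
]

for q in quotes:
    gap = parity_gap(*q)
    flag = "possible mispricing" if abs(gap) > 0.50 else "fair"
    print(f"strike {q[3]:6.1f}: parity gap {gap:+.2f}  ->  {flag}")
```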

We humans have to accept that algorithms can make us look comparatively useless (except when they cause phenomena like Wall Street’s “flash crash” of 2010, when the index lost 1,000 points in a day, before recovering). But that doesn’t necessarily feel like a good place to be.

The increasing amount of donkey work undertaken by algorithms represents a significant shift in responsibility, and by association a loss of control. Data is power, and when you start to consider all the ways in which our lives are affected by the processing of said data, it can feel like a dehumanising step. Edward Snowden revealed the existence of an algorithm to determine whether or not you were a US citizen; if you weren’t, you could be monitored without a warrant. But even aside from the plentiful security and privacy concerns, other stuff is slipping underneath our radar, such as the homogenisation of culture; for many years, companies working in the film and music industry have used algorithms to process scripts and compositions to determine whether they’re worth investing in. Creative ventures that don’t fit the pattern are less likely to come to fruition. The algorithms forged by data scientists, by speeding up processes and saving money, have a powerful, direct impact on all of us.

Little wonder that the Government is taking a slightly belated interest. Last year Vince Cable, the Business Secretary, announced £42m of funding for a new body, the Alan Turing Institute, which is intended to position the UK as a world leader in algorithm research.

The five universities selected to lead that institute (Cambridge, Edinburgh, Oxford, Warwick and UCL) were announced last month; they will lead the efforts to tame and use what’s often referred to as Big Data.

“So many disciplines are becoming dependent upon it, including engineering, science, commerce and medicine,” says Professor Philip Nelson, chief executive of the Engineering and Physical Sciences Research Council, the body co-ordinating the institute’s output. “It was felt very important that we put together a national capability to help in the analysis and interpretation of that data. The idea is to pull together the very best scientists to do the fundamental work in maths and data science to underpin all these activities.”

But is this an attempt to reassert control over a sector that’s wielding an increasing amount of power?

“Not at all,” says Nelson. “More than anything else, it’s about making computers more beneficial to society by using the data better.”

On the one hand we see algorithms used to do pointless work (“the most-depressing day of the year” simply does not exist); on the other we’re told to fear subjugation to our computer overlords. But it’s easy to forget the power of the algorithm to do good.

Duncan Ross is one of the founder directors of DataKind UK, a charity that helps other charities make the best use of the data at their disposal.

“We’re in this world of constrained resources,” he says, “and we can ill afford for charities to be doing things that are ineffective.”

From weekend “datathons” to longer-term, six-month projects, volunteers help charities to solve a range of problems.

“For example,” says Ross, “we did some recent work with Citizens Advice, who have a lot of data coming in from their bureaux.



“They’re keen to know what the next big issue is and how they can spot it quickly; during the payday-loans scandal they felt that they were pretty late to the game, because even though they were giving advice, they were slow to take corporate action. So we worked with them on algorithms that analyse the long-form text reports written by local teams in order to spot new issues more quickly.

“We’re not going to solve all the charities’ problems; they’re the experts working on the ground. What we can do is take their data and help them arrive at better decisions.”
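
What the Citizens Advice work might look like in miniature is sketched below: count how often an issue term appears in advisers’ notes week by week and flag a sudden jump. The notes and the search term are invented, and DataKind’s real models over long-form reports will have been far more sophisticated.

```python
# A toy issue-spotting sketch: count mentions of a term in weekly advice notes
# and flag a week where mentions jump well above the running average.
# The notes and the term are invented for illustration.

weekly_notes = [
    ["client struggling with rent arrears", "benefits delay causing hardship"],
    ["rent arrears again", "energy bill dispute"],
    ["payday loan rollover fees", "payday loan debt spiralling", "rent arrears"],
    ["payday loan charges", "payday loan arrears", "payday loan threats", "eviction notice"],
]

def weekly_mentions(notes, term):
    return [sum(term in note for note in week) for week in notes]

def flag_spikes(counts, factor=2.0):
    """Flag a week whose count is at least `factor` times the mean of earlier weeks."""
    flags = []
    for i, c in enumerate(counts):
        baseline = sum(counts[:i]) / i if i else 0
        flags.append(i > 0 and baseline > 0 and c >= factor * baseline)
    return flags

counts = weekly_mentions(weekly_notes, "payday loan")
print(counts, flag_spikes(counts))
```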

Data sets can be investigated in unexpected ways to yield powerful results. For example, Google has developed a way of aggregating users’ search data to spot flu outbreaks.

“That flu algorithm [Google Flu Trends] picked up on people searching for flu remedies or symptoms,” says Ross, “and by itself it seemed to be performing about as well as the US Centers for Disease Control. If you take the output of that algorithm and use it as part of the decision-making process for doctors, then we really get somewhere.”
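
As a back-of-the-envelope illustration of the flu-trends idea, the sketch below fits a straight line relating invented weekly search volumes to invented official flu rates, then uses the latest search count to estimate the current rate. Google’s actual system was built on vastly more data and a far more elaborate model.

```python
# A toy version of the flu-trends idea: fit a straight line relating weekly
# search volume for flu terms to the official flu rate, then use searches
# (which arrive faster) to estimate the current rate. All numbers invented.

searches = [120, 180, 260, 400, 520, 610]   # weekly flu-related search volume
cdc_rate = [1.1, 1.6, 2.3, 3.4, 4.5, 5.2]   # official cases per 100,000 (lagged)

n = len(searches)
mean_x = sum(searches) / n
mean_y = sum(cdc_rate) / n
slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(searches, cdc_rate))
         / sum((x - mean_x) ** 2 for x in searches))
intercept = mean_y - slope * mean_x

this_week_searches = 700
estimate = intercept + slope * this_week_searches
print(f"Estimated flu rate this week: {estimate:.1f} per 100,000")
```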

But Google, of course, is a private company with its own profit motives, and this provokes another algorithmic fear; that Big Data is being processed by algorithms that might not be working in our best interests. We have no way of knowing; we feel far removed from these processes that affect us day to day.

Ross argues that it’s perfectly normal for us to have little grasp of the work done by scientists.

“How much understanding is there of what they actually do at Cern?” he asks. “The answer is almost none. Sometimes, with things like the Higgs boson, you can turn it into a story where, with a huge amount of anecdote, you can just about make it exciting and interesting – but it’s still a challenge.

“As far as data is concerned, the cutting-edge stuff is a long way from where many organisations are; what they need to be doing is much, much more basic. But there are areas where there are clearly huge opportunities.”

That’s an understatement. As the so-called “internet of things” expands, billions of sensors will surround us, each of them a data point, each of them with algorithmic potential. The future requires us to place enormous trust in data scientists; just like the hopeful romantic in Carry On Loving, we’ll be keeping our fingers crossed that the results emerging from the slot are the ones we’re after.

We’ll also be keeping our fingers crossed that the processes going on out of sight, behind that wall, aren’t overseen by the algorithmic equivalent of Sid James and Hattie Jacques.


Here’s hoping.

Friday, 6 March 2015

Chapter 11 comes to India

Pritish Nandy in the Times of India
One of the best things in last week’s Union Budget, which has gone largely unnoticed, is the finance minister’s pledge to bring in a comprehensive Bankruptcy Code. Bankruptcy law reform is now a priority for improving the ease of doing business, said Arun Jaitley, thus telling us for the first time that the government has finally come to accept the fact that shit happens. And it’s time that we, as a nation, realized this and found ways and means to deal with it.
Till now, every failure was chased by a lynch mob hardwired to believe that failure is deliberate and must be punished. Not only has this attitude destroyed countless lives and careers, it has also fostered a business climate where people stay away from taking the kind of crucial risks businessmen ought to take and, worse, has brought risk taking and failure (which are at the heart of all serious entrepreneurship) into unnecessary disrepute. We, as a people, actually believe that every business that fails is a deliberate deep-rooted conspiracy, a plan to loot others. In this perverse worldview, we ignore the simple fact that most bankruptcies owe their origins to Black Swan events that have become increasingly commonplace. Not the greed and wickedness of businessmen.
History shows that the best businessmen go through many failures. They may not always talk about them but these failures teach them the lessons that eventually make them successful. The very failures we despise are the bedrock on which shining empires are built. Bankruptcy, or Chapter 11 as the Americans love to call it, is hardly a dirty word in today’s business scenario, where everything changes all the time, abruptly and without any notice. In fact, failure is a badge of honour that many successful entrepreneurs openly wear. For who will ever risk investing in a business where the promoter claims he has never known failure?
Hiding failure, in fact, is the worst thing one can do. It causes all round damage. Acknowledging it and then finding ways and means to mitigate it and move on is the way all civilised societies deal with failure. As Nassim Nicholas Taleb, my favourite economist, recently said, failure is the only real asset of a nation and knowing how to fail is its biggest talent. Taleb added that failure may be the best mantra for India’s success. For a nation that has not experienced failure and learnt from it is hugely handicapped in today’s world where everything changes at short notice, including the nature of risks. A nation that turns away from risk is not a nation yet ready for success.
He cites the examples of France and Japan. Their economies are doing poorly, Taleb argues, because the failure rate is so low. In the US, on the other hand, the highest fail rate is in California which also has the most inspiring success stories. Walt Disney is an example. He was fired by his editor because he “lacked imagination and had no good ideas”. He went bankrupt several times before he built Disneyland. In fact, even the proposal for Disneyland was rejected by the city of Anaheim on the ground that it would attract only riffraff. Henry Ford went bankrupt before he could steer Ford Motors to its huge success. So did HJ Heinz, founder of Heinz. And William Durant who created General Motors. And Milton Hershey, founder of Hershey Chocolates. The day Trump Towers was being announced with huge fanfare in Mumbai I read that Trump Taj Mahal in Atlantic City had gone belly up.
Business is not about not taking risks. It’s about riding the right risks to build institutions and create wealth. Sports, media and entertainment have had their share of bankrupts: Larry King, Francis Ford Coppola, rapper MC Hammer, Stan Lee of Marvel Comics, blogger Perez Hilton, Mick Fleetwood and Bob Guccione, founder of Penthouse, have all faced bankruptcy. Even famous US Presidents have: Abraham Lincoln, Ulysses S Grant, William McKinley, Thomas Jefferson. In recent years, Steve Jobs went almost bankrupt. So did Apple. Today it’s the world’s richest, strongest brand, seemingly indestructible.
Restaurants improve every time they fail. So do cars, trains, planes. They become safer because we always over-compensate after a disaster. Every shock strengthens us, readies us better for the future. Businesses too are like that and I am glad the Finance Minister has realised it and removed the stigma.
Have I ever gone bankrupt? No, but I have teetered on the edge often enough and never been embarrassed to admit it.
Funnily, as Taleb points out, the only business that never learns from failure is banks. When a bank crashes today, the probability of a bank crashing tomorrow actually increases. Banking is clearly not a business that learns from its mistakes. History proves that too.

Thursday, 5 March 2015

Why you're almost certainly more like your father than your mother

The Independent 

Genes from your father are more dominant than those inherited from your mother, new research has shown.

All mammals are likely to use the majority of genetic material passed down from males, even if offspring look and act more like the mother, according to the study on lab mice by University of North Carolina’s School of Medicine.

This means that even though we inherit an equal amount of DNA from each parent, the paternal line is mostly found to govern how a person develops into an adult – especially in regards to their health.

The findings could give scientists more insight into how diseases and conditions are caused by the expression of thousands of genes, among which several hundred imprinted genes – rather than the 95 initially thought – could favour the father.
Fernando Pardo-Manuel de Villena, a professor and author of the study, said: “This is an exceptional new research finding that opens the door to an entirely new area of exploration in human genetics.”

The study, on the offspring of three genetically diverse strains of “Collaborative Cross” mice, is expected to shed light on how mutations show up in complex diseases such as diabetes, heart disease, schizophrenia and obesity, according to Science Daily.

James Crowley, assistant professor of genetics, selected strains of mice that descended from a subspecies that evolved on different continents and each type was used as both father and mother.

When the nine baby mice reached adulthood, the researchers measured gene expression in four different kinds of tissue, including RNA sequencing in the brain.
“This expression level is dependent on the mother or the father,” Pardo-Manuel de Villena said.

“We now know that mammals express more genetic variance from the father. So imagine that a certain kind of mutation is bad. If inherited from the mother, the gene wouldn't be expressed as much as it would be if it were inherited from the father.

“So, the same bad mutation would have different consequences in disease if it were inherited from the mother or from the father.”


The study is published in the journal Nature Genetics.

Wednesday, 4 March 2015

The East India Company: The original corporate raiders

William Dalrymple in The Guardian

One of the very first Indian words to enter the English language was the Hindustani slang for plunder: “loot”. According to the Oxford English Dictionary, this word was rarely heard outside the plains of north India until the late 18th century, when it suddenly became a common term across Britain. To understand how and why it took root and flourished in so distant a landscape, one need only visit Powis Castle.

The last hereditary Welsh prince, Owain Gruffydd ap Gwenwynwyn, built Powis castle as a craggy fort in the 13th century; the estate was his reward for abandoning Wales to the rule of the English monarchy. But its most spectacular treasures date from a much later period of English conquest and appropriation: Powis is simply awash with loot from India, room after room of imperial plunder, extracted by the East India Company in the 18th century.

There are more Mughal artefacts stacked in this private house in the Welsh countryside than are on display at any one place in India – even the National Museum in Delhi. The riches include hookahs of burnished gold inlaid with empurpled ebony; superbly inscribed spinels and jewelled daggers; gleaming rubies the colour of pigeon’s blood and scatterings of lizard-green emeralds. There are talwars set with yellow topaz, ornaments of jade and ivory; silken hangings, statues of Hindu gods and coats of elephant armour.

Such is the dazzle of these treasures that, as a visitor last summer, I nearly missed the huge framed canvas that explains how they came to be here. The picture hangs in the shadows at the top of a dark, oak-panelled staircase. It is not a masterpiece, but it does repay close study. An effete Indian prince, wearing cloth of gold, sits high on his throne under a silken canopy. On his left stand scimitar and spear carrying officers from his own army; to his right, a group of powdered and periwigged Georgian gentlemen. The prince is eagerly thrusting a scroll into the hands of a statesmanlike, slightly overweight Englishman in a red frock coat.

The painting shows a scene from August 1765, when the young Mughal emperor Shah Alam, exiled from Delhi and defeated by East India Company troops, was forced into what we would now call an act of involuntary privatisation. The scroll is an order to dismiss his own Mughal revenue officials in Bengal, Bihar and Orissa, and replace them with a set of English traders appointed by Robert Clive – the new governor of Bengal – and the directors of the EIC, who the document describes as “the high and mighty, the noblest of exalted nobles, the chief of illustrious warriors, our faithful servants and sincere well-wishers, worthy of our royal favours, the English Company”. The collecting of Mughal taxes was henceforth subcontracted to a powerful multinational corporation – whose revenue-collecting operations were protected by its own private army.

It was at this moment that the East India Company (EIC) ceased to be a conventional corporation, trading in silks and spices, and became something much more unusual. Within a few years, 250 company clerks backed by the military force of 20,000 locally recruited Indian soldiers had become the effective rulers of Bengal. An international corporation was transforming itself into an aggressive colonial power.

Using its rapidly growing security force – its army had grown to 260,000 men by 1803 – it swiftly subdued and seized an entire subcontinent. Astonishingly, this took less than half a century. The first serious territorial conquests began in Bengal in 1756; 47 years later, the company’s reach extended as far north as the Mughal capital of Delhi, and almost all of India south of that city was by then effectively ruled from a boardroom in the City of London. “What honour is left to us?” asked a Mughal official named Narayan Singh, shortly after 1765, “when we have to take orders from a handful of traders who have not yet learned to wash their bottoms?”

We still talk about the British conquering India, but that phrase disguises a more sinister reality. It was not the British government that seized India at the end of the 18th century, but a dangerously unregulated private company headquartered in one small office, five windows wide, in London, and managed in India by an unstable sociopath – Clive.

In many ways the EIC was a model of corporate efficiency: 100 years into its history, it had only 35 permanent employees in its head office. Nevertheless, that skeleton staff executed a corporate coup unparalleled in history: the military conquest, subjugation and plunder of vast tracts of southern Asia. It almost certainly remains the supreme act of corporate violence in world history. For all the power wielded today by the world’s largest corporations – whether ExxonMobil, Walmart or Google – they are tame beasts compared with the ravaging territorial appetites of the militarised East India Company. Yet if history shows anything, it is that in the intimate dance between the power of the state and that of the corporation, while the latter can be regulated, it will use all the resources in its power to resist.

When it suited, the EIC made much of its legal separation from the government. It argued forcefully, and successfully, that the document signed by Shah Alam – known as the Diwani – was the legal property of the company, not the Crown, even though the government had spent a massive sum on naval and military operations protecting the EIC’s Indian acquisitions. But the MPs who voted to uphold this legal distinction were not exactly neutral: nearly a quarter of them held company stock, which would have plummeted in value had the Crown taken over. For the same reason, the need to protect the company from foreign competition became a major aim of British foreign policy.


Robert Clive was an unstable sociopath who led the fearsome East India Company to its conquest of the subcontinent. Photograph: Hulton Archive/Getty Images

The transaction depicted in the painting was to have catastrophic consequences. As with all such corporations, then as now, the EIC was answerable only to its shareholders. With no stake in the just governance of the region, or its long-term wellbeing, the company’s rule quickly turned into the straightforward pillage of Bengal, and the rapid transfer westwards of its wealth.

Before long the province, already devastated by war, was struck down by the famine of 1769, then further ruined by high taxation. Company tax collectors were guilty of what today would be described as human rights violations. A senior official of the old Mughal regime in Bengal wrote in his diaries: “Indians were tortured to disclose their treasure; cities, towns and villages ransacked; jaghires and provinces purloined: these were the ‘delights’ and ‘religions’ of the directors and their servants.”

Bengal’s wealth rapidly drained into Britain, while its prosperous weavers and artisans were coerced “like so many slaves” by their new masters, and its markets flooded with British products. A proportion of the loot of Bengal went directly into Clive’s pocket. He returned to Britain with a personal fortune – then valued at £234,000 – that made him the richest self-made man in Europe. After the Battle of Plassey in 1757, a victory that owed more to treachery, forged contracts, bankers and bribes than military prowess, he transferred to the EIC treasury no less than £2.5m seized from the defeated rulers of Bengal – in today’s currency, around £23m for Clive and £250m for the company.

No great sophistication was required. The entire contents of the Bengal treasury were simply loaded into 100 boats and punted down the Ganges from the Nawab of Bengal’s palace to Fort William, the company’s Calcutta headquarters. A portion of the proceeds was later spent rebuilding Powis.

The painting at Powis that shows the granting of the Diwani is suitably deceptive: the painter, Benjamin West, had never been to India. Even at the time, a reviewer noted that the mosque in the background bore a suspiciously strong resemblance “to our venerable dome of St Paul”. In reality, there had been no grand public ceremony. The transfer took place privately, inside Clive’s tent, which had just been erected on the parade ground of the newly seized Mughal fort at Allahabad. As for Shah Alam’s silken throne, it was in fact Clive’s armchair, which for the occasion had been hoisted on to his dining room table and covered with a chintz bedspread.

Later, the British dignified the document by calling it the Treaty of Allahabad, though Clive had dictated the terms and a terrified Shah Alam had simply waved them through. As the contemporary Mughal historian Sayyid Ghulam Husain Khan put it: “A business of such magnitude, as left neither pretence nor subterfuge, and which at any other time would have required the sending of wise ambassadors and able negotiators, as well as much parley and conference with the East India Company and the King of England, and much negotiation and contention with the ministers, was done and finished in less time than would usually have been taken up for the sale of a jack-ass, or a beast of burden, or a head of cattle.”

By the time the original painting was shown at the Royal Academy in 1795, however, no Englishman who had witnessed the scene was alive to point this out. Clive, hounded by envious parliamentary colleagues and widely reviled for corruption, committed suicide in 1774 by slitting his own throat with a paperknife some months before the canvas was completed. He was buried in secret, on a frosty November night, in an unmarked vault in the Shropshire village of Moreton Say. Many years ago, workmen digging up the parquet floor came across Clive’s bones, and after some discussion it was decided to quietly put them to rest again where they lay. Here they remain, marked today by a small, discreet wall plaque inscribed: “PRIMUS IN INDIS.”

Today, as the company’s most articulate recent critic, Nick Robins, has pointed out, the site of the company’s headquarters in Leadenhall Street lies underneath Richard Rogers’s glass and metal Lloyd’s building. Unlike Clive’s burial place, no blue plaque marks the site of what Macaulay called “the greatest corporation in the world”, and certainly the only one to equal the Mughals by seizing political power across wide swaths of south Asia. But anyone seeking a monument to the company’s legacy need only look around. No contemporary corporation could duplicate its brutality, but many have attempted to match its success at bending state power to their own ends.

The people of Allahabad have also chosen to forget this episode in their history. The red sandstone Mughal fort where the treaty was extracted from Shah Alam – a much larger fort than those visited by tourists in Lahore, Agra or Delhi – is still a closed-off military zone and, when I visited it late last year, neither the guards at the gate nor their officers knew anything of the events that had taken place there; none of the sentries had even heard of the company whose cannons still dot the parade ground where Clive’s tent was erected.

Instead, all their conversation was focused firmly on the future, and the reception India’s prime minister, Narendra Modi, had just received on his trip to America. One of the guards proudly showed me the headlines in the local edition of the Times of India, announcing that Allahabad had been among the subjects discussed in the White House by Modi and President Obama. The sentries were optimistic. India was finally coming back into its own, they said, “after 800 years of slavery”. The Mughals, the EIC and the Raj had all receded into memory and Allahabad was now going to be part of India’s resurrection. “Soon we will be a great country,” said one of the sentries, “and our Allahabad also will be a great city.”
***

At the height of the Victorian period there was a strong sense of embarrassment about the shady mercantile way the British had founded the Raj. The Victorians thought the real stuff of history was the politics of the nation state. This, not the economics of corrupt corporations, they believed was the fundamental unit of analysis and the major driver of change in human affairs. Moreover, they liked to think of the empire as a mission civilisatrice: a benign national transfer of knowledge, railways and the arts of civilisation from west to east, and there was a calculated and deliberate amnesia about the corporate looting that opened British rule in India.

A second picture, this one commissioned to hang in the House of Commons, shows how the official memory of this process was spun and subtly reworked. It hangs now in St Stephen’s Hall, the echoing reception area of parliament. I came across it by chance late this summer, while waiting there to see an MP.

The painting was part of a series of murals entitled the Building of Britain. It features what the hanging committee at the time regarded as the highlights and turning points of British history: King Alfred defeating the Danes in 877, the parliamentary union of England and Scotland in 1707, and so on. The image in this series which deals with India does not, however, show the handing over of the Diwani but an earlier scene, where again a Mughal prince is sitting on a raised dais, under a canopy. Again, we are in a court setting, with bowing attendants on all sides and trumpets blowing, and again an Englishman is standing in front of the Mughal. But this time the balance of power is very different.

Sir Thomas Roe, the ambassador sent by James I to the Mughal court, is shown appearing before the Emperor Jahangir in 1614 – at a time when the Mughal empire was still at its richest and most powerful. Jahangir inherited from his father Akbar one of the two wealthiest polities in the world, rivalled only by Ming China. His lands stretched through most of India, all of what is now Pakistan and Bangladesh, and most of Afghanistan. He ruled over five times the population commanded by the Ottomans – roughly 100 million people. His capitals were the megacities of their day.

In Milton’s Paradise Lost, the great Mughal cities of Jahangir’s India are shown to Adam as future marvels of divine design. This was no understatement: Agra, with a population approaching 700,000, dwarfed all of the cities of Europe, while Lahore was larger than London, Paris, Lisbon, Madrid and Rome combined. This was a time when India accounted for around a quarter of all global manufacturing. In contrast, Britain then contributed less than 2% to global GDP, and the East India Company was so small that it was still operating from the home of its governor, Sir Thomas Smythe, with a permanent staff of only six. It did, however, already possess 30 tall ships and own its own dockyard at Deptford on the Thames.


An East India Company grandee. Photograph: Getty Images

Jahangir’s father Akbar had flirted with a project to civilise India’s European immigrants, whom he described as “an assemblage of savages”, but later dropped the plan as unworkable. Jahangir, who had a taste for exotica and wild beasts, welcomed Sir Thomas Roe with the same enthusiasm he had shown for the arrival of the first turkey in India, and questioned Roe closely on the distant, foggy island he came from, and the strange things that went on there.

For the committee who planned the House of Commons paintings, this marked the beginning of British engagement with India: two nation states coming into direct contact for the first time. Yet, in reality, British relations with India began not with diplomacy and the meeting of envoys, but with trade. On 24 September, 1599, 80 merchants and adventurers met at the Founders Hall in the City of London and agreed to petition Queen Elizabeth I to start up a company. A year later, the Governor and Company of Merchants trading to the East Indies, a group of 218 men, received a royal charter, giving them a monopoly for 15 years over “trade to the East”.

The charter authorised the setting up of what was then a radical new type of business: not a family partnership – until then the norm over most of the globe – but a joint-stock company that could issue tradeable shares on the open market to any number of investors, a mechanism capable of realising much larger amounts of capital. The first chartered joint-stock company was the Muscovy Company, which received its charter in 1555. The East India Company was founded 44 years later. No mention was made in the charter of the EIC holding overseas territory, but it did give the company the right “to wage war” where necessary.

Six years before Roe’s expedition, on 28 August 1608, William Hawkins had landed at Surat, the first commander of a company vessel to set foot on Indian soil. Hawkins, a bibulous sea dog, made his way to Agra, where he accepted a wife offered to him by the emperor, and brought her back to England. This was a version of history the House of Commons hanging committee chose to forget.

The rapid rise of the East India Company was made possible by the catastrophically rapid decline of the Mughals during the 18th century. As late as 1739, when Clive was only 14 years old, the Mughals still ruled a vast empire that stretched from Kabul to Madras. But in that year, the Persian adventurer Nadir Shah descended the Khyber Pass with 150,000 of his cavalry and defeated a Mughal army of 1.5 million men. Three months later, Nadir Shah returned to Persia carrying the pick of the treasures the Mughal empire had amassed in its 200 years of conquest: a caravan of riches that included Shah Jahan’s magnificent peacock throne, the Koh-i-Noor, the largest diamond in the world, as well as its “sister”, the Darya Nur, and “700 elephants, 4,000 camels and 12,000 horses carrying wagons all laden with gold, silver and precious stones”, worth an estimated £87.5m in the currency of the time. This haul was many times more valuable than that later extracted by Clive from the peripheral province of Bengal.

The destruction of Mughal power by Nadir Shah, and his removal of the funds that had financed it, quickly led to the disintegration of the empire. That same year, the French Compagnie des Indes began minting its own coins, and soon, without anyone to stop them, both the French and the English were drilling their own sepoys and militarising their operations. Before long the EIC was straddling the globe. Almost single-handedly, it reversed the balance of trade, which from Roman times on had led to a continual drain of western bullion eastwards. The EIC ferried opium to China, and in due course fought the opium wars in order to seize an offshore base at Hong Kong and safeguard its profitable monopoly in narcotics. To the west it shipped Chinese tea to Massachusetts, where its dumping in Boston harbour triggered the American war of independence.

By 1803, when the EIC captured the Mughal capital of Delhi, it had trained up a private security force of around 260,000 – twice the size of the British army – and marshalled more firepower than any nation state in Asia. It was “an empire within an empire”, as one of its directors admitted. It had also by this stage created a vast and sophisticated administration and civil service, built much of London’s docklands and come close to generating nearly half of Britain’s trade. No wonder that the EIC now referred to itself as “the grandest society of merchants in the Universe”.

Yet, like more recent mega-corporations, the EIC proved at once hugely powerful and oddly vulnerable to economic uncertainty. Only seven years after the granting of the Diwani, when the company’s share price had doubled overnight after it acquired the wealth of the treasury of Bengal, the East India bubble burst after plunder and famine in Bengal led to massive shortfalls in expected land revenues. The EIC was left with debts of £1.5m and a bill of £1m unpaid tax owed to the Crown. When knowledge of this became public, 30 banks collapsed like dominoes across Europe, bringing trade to a standstill.

In a scene that seems horribly familiar to us today, this hyper-aggressive corporation had to come clean and ask for a massive government bailout. On 15 July 1772, the directors of the East India Company applied to the Bank of England for a loan of £400,000. A fortnight later, they returned, asking for an additional £300,000. The bank raised only £200,000. By August, the directors were whispering to the government that they would actually need an unprecedented sum of a further £1m. The official report the following year, written by Edmund Burke, foresaw that the EIC’s financial problems could potentially “like a mill-stone, drag [the government] down into an unfathomable abyss … This cursed Company would, at last, like a viper, be the destruction of the country which fostered it at its bosom.”



But unlike Lehman Brothers, the East India Company really was too big to fail. So it was that in 1773, the world’s first aggressive multinational corporation was saved by history’s first mega-bailout – the first example of a nation state extracting, as its price for saving a failing corporation, the right to regulate and severely rein it in.
***

In Allahabad, I hired a small dinghy from beneath the fort’s walls and asked the boatman to row me upstream. It was that beautiful moment, an hour before sunset, that north Indians call godhulibela – cow-dust time – and the Yamuna glittered in the evening light as brightly as any of the gems of Powis. Egrets picked their way along the banks, past pilgrims taking a dip near the auspicious point of confluence, where the Yamuna meets the Ganges. Ranks of little boys with fishing lines stood among the holy men and the pilgrims, engaged in the less mystical task of trying to hook catfish. Parakeets swooped out of cavities in the battlements, mynahs called to roost.

For 40 minutes we drifted slowly, the water gently lapping against the sides of the boat, past the mile-long succession of mighty towers and projecting bastions of the fort, each decorated with superb Mughal kiosks, lattices and finials. It seemed impossible that a single London corporation, however ruthless and aggressive, could have conquered an empire that was so magnificently strong, so confident in its own strength and brilliance and effortless sense of beauty.

Historians propose many reasons: the fracturing of Mughal India into tiny, competing states; the military edge that the industrial revolution had given the European powers. But perhaps most crucial was the support that the East India Company enjoyed from the British parliament. The relationship between them grew steadily more symbiotic throughout the 18th century. Returned nabobs like Clive used their wealth to buy both MPs and parliamentary seats – the famous Rotten Boroughs. In turn, parliament backed the company with state power: the ships and soldiers that were needed when the French and British East India Companies trained their guns on each other.

As I drifted on past the fort walls, I thought about the nexus between corporations and politicians in India today – which has delivered individual fortunes to rival those amassed by Clive and his fellow company directors. The country today has 6.9% of the world’s thousand or so billionaires, though its gross domestic product is only 2.1% of world GDP. The total wealth of India’s billionaires is equivalent to around 10% of the nation’s GDP – while the comparable ratio for China’s billionaires is less than 3%. More importantly, many of these fortunes have been created by manipulating state power – using political influence to secure rights to land and minerals, “flexibility” in regulation, and protection from foreign competition.

Multinationals still have villainous reputations in India, and with good reason; the many thousands of dead and injured in the Bhopal gas disaster of 1984 cannot be easily forgotten; the gas plant’s owner, the American multinational, Union Carbide, has managed to avoid prosecution or the payment of any meaningful compensation in the 30 years since. But the biggest Indian corporations, such as Reliance, Tata, DLF and Adani have shown themselves far more skilled than their foreign competitors in influencing Indian policymakers and the media. Reliance is now India’s biggest media company, as well as its biggest conglomerate; its owner, Mukesh Ambani, has unprecedented political access and power.

The last five years of India’s Congress party government were marked by a succession of corruption scandals that ranged from land and mineral giveaways to the corrupt sale of mobile phone spectrum at a fraction of its value. The consequent public disgust was the principal reason for the Congress party’s catastrophic defeat in the general election last May, though the country’s crony capitalists are unlikely to suffer as a result.

Estimated to have cost $4.9bn – perhaps the second most expensive ballot in democratic history after the US presidential election in 2012 – it brought Narendra Modi to power on a tidal wave of corporate donations. Exact figures are hard to come by, but Modi’s Bharatiya Janata party (BJP), is estimated to have spent at least $1bn on print and broadcast advertising alone. Of these donations, around 90% comes from unlisted corporate sources, given in return for who knows what undeclared promises of access and favours. The sheer strength of Modi’s new government means that those corporate backers may not be able to extract all they had hoped for, but there will certainly be rewards for the money donated.

In September, the governor of India’s central bank, Raghuram Rajan, made a speech in Mumbai expressing his anxieties about corporate money eroding the integrity of parliament: “Even as our democracy and our economy have become more vibrant,” he said, “an important issue in the recent election was whether we had substituted the crony socialism of the past with crony capitalism, where the rich and the influential are alleged to have received land, natural resources and spectrum in return for payoffs to venal politicians. By killing transparency and competition, crony capitalism is harmful to free enterprise, and economic growth. And by substituting special interests for the public interest, it is harmful to democratic expression.”

His anxieties were remarkably like those expressed in Britain more than 200 years earlier, when the East India Company had become synonymous with ostentatious wealth and political corruption: “What is England now?” fumed the Whig litterateur Horace Walpole, “A sink of Indian wealth.” In 1767 the company bought off parliamentary opposition by donating £400,000 to the Crown in return for its continued right to govern Bengal. But the anger against it finally reached ignition point on 13 February 1788, at the impeachment, for looting and corruption, of Clive’s successor as governor of Bengal, Warren Hastings. It was the nearest the British ever got to putting the EIC on trial, and they did so with one of their greatest orators at the helm – Edmund Burke.


Portraits of Nabobs, or representatives of the East India Company. Photograph: Alamy

Burke, leading the prosecution, railed against the way the returned company “nabobs” (or “nobs”, both corruptions of the Urdu word “Nawab”) were buying parliamentary influence, not just by bribing MPs to vote for their interests, but by corruptly using their Indian plunder to bribe their way into parliamentary office: “To-day the Commons of Great Britain prosecutes the delinquents of India,” thundered Burke, referring to the returned nabobs. “Tomorrow these delinquents of India may be the Commons of Great Britain.”

Burke thus correctly identified what remains today one of the great anxieties of modern liberal democracies: the ability of a ruthless corporation corruptly to buy a legislature. And just as corporations now recruit retired politicians in order to exploit their establishment contacts and use their influence, so did the East India Company. So it was, for example, that Lord Cornwallis, the man who oversaw the loss of the American colonies to Washington, was recruited by the EIC to oversee its Indian territories. As one observer wrote: “Of all human conditions, perhaps the most brilliant and at the same time the most anomalous, is that of the Governor General of British India. A private English gentleman, and the servant of a joint-stock company, during the brief period of his government he is the deputed sovereign of the greatest empire in the world; the ruler of a hundred million men; while dependant kings and princes bow down to him with a deferential awe and submission. There is nothing in history analogous to this position …”

Hastings survived his impeachment, but parliament did finally remove the EIC from power following the great Indian Uprising of 1857, some 90 years after the granting of the Diwani and 60 years after Hastings’s own trial. On 10 May 1857, the EIC’s own security forces rose up against their employer and on successfully crushing the insurgency, after nine uncertain months, the company distinguished itself for a final time by hanging and murdering tens of thousands of suspected rebels in the bazaar towns that lined the Ganges – probably the most bloody episode in the entire history of British colonialism.

Enough was enough. The same parliament that had done so much to enable the EIC to rise to unprecedented power finally gobbled up its own baby. The British state, alerted to the dangers posed by corporate greed and incompetence, successfully tamed history’s most voracious corporation. In 1859, it was again within the walls of Allahabad Fort that the governor general, Lord Canning, formally announced that the company’s Indian possessions would be nationalised and pass into the control of the British Crown. Queen Victoria, rather than the directors of the EIC, would henceforth be ruler of India.

The East India Company limped on in its amputated form for another 15 years, finally shutting down in 1874. Its brand name is now owned by a Gujarati businessman who uses it to sell “condiments and fine foods” from a showroom in London’s West End. Meanwhile, in a nice piece of historical and karmic symmetry, the current occupant of Powis Castle is married to a Bengali woman and photographs of a very Indian wedding were proudly on show in the Powis tearoom. This means that Clive’s descendants and inheritors will be half-Indian.
***

Today we are back to a world that would be familiar to Sir Thomas Roe, where the wealth of the west has begun again to drain eastwards, in the way it did from Roman times until the birth of the East India Company. When a British prime minister (or French president) visits India, he no longer comes as Clive did, to dictate terms. In fact, negotiation of any kind has passed from the agenda. Like Roe, he comes as a supplicant begging for business, and with him come the CEOs of his country’s biggest corporations.



For the corporation – a revolutionary European invention contemporaneous with the beginnings of European colonialism, and which helped give Europe its competitive edge – has continued to thrive long after the collapse of European imperialism. When historians discuss the legacy of British colonialism in India, they usually mention democracy, the rule of law, railways, tea and cricket. Yet the idea of the joint-stock company is arguably one of Britain’s most important exports to India, and the one that has for better or worse changed South Asia as much any other European idea. Its influence certainly outweighs that of communism and Protestant Christianity, and possibly even that of democracy.

Companies and corporations now occupy the time and energy of more Indians than any institution other than the family. This should come as no surprise: as Ira Jackson, the former director of Harvard’s Centre for Business and Government, recently noted, corporations and their leaders have today “displaced politics and politicians as … the new high priests and oligarchs of our system”. Covertly, companies still govern the lives of a significant proportion of the human race.

The 300-year-old question of how to cope with the power and perils of large multinational corporations remains today without a clear answer: it is not clear how a nation state can adequately protect itself and its citizens from corporate excess. As the international subprime bubble and bank collapses of 2007-2009 have so recently demonstrated, just as corporations can shape the destiny of nations, they can also drag down their economies. In all, US and European banks lost more than $1tn on toxic assets from January 2007 to September 2009. What Burke feared the East India Company would do to England in 1772 actually happened to Iceland in 2008-11, when the systemic collapse of all three of the country’s major privately owned commercial banks brought the country to the brink of complete bankruptcy. A powerful corporation can still overwhelm or subvert a state every bit as effectively as the East India Company did in Bengal in 1765.

Corporate influence, with its fatal mix of power, money and unaccountability, is particularly potent and dangerous in frail states where corporations are insufficiently or ineffectually regulated, and where the purchasing power of a large company can outbid or overwhelm an underfunded government. This would seem to have been the case under the Congress government that ruled India until last year. Yet as we have seen in London, media organisations can still bend under the influence of corporations such as HSBC – while Sir Malcolm Rifkind’s boast about opening British embassies for the benefit of Chinese firms shows that the nexus between business and politics is as tight as it has ever been.

The East India Company no longer exists, and it has, thankfully, no exact modern equivalent. Walmart, which is the world’s largest corporation in revenue terms, does not number among its assets a fleet of nuclear submarines; neither Facebook nor Shell possesses regiments of infantry. Yet the East India Company – the first great multinational corporation, and the first to run amok – was the ultimate model for many of today’s joint-stock corporations. The most powerful among them do not need their own armies: they can rely on governments to protect their interests and bail them out. The East India Company remains history’s most terrifying warning about the potential for the abuse of corporate power – and the insidious means by which the interests of shareholders become those of the state. Three hundred and fifteen years after its founding, its story has never been more current.

Cricket’s great data debate: art v science

Andy Bull in The Guardian

In July 2007, after a history reckoned to stretch back almost 4,000 years, the game of draughts was finally solved. After two decades of work, a team of computer scientists at the University of Alberta finished sifting through the 500 billion billion possible positions on the board. Their computer program, Chinook, was now unbeatable. So long as neither player made a mistake, every game it played was guaranteed to end in a stalemate. Later that same summer, Peter Moores was appointed as head coach of the England cricket team. Moores was one of the new breed of coaches: a numbers man, and a disciple of Michael Lewis’s much-abused book, Moneyball. He even gave a copy to his batting coach, Andy Flower. Moores was so keen on advanced computer analysis that he used it as the sole basis for some of his decisions – the decision to recall Ryan Sidebottom to the side, for instance.

When Flower took over the team, he hired Nathan Leamon, a qualified coach and a former maths teacher, as the team’s analyst. The players nicknamed Leamon “Numbers”. He was extraordinarily meticulous. He used Hawk-Eye to draw up spreadsheets of every single ball delivered in Test cricket in the preceding five years. He ran match simulations – accurate to within 5% – to help England determine their strategies and their team selections. For the bowlers, he broke the pitch down into 20 blocks, each of them 100cm by 15cm, and told them which ones they should hit to best exploit the weaknesses Hawk-Eye had revealed in the opposing batsmen. Bowlers should aim to hit that particular block at least twice an over. Do that, Leamon told them, and they would “markedly increase the chance of success”.
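
A toy version of that pitch-map analysis is sketched below: tally deliveries and dismissals per block and rank the blocks by dismissal rate. The block grid follows the article’s description (20 blocks, each 100cm by 15cm), but the ball-by-ball records are invented.

```python
from collections import defaultdict

# A toy pitch-map analysis: tally deliveries and dismissals per block
# and rank blocks by dismissal rate. Block indices refer to the 20-block
# grid described in the article; the deliveries are invented for illustration.

deliveries = [
    # (block_index, dismissed)
    (7, False), (7, True), (7, False), (7, False),
    (12, False), (12, False), (12, True), (12, True),
    (3, False), (3, False),
]

balls = defaultdict(int)
wickets = defaultdict(int)
for block, out in deliveries:
    balls[block] += 1
    wickets[block] += out

for block in sorted(balls, key=lambda b: wickets[b] / balls[b], reverse=True):
    rate = wickets[block] / balls[block]
    print(f"block {block:2d}: {balls[block]} balls, dismissal rate {rate:.0%}")
```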

England, it was said, were making better use of the computer analysis than any other team in the world. And it was working. They won the World T20, the Ashes home and away, and became, for a time, the No1 team in all three formats of the game. Leamon’s work was picked out as one of the reasons why. And yet now that they’re losing, that very same approach is being singled out as one of the things they are doing wrong. You can see why. After England’s nine-wicket defeat to Sri Lanka, Eoin Morgan said: “Going in at the halfway I think we got 310, probably 25 for both par, and again, stats back that up, par is 275, 280.” It was, Morgan thought, the bowlers who were to blame for the loss. They had delivered too many bad balls. He said he didn’t yet know why. “Over the next couple of days, we will get the Hawk-Eye stuff back and the proof will be in that.”

On Tuesday morning, Kevin Pietersen tweeted that England are “too interested in stats”. He was echoing Graeme Swann’s comments from last summer. “I’ve sat in these meetings for the last five years,” Swann said. “It was a statistics-based game. There was this crazy stat where if we get 239 – this was before the fielding restrictions changed a bit so it would be more now, I assume – we will win 72% of matches. The whole game was built upon having this many runs after this many overs, this many partnerships, doing this in the middle, working at 4.5 an over.” Swann said he was left shaking his head.
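
Stats of the kind Swann describes (“get 239 and we win 72% of matches”) are straightforward to produce once you have a results database: bucket first-innings totals and compute the historical win rate above each threshold. The match records below are invented; a real analysis would use thousands of games and adjust for ground, era and fielding restrictions.

```python
# A toy version of the "score X, win Y% of the time" stat: bucket first-innings
# totals and compute the share of those games the batting side went on to win.
# The match records are invented for illustration.

matches = [
    # (first_innings_total, batting_side_won)
    (212, False), (225, False), (238, True), (241, True), (256, True),
    (263, False), (270, True), (284, True), (295, True), (310, True),
]

def win_rate_at_least(matches, threshold):
    relevant = [won for total, won in matches if total >= threshold]
    return sum(relevant) / len(relevant) if relevant else float("nan")

for threshold in (225, 239, 275):
    print(f"Score {threshold}+ : won {win_rate_at_least(matches, threshold):.0%} of matches")
```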

Two respected players, both speaking from fresh first-hand experience, agree that England have become too reliant on computer analysis to tell them what to do. But balance that against the irritation old pros in all sports feel about big data. Just last week, the great blowhard of the NBA, Charles Barkley, unleashed this tirade: “All these guys who run organisations who talk about analytics, they all have one thing in common – they’re a bunch of guys who have never played the game, and they never got the girls in high school, and they just want to get in the game.” Analytics, Barkley added, were “just some crap that some people who were really smart made up to try and get in the game”.

Barkley was shot down in flames. As Bryan Curtis summed it up in his wrap over on Grantland, commentators argued that Barkley’s rant was “unintelligible” and “wholly useless”, that he was a “dinosaur” who “didn’t even realise that the war is over”, and that “the nerds make the decisions”. In England though, where we’ve been slower to adopt analytics, the consensus seems to be that Swann and Pietersen are on to something. England’s over-reliance on the numbers has become a theme in the coverage of the team, particularly among ex-players. You can hear it when they bemoan, among other things, England’s reluctance to bowl yorkers at the stumps. That’s a tactic that has worked for years, one that has been honed by hard experience. But England’s analysis has told them that slow bouncers and full balls sent wide of off-stump are harder to score off.

The thing is, in an age when all teams are using computer analysis, a tactic isn’t good or bad because it looks that way, or because it is different to what has been done before. It is simply good if it works and bad if it doesn’t. The received wisdom is being challenged, and that’s a good thing. At the same time, cricket isn’t checkers. It can’t be solved by computer. It’s not a question of intuition versus analysis, or art v science, as David Hopps put it in a recent piece on Cricinfo. The laptop is just another tool in the box, useless unless the players understand the value of the information it provides, and no more valuable than their own ability to adapt and improvise during a match. If Swann and Pietersen are right, then England are wrong. At the same time, the lessons Leamon taught the team undoubtedly played a valuable part in their earlier success, something the sceptics seem to have forgotten.

Tuesday, 3 March 2015

The economic case for legalising cannabis


The public wants it and it would be good for the economy. Why has the law not been changed?

Paul Birch in The Telegraph

Channel 4’s Drugs Live programme promises to examine what cannabis does to the brain. Many of us have already seen the clips of Jon Snow struggling after a massive dose of high-strength marijuana (the equivalent of forcing a teetotaller to down a bottle of vodka and then asking him how he feels).

But beyond the effects of cannabis on the brain, isn’t it time for a wider discussion on the potential effects of safe, regulated cannabis consumption on society?

How much is cannabis worth these days? According to the Institute for Social and Economic Research, up to £900m could be raised annually through taxation of a regulated cannabis market.

Meanwhile £361 million is currently spent every year on policing and treating users of illegally traded and consumed cannabis.

It seems a lot to spend on punishing people for an activity most of us barely believe should be a crime any more. And that’s even before one factors in the potential benefit legalisation and regulation of cannabis could have for the UK exchequer.

Then, there is the job creation potential. In Colorado, which legalised marijuana at the beginning of 2014, 10,000 people now work in the marijuana industry: growing and harvesting crops, working in dispensaries, and making and selling equipment. Crime has fallen: in the first three months after legalisation, Denver experienced a 14.6 per cent drop in crime; violent crime specifically was down 2.4 per cent, and assaults were down by 3.7 per cent.

This reduction led to further savings and allowed stretched police forces to concentrate on more serious issues. Meanwhile, cannabis use by young people actually decreased, an uncomfortable fact for prohibitionists who argue that legalisation would simply encourage more teens to take up cannabis.

In an age when every penny of government spending is fought for, the demonstrated potential savings and revenues at the very least deserve serious investigation. Revenue raised from a regulated cannabis trade could be directed towards education on safe use of cannabis.

That’s why the next government, regardless of who it is led by, should set up a Royal Commission into drug legislation.

Why a Royal Commission? Because I firmly believe this is a way forward for our fractured politics. A non-partisan commission can help politicians take hold of an issue and look at the evidence without fear of being blindsided by attacks from the other side. Parties can agree to participate, evidence can be heard, everyday people can submit and read facts, opinions and analysis: it’s a real opportunity to create the “evidence-based policy” to which every party claims to aspire.

Major party leaders are reluctant to grasp the nettle of drug legislation. It’s understandable, given the current association of drugs with criminality. Half of people in the UK think cannabis contributes to street crime. But this association is inevitable as long as cannabis itself is illegal. Only a dispassionate discussion on the merits of cannabis legalisation and regulation can break that link.

CISTA is standing for election on this issue because we believe the practical evidence has reached a tipping point. Legalisation and regulation of cannabis can benefit the economy, lift the burden on the criminal justice system, encourage education about healthy, informed choices, and help recreational and medicinal cannabis users to enjoy a clean, safe product without being forced to engage with the underworld. Cannabis in itself is not the problem: our current law is. And we’re all paying the price.

What scares the new atheists: The vocal fervour of today’s missionary atheism conceals a panic that religion is not only refusing to decline – but in fact flourishing

John Gray in The Guardian

In 1929, the Thinker’s Library, a series established by the Rationalist Press Association to advance secular thinking and counter the influence of religion in Britain, published an English translation of the German biologist Ernst Haeckel’s 1899 book The Riddle of the Universe. Celebrated as “the German Darwin”, Haeckel was one of the most influential public intellectuals of the late nineteenth and early twentieth century; The Riddle of the Universe sold half a million copies in Germany alone, and was translated into dozens of other languages. Hostile to Jewish and Christian traditions, Haeckel devised his own “religion of science” called Monism, which incorporated an anthropology that divided the human species into a hierarchy of racial groups. Though he died in 1919, before the Nazi Party had been founded, his ideas, and widespread influence in Germany, unquestionably helped to create an intellectual climate in which policies of racial slavery and genocide were able to claim a basis in science.

The Thinker’s Library also featured works by Julian Huxley, grandson of TH Huxley, the Victorian biologist who was known as “Darwin’s bulldog” for his fierce defence of evolutionary theory. A proponent of “evolutionary humanism”, which he described as “religion without revelation”, Julian Huxley shared some of Haeckel’s views, including advocacy of eugenics. In 1931, Huxley wrote that there was “a certain amount of evidence that the negro is an earlier product of human evolution than the Mongolian or the European, and as such might be expected to have advanced less, both in body and mind”. Statements of this kind were then commonplace: there were many in the secular intelligentsia – including HG Wells, also a contributor to the Thinker’s Library – who looked forward to a time when “backward” peoples would be remade in a western mould or else vanish from the world.

But by the late 1930s, these views were becoming suspect: already in 1935, Huxley admitted that the concept of race was “hardly definable in scientific terms”. While he never renounced eugenics, little was heard from him on the subject after the second world war. The science that pronounced western people superior was bogus – but what shifted Huxley’s views wasn’t any scientific revelation: it was the rise of Nazism, which revealed what had been done under the aegis of Haeckel-style racism.

It has often been observed that Christianity follows changing moral fashions, all the while believing that it stands apart from the world. The same might be said, with more justice, of the prevalent version of atheism. If an earlier generation of unbelievers shared the racial prejudices of their time and elevated them to the status of scientific truths, evangelical atheists do the same with the liberal values to which western societies subscribe today – while looking with contempt upon “backward” cultures that have not abandoned religion. The racial theories promoted by atheists in the past have been consigned to the memory hole – and today’s most influential atheists would no more endorse racist biology than they would be seen following the guidance of an astrologer. But they have not renounced the conviction that human values must be based in science; now it is liberal values which receive that accolade. There are disputes, sometimes bitter, over how to define and interpret those values, but their supremacy is hardly ever questioned. For 21st century atheist missionaries, being liberal and scientific in outlook are one and the same.

It’s a reassuringly simple equation. In fact there are no reliable connections – whether in logic or history – between atheism, science and liberal values. When organised as a movement and backed by the power of the state, atheist ideologies have been an integral part of despotic regimes that also claimed to be based in science, such as the former Soviet Union. Many rival moralities and political systems – most of them, to date, illiberal – have attempted to assert a basis in science. All have been fraudulent and ephemeral. Yet the attempt continues in atheist movements today, which claim that liberal values can be scientifically validated and are therefore humanly universal.

Fortunately, this type of atheism isn’t the only one that has ever existed. There have been many modern atheisms, some of them more cogent and more intellectually liberating than the type that makes so much noise today. Campaigning atheism is a missionary enterprise, aiming to convert humankind to a particular version of unbelief; but not all atheists have been interested in propagating a new gospel, and some have been friendly to traditional faiths.

Evangelical atheists today view liberal values as part of an emerging global civilisation; but not all atheists, even when they have been committed liberals, have shared this comforting conviction. Atheism comes in many irreducibly different forms, among which the variety being promoted at the present time looks strikingly banal and parochial.

In itself, atheism is an entirely negative position. In pagan Rome, “atheist” (from the Greek atheos) meant anyone who refused to worship the established pantheon of deities. The term was applied to Christians, who not only refused to worship the gods of the pantheon but demanded exclusive worship of their own god. Many non-western religions contain no conception of a creator-god – Buddhism and Taoism, in some of their forms, are atheist religions of this kind – and many religions have had no interest in proselytising. In modern western contexts, however, atheism and rejection of monotheism are practically interchangeable. Roughly speaking, an atheist is anyone who has no use for the concept of God – the idea of a divine mind, which has created humankind and embodies in a perfect form the values that human beings cherish and strive to realise. Many who are atheists in this sense (including myself) regard the evangelical atheism that has emerged over the past few decades with bemusement. Why make a fuss over an idea that has no sense for you? There are untold multitudes who have no interest in waging war on beliefs that mean nothing to them. Throughout history, many have been happy to live their lives without bothering about ultimate questions. This sort of atheism is one of the perennial responses to the experience of being human.

As an organised movement, atheism is never non-committal in this way. It always goes with an alternative belief-system – typically, a set of ideas that serves to show the modern west is the high point of human development. In Europe from the late 19th century until the second world war, this was a version of evolutionary theory that marked out western peoples as being the most highly evolved. Around the time Haeckel was promoting his racial theories, a different theory of western superiority was developed by Marx. While condemning liberal societies and prophesying their doom, Marx viewed them as the high point of human development to date. (This is why he praised British colonialism in India as an essentially progressive development.) If Marx had serious reservations about Darwinism – and he did – it was because Darwin’s theory did not frame evolution as a progressive process.

The predominant varieties of atheist thinking, in the 19th and early 20th centuries, aimed to show that the secular west is the model for a universal civilisation. The missionary atheism of the present time is a replay of this theme; but the west is in retreat today, and beneath the fervour with which this atheism assaults religion there is an unmistakable mood of fear and anxiety. To a significant extent, the new atheism is the expression of a liberal moral panic.


Illustration by Christoph Hitz

Sam Harris, the American neuroscientist and author of The End of Faith: Religion, Terror and the Future of Reason (2004) and The Moral Landscape: How Science Can Determine Moral Values (2010), who was arguably the first of the “new atheists”, illustrates this point. Following many earlier atheist ideologues, he wants a “scientific morality”; but whereas earlier exponents of this sort of atheism used science to prop up values everyone would now agree were illiberal, Harris takes for granted that what he calls a “science of good and evil” cannot be other than liberal in content. (Not everyone will agree with Harris’s account of liberal values, which appears to sanction the practice of torture: “Given what many believe are the exigencies of our war on terrorism,” he wrote in 2004, “the practice of torture, in certain circumstances, would seem to be not only permissible but necessary.”)

Harris’s militancy in asserting these values seems to be largely a reaction to Islamist terrorism. For secular liberals of his generation, the shock of the 11 September attacks went beyond the atrocious loss of life they entailed. The effect of the attacks was to place a question mark over the belief that their values were spreading – slowly, and at times fitfully, but in the long run irresistibly – throughout the world. As society became ever more reliant on science, they had assumed, religion would inexorably decline. No doubt the process would be bumpy, and pockets of irrationality would linger on the margins of modern life; but religion would dwindle away as a factor in human conflict. The road would be long and winding. But the grand march of secular reason would continue, with more and more societies joining the modern west in marginalising religion. Someday, religious belief would be no more important than personal hobbies or ethnic cuisines.

Today, it’s clear that no grand march is under way. The rise of violent jihadism is only the most obvious example of a rejection of secular life. Jihadist thinking comes in numerous varieties, mixing strands from 20th century ideologies, such as Nazism and Leninism, with elements deriving from the 18th century Wahhabist Islamic fundamentalist movement. What all Islamist movements have in common is a categorical rejection of any secular realm. But the ongoing reversal in secularisation is not a peculiarly Islamic phenomenon.

The resurgence of religion is a worldwide development. Russian Orthodoxy is stronger than it has been for over a century, while China is the scene of a reawakening of its indigenous faiths and of underground movements that could make it the largest Christian country in the world by the end of this century. Despite tentative shifts in opinion that have been hailed as evidence it is becoming less pious, the US remains massively and pervasively religious – it’s inconceivable that a professed unbeliever could become president, for example.

For secular thinkers, the continuing vitality of religion calls into question the belief that history underpins their values. To be sure, there is disagreement as to the nature of these values. But pretty well all secular thinkers now take for granted that modern societies must in the end converge on some version of liberalism. Never well founded, this assumption is today clearly unreasonable. So, not for the first time, secular thinkers look to science for a foundation for their values.

It’s probably just as well that the current generation of atheists seems to know so little of the longer history of atheist movements. When they assert that science can bridge fact and value, they overlook the many incompatible value-systems that have been defended in this way. There is no more reason to think science can determine human values today than there was at the time of Haeckel or Huxley. None of the divergent values that atheists have from time to time promoted has any essential connection with atheism, or with science. How could any increase in scientific knowledge validate values such as human equality and personal autonomy? The source of these values is not science. In fact, as the most widely read atheist thinker of all time argued, these quintessential liberal values have their origins in monotheism.

* * *

The new atheists rarely mention Friedrich Nietzsche, and when they do it is usually to dismiss him. This can’t be because Nietzsche’s ideas are said to have inspired the Nazi cult of racial inequality – an unlikely tale, given that the Nazis claimed their racism was based in science. The reason Nietzsche has been excluded from the mainstream of contemporary atheist thinking is that he exposed the problem atheism has with morality. It’s not that atheists can’t be moral – the subject of so many mawkish debates. The question is which morality an atheist should serve.

It’s a familiar question in continental Europe, where a number of thinkers have explored the prospects of a “difficult atheism” that doesn’t take liberal values for granted. It can’t be said that anything much has come from this effort. Georges Bataille’s postmodern project of “atheology” didn’t produce the godless religion he originally intended, or any coherent type of moral thinking. But at least Bataille, and other thinkers like him, understood that when monotheism has been left behind morality can’t go on as before. Among other things, the universal claims of liberal morality become highly questionable.


Illustration by Christoph Hitz

It’s impossible to read much contemporary polemic against religion without the impression that for the “new atheists” the world would be a better place if Jewish and Christian monotheism had never existed. If only the world wasn’t plagued by these troublesome God-botherers, they are always lamenting, liberal values would be so much more secure. Awkwardly for these atheists, Nietzsche understood that modern liberalism was a secular incarnation of these religious traditions. As a classical scholar, he recognised that a mystical Greek faith in reason had shaped the cultural matrix from which modern liberalism emerged. Some ancient Stoics defended the ideal of a cosmopolitan society; but this was based in the belief that humans share in the Logos, an immortal principle of rationality that was later absorbed into the conception of God with which we are familiar. Nietzsche was clear that the chief sources of liberalism were in Jewish and Christian theism: that is why he was so bitterly hostile to these religions. He was an atheist in large part because he rejected liberal values.

To be sure, evangelical unbelievers adamantly deny that liberalism needs any support from theism. If they are philosophers, they will wheel out their rusty intellectual equipment and assert that those who think liberalism relies on ideas and beliefs inherited from religion are guilty of a genetic fallacy. Canonical liberal thinkers such as John Locke and Immanuel Kant may have been steeped in theism; but ideas are not falsified because they originate in errors. The far-reaching claims these thinkers have made for liberal values can be detached from their theistic beginnings; a liberal morality that applies to all human beings can be formulated without any mention of religion. Or so we are continually being told. The trouble is that it’s hard to make any sense of the idea of a universal morality without invoking an understanding of what it is to be human that has been borrowed from theism. The belief that the human species is a moral agent struggling to realise its inherent possibilities – the narrative of redemption that sustains secular humanists everywhere – is a hollowed-out version of a theistic myth. The idea that the human species is striving to achieve any purpose or goal – a universal state of freedom or justice, say – presupposes a pre-Darwinian, teleological way of thinking that has no place in science. Empirically speaking, there is no such collective human agent, only different human beings with conflicting goals and values. If you think of morality in scientific terms, as part of the behaviour of the human animal, you find that humans don’t live according to iterations of a single universal code. Instead, they have fashioned many ways of life. A plurality of moralities is as natural for the human animal as the variety of languages.

At this point, the dread spectre of relativism tends to be raised. Doesn’t talk of plural moralities mean there can be no truth in ethics? Well, anyone who wants their values secured by something beyond the capricious human world had better join an old-fashioned religion. If you set aside any view of humankind that is borrowed from monotheism, you have to deal with human beings as you find them, with their perpetually warring values.

This isn’t the relativism celebrated by postmodernists, which holds that human values are merely cultural constructions. Humans are like other animals in having a definite nature, which shapes their experiences whether they like it or not. No one benefits from being tortured or persecuted on account of their religion or sexuality. Being chronically poor is rarely, if ever, a positive experience. Being at risk of violent death is bad for human beings whatever their culture. Such truisms could be multiplied. Universal human values can be understood as something like moral facts, marking out goods and evils that are generically human. Using these universal values, it may be possible to define a minimum standard of civilised life that every society should meet; but this minimum won’t be the liberal values of the present time turned into universal principles.

Universal values don’t add up to a universal morality. Such values are very often conflicting, and different societies resolve these conflicts in divergent ways. The Ottoman empire, during some of its history, was a haven of toleration for religious communities who were persecuted in Europe; but this pluralism did not extend to enabling individuals to move from one community to another, or to form new communities of choice, as would be required by a liberal ideal of personal autonomy. The Habsburg empire was based on rejecting the liberal principle of national self-determination; but – possibly for that very reason – it was more protective of minorities than most of the states that succeeded it. Protecting universal values without honouring what are now seen as core liberal ideals, these archaic imperial regimes were more civilised than a great many states that exist today.

For many, regimes of this kind are imperfect examples of what all human beings secretly want – a world in which no one is unfree. The conviction that tyranny and persecution are aberrations in human affairs is at the heart of the liberal philosophy that prevails today. But this conviction is supported by faith more than evidence. Throughout history there have been large numbers who have been happy to relinquish their freedom as long as those they hate – gay people, Jews, immigrants and other minorities, for example – are deprived of freedom as well. Many have been ready to support tyranny and oppression. Billions of human beings have been hostile to liberal values, and there is no reason for thinking matters will be any different in future.

An older generation of liberal thinkers accepted this fact. As the late Stuart Hampshire put it:
“It is not only possible, but, on present evidence, probable that most conceptions of the good, and most ways of life, which are typical of commercial, liberal, industrialised societies will often seem altogether hateful to substantial minorities within these societies and even more hateful to most of the populations within traditional societies … As a liberal by philosophical conviction, I think I ought to expect to be hated, and to be found superficial and contemptible, by a large part of mankind.”

Today this is a forbidden thought. How could all of humankind not want to be as we imagine ourselves to be? To suggest that large numbers hate and despise values such as toleration and personal autonomy is, for many people nowadays, an intolerable slur on the species. This is, in fact, the quintessential illusion of the ruling liberalism: the belief that all human beings are born freedom-loving and peaceful and become anything else only as a result of oppressive conditioning. But there is no hidden liberal struggling to escape from within the killers of the Islamic State and Boko Haram, any more than there was in the torturers who served the Pol Pot regime. To be sure, these are extreme cases. But in the larger sweep of history, faith-based violence and persecution, secular and religious, are hardly uncommon – and they have been widely supported. It is peaceful coexistence and the practice of toleration that are exceptional.
* * *

Considering the alternatives that are on offer, liberal societies are well worth defending. But there is no reason for thinking these societies are the beginning of a species-wide secular civilisation of the kind of which evangelical atheists dream.

In ancient Greece and Rome, religion was not separate from the rest of human activity. Christianity was less tolerant than these pagan societies, but without it the secular societies of modern times would hardly have been possible. By adopting the distinction between what is owed to Caesar and what to God, Paul and Augustine – who turned the teaching of Jesus into a universal creed – opened the way for societies in which religion was no longer coextensive with life. Secular regimes come in many shapes, some liberal, others tyrannical. Some aim for a separation of church and state as in the US and France, while others – such as the Ataturkist regime that until recently ruled in Turkey – assert state control over religion. Whatever its form, a secular state is no guarantee of a secular culture. Britain has an established church, but despite that fact – or more likely because of it – religion has a smaller role in politics than in America and is less publicly divisive than it is in France.
Illustration by Christoph Hitz

There is no sign anywhere of religion fading away, but by no means all atheists have thought the disappearance of religion possible or desirable. Some of the most prominent – including the early 19th-century poet and philosopher Giacomo Leopardi, the philosopher Arthur Schopenhauer, the Austro-Hungarian philosopher and novelist Fritz Mauthner (who published a four-volume history of atheism in the early 1920s) and Sigmund Freud, to name a few – were all atheists who accepted the human value of religion. One thing these atheists had in common was a refreshing indifference to questions of belief. Mauthner – who is remembered today chiefly because of a dismissive one-line mention in Wittgenstein’s Tractatus – suggested that belief and unbelief were both expressions of a superstitious faith in language. For him, “humanity” was an apparition which melts away along with the departing Deity. Atheism was an experiment in living without taking human concepts as realities. Intriguingly, Mauthner saw parallels between this radical atheism and the tradition of negative theology in which nothing can be affirmed of God, and described the heretical medieval Christian mystic Meister Eckhart as being an atheist in this sense.

Above all, these unevangelical atheists accepted that religion is definitively human. Though not all human beings may attach great importance to them, every society contains practices that are recognisably religious. Why should religion be universal in this way? For atheist missionaries this is a decidedly awkward question. Invariably they claim to be followers of Darwin. Yet they never ask what evolutionary function this species-wide phenomenon serves. There is an irresolvable contradiction between viewing religion naturalistically – as a human adaptation to living in the world – and condemning it as a tissue of error and illusion. What if the upshot of scientific inquiry is that a need for illusion is built into the human mind? If religions are natural for humans and give value to their lives, why spend your life trying to persuade others to give them up?

The answer that will be given is that religion is implicated in many human evils. Of course this is true. Among other things, Christianity brought with it a type of sexual repression unknown in pagan times. Other religions have their own distinctive flaws. But the fault is not with religion, any more than science is to blame for the proliferation of weapons of mass destruction or medicine and psychology for the refinement of techniques of torture. The fault is in the intractable human animal. Like religion at its worst, contemporary atheism feeds the fantasy that human life can be remade by a conversion experience – in this case, conversion to unbelief.

Evangelical atheists at the present time are missionaries for their own values. If an earlier generation promoted the racial prejudices of their time as scientific truths, ours aims to give the illusions of contemporary liberalism a similar basis in science. It’s possible to envision different varieties of atheism developing – atheisms more like those of Freud, which didn’t replace God with a flattering image of humanity. But atheisms of this kind are unlikely to be popular. More than anything else, our unbelievers seek relief from the panic that grips them when they realise their values are rejected by much of humankind. What today’s freethinkers want is freedom from doubt, and the prevailing version of atheism is well suited to give it to them.