Wednesday, 15 November 2017

The fatal flaw of neoliberalism: it's bad economics

Dani Rodrik in The Guardian

As even its harshest critics concede, neoliberalism is hard to pin down. In broad terms, it denotes a preference for markets over government, economic incentives over cultural norms, and private entrepreneurship over collective action. It has been used to describe a wide range of phenomena – from Augusto Pinochet to Margaret Thatcher and Ronald Reagan, from the Clinton Democrats and the UK’s New Labour to the economic opening in China and the reform of the welfare state in Sweden.

The term is used as a catchall for anything that smacks of deregulation, liberalisation, privatisation or fiscal austerity. Today it is routinely reviled as a shorthand for the ideas and practices that have produced growing economic insecurity and inequality, led to the loss of our political values and ideals, and even precipitated our current populist backlash.

We live in the age of neoliberalism, apparently. But who are neoliberalism’s adherents and disseminators – the neoliberals themselves? Oddly, you have to go back a long time to find anyone explicitly embracing neoliberalism. In 1982, Charles Peters, the longtime editor of the political magazine Washington Monthly, published an essay titled A Neo-Liberal’s Manifesto. It makes for interesting reading 35 years later, since the neoliberalism it describes bears little resemblance to today’s target of derision. The politicians Peters names as exemplifying the movement are not the likes of Thatcher and Reagan, but rather liberals – in the US sense of the word – who have become disillusioned with unions and big government and dropped their prejudices against markets and the military.

The use of the term “neoliberal” exploded in the 1990s, when it became closely associated with two developments, neither of which Peters’s article had mentioned. One of these was financial deregulation, which would culminate in the 2008 financial crash and in the still-lingering euro debacle. The second was economic globalisation, which accelerated thanks to free flows of finance and to a new, more ambitious type of trade agreement. Financialisation and globalisation have become the most overt manifestations of neoliberalism in today’s world.

That neoliberalism is a slippery, shifting concept, with no explicit lobby of defenders, does not mean that it is irrelevant or unreal. Who can deny that the world has experienced a decisive shift toward markets from the 1980s on? Or that centre-left politicians – Democrats in the US, socialists and social democrats in Europe – enthusiastically adopted some of the central creeds of Thatcherism and Reaganism, such as deregulation, privatisation, financial liberalisation and individual enterprise? Much of our contemporary policy discussion remains infused with principles supposedly grounded in the concept of homo economicus, the perfectly rational human being, found in many economic theories, who always pursues his own self-interest.

But the looseness of the term neoliberalism also means that criticism of it often misses the mark. There is nothing wrong with markets, private entrepreneurship or incentives – when deployed appropriately. Their creative use lies behind the most significant economic achievements of our time. As we heap scorn on neoliberalism, we risk throwing out some of neoliberalism’s useful ideas.

The real trouble is that mainstream economics shades too easily into ideology, constraining the choices that we appear to have and providing cookie-cutter solutions. A proper understanding of the economics that lie behind neoliberalism would allow us to identify – and to reject – ideology when it masquerades as economic science. Most importantly, it would help us to develop the institutional imagination we badly need to redesign capitalism for the 21st century.

Neoliberalism is typically understood as being based on key tenets of mainstream economic science. To see those tenets without the ideology, consider this thought experiment. A well-known and highly regarded economist lands in a country he has never visited and knows nothing about. He is brought to a meeting with the country’s leading policymakers. “Our country is in trouble,” they tell him. “The economy is stagnant, investment is low, and there is no growth in sight.” They turn to him expectantly: “Please tell us what we should do to make our economy grow.”

The economist pleads ignorance and explains that he knows too little about the country to make any recommendations. He would need to study the history of the economy, to analyse the statistics, and to travel around the country before he could say anything.


Tony Blair and Bill Clinton: centre-left politicians who enthusiastically adopted some of the central creeds of Thatcherism and Reaganism. Photograph: Reuters

But his hosts are insistent. “We understand your reticence, and we wish you had the time for all that,” they tell him. “But isn’t economics a science, and aren’t you one of its most distinguished practitioners? Even though you do not know much about our economy, surely there are some general theories and prescriptions you can share with us to guide our economic policies and reforms.”

The economist is now in a bind. He does not want to emulate those economic gurus he has long criticised for peddling their favourite policy advice. But he feels challenged by the question. Are there universal truths in economics? Can he say anything valid or useful?

So he begins. The efficiency with which an economy’s resources are allocated is a critical determinant of the economy’s performance, he says. Efficiency, in turn, requires aligning the incentives of households and businesses with social costs and benefits. The incentives faced by entrepreneurs, investors and producers are particularly important when it comes to economic growth. Growth needs a system of property rights and contract enforcement that will ensure those who invest can retain the returns on their investments. And the economy must be open to ideas and innovations from the rest of the world.

But economies can be derailed by macroeconomic instability, he goes on. Governments must therefore pursue a sound monetary policy, which means restricting the growth of liquidity to the increase in nominal money demand at reasonable inflation. They must ensure fiscal sustainability, so that the increase in public debt does not outpace national income. And they must carry out prudential regulation of banks and other financial institutions to prevent the financial system from taking excessive risk.
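The fiscal-sustainability condition he states in words has a standard textbook form (a sketch added here for concreteness; the notation is not from the article). With $b$ the ratio of public debt to national income, $r$ the real interest rate, $g$ the real growth rate and $s$ the primary surplus as a share of income, the debt ratio evolves as

\[
\Delta b = (r - g)\,b - s,
\]

so debt does not outpace national income ($\Delta b \le 0$) whenever $s \ge (r - g)\,b$. A country that grows faster than the interest rate it pays can run modest primary deficits and still satisfy the condition.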

Now he is warming to his task. Economics is not just about efficiency and growth, he adds. Economic principles also carry over to equity and social policy. Economics has little to say about how much redistribution a society should seek. But it does tell us that the tax base should be as broad as possible, and that social programmes should be designed in a way that does not encourage workers to drop out of the labour market.

By the time the economist stops, it appears as if he has laid out a fully fledged neoliberal agenda. A critic in the audience will have heard all the code words: efficiency, incentives, property rights, sound money, fiscal prudence. And yet the universal principles that the economist describes are in fact quite open-ended. They presume a capitalist economy – one in which investment decisions are made by private individuals and firms – but not much beyond that. They allow for – indeed, they require – a surprising variety of institutional arrangements.

So has the economist just delivered a neoliberal screed? We would be mistaken to think so, and our mistake would consist of associating each abstract term – incentives, property rights, sound money – with a particular institutional counterpart. And therein lies the central conceit, and the fatal flaw, of neoliberalism: the belief that first-order economic principles map on to a unique set of policies, approximated by a Thatcher/Reagan-style agenda.

Consider property rights. They matter insofar as they allocate returns on investments. An optimal system would distribute property rights to those who would make the best use of an asset, and afford protection against those most likely to expropriate the returns. Property rights are good when they protect innovators from free riders, but they are bad when they protect them from competition. Depending on the context, a legal regime that provides the appropriate incentives can look quite different from the standard US-style regime of private property rights.

This may seem like a semantic point with little practical import; but China’s phenomenal economic success is largely due to its orthodoxy-defying institutional tinkering. China turned to markets, but did not copy western practices in property rights. Its reforms produced market-based incentives through a series of unusual institutional arrangements that were better adapted to the local context. Rather than move directly from state to private ownership, for example, which would have been stymied by the weakness of the prevailing legal structures, the country relied on mixed forms of ownership that provided more effective property rights for entrepreneurs in practice. Township and Village Enterprises (TVEs), which spearheaded Chinese economic growth during the 1980s, were collectives owned and controlled by local governments. Even though TVEs were publicly owned, entrepreneurs received the protection they needed against expropriation. Local governments had a direct stake in the profits of the firms, and hence did not want to kill the goose that lays the golden eggs.

China relied on a range of such innovations, each delivering the economist’s higher-order economic principles in unfamiliar institutional arrangements. For instance, it shielded its large state sector from global competition, establishing special economic zones where foreign firms could operate with different rules than in the rest of the economy. In view of such departures from orthodox blueprints, describing China’s economic reforms as neoliberal – as critics are inclined to do – distorts more than it reveals. If we are to call this neoliberalism, we must surely look more kindly on the ideas behind the most dramatic poverty reduction in history.

One might protest that China’s institutional innovations were purely transitional. Perhaps it will have to converge on western-style institutions to sustain its economic progress. But this common line of thinking overlooks the diversity of capitalist arrangements that still prevails among advanced economies, despite the considerable homogenisation of our policy discourse.

What, after all, are western institutions? The size of the public sector in OECD countries varies, from a third of the economy in Korea to nearly 60% in Finland. In Iceland, 86% of workers are members of a trade union; the comparable number in Switzerland is just 16%. In the US, firms can fire workers almost at will; French labour laws have historically required employers to jump through many hoops first. Stock markets have grown to a total value of nearly one-and-a-half times GDP in the US; in Germany, they are only a third as large, equivalent to just 50% of GDP.


‘China turned to markets, but did not copy western practices ... ’ Photograph: AFP/Getty

The idea that any one of these models of taxation, labour relations or financial organisation is inherently superior to the others is belied by the varying economic fortunes that each of these economies has experienced over recent decades. The US has gone through successive periods of angst in which its economic institutions were judged inferior to those in Germany, Japan, China, and now possibly Germany again. Certainly, comparable levels of wealth and productivity can be produced under very different models of capitalism. We might even go a step further: today’s prevailing models probably come nowhere near exhausting the range of what might be possible, and desirable, in the future.

The visiting economist in our thought experiment knows all this, and recognises that the principles he has enunciated need to be filled in with institutional detail before they become operational. Property rights? Yes, but how? Sound money? Of course, but how? It would perhaps be easier to criticise his list of principles for being vacuous than to denounce it as a neoliberal screed.

Still, these principles are not entirely content-free. China, and indeed all countries that managed to develop rapidly, demonstrate the utility of those principles once they are properly adapted to local context. Conversely, too many economies have been driven to ruin courtesy of political leaders who chose to violate them. We need look no further than Latin American populists or eastern European communist regimes to appreciate the practical significance of sound money, fiscal sustainability and private incentives.

Of course, economics goes beyond a list of abstract, largely common-sense principles. Much of the work of economists consists of developing stylised models of how economies work and then confronting those models with evidence. Economists tend to think of what they do as progressively refining their understanding of the world: their models are supposed to get better and better as they are tested and revised over time. But progress in economics happens differently.

Economists study a social reality that is unlike the physical universe. It is completely manmade, highly malleable and operates according to different rules across time and space. Economics advances not by settling on the single right model or theory of that reality, but by improving our understanding of the diversity of causal relationships. Neoliberalism and its customary remedies – always more markets, always less government – are in fact a perversion of mainstream economics. Good economists know that the correct answer to any question in economics is: it depends.

Does an increase in the minimum wage depress employment? Yes, if the labour market is really competitive and employers have no control over the wage they must pay to attract workers; but not necessarily otherwise. Does trade liberalisation increase economic growth? Yes, if it increases the profitability of industries where the bulk of investment and innovation takes place; but not otherwise. Does more government spending increase employment? Yes, if there is slack in the economy and wages do not rise; but not otherwise. Does monopoly harm innovation? Yes and no, depending on a whole host of market circumstances.
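The minimum-wage answer can be made concrete with the textbook monopsony case. The sketch below is illustrative only: the linear labour-supply curve, the constant marginal revenue product and all the numbers are invented assumptions, not figures from the article.

```python
def labour_supply(w, a=10.0, b=2.0):
    """Number of workers willing to work at wage w (assumed linear)."""
    return a + b * w

def monopsony_outcome(p=20.0, a=10.0, b=2.0):
    """Wage and employment chosen by a single employer with no wage floor.

    Inverse supply is w(L) = (L - a) / b, so the total wage bill is
    w(L) * L and the marginal cost of one more hire is (2L - a) / b.
    The firm hires until that marginal cost equals the (constant)
    marginal revenue product p.
    """
    L = (a + b * p) / 2.0
    w = (L - a) / b
    return w, L

def outcome_with_floor(w_min, p=20.0, a=10.0, b=2.0):
    """Outcome when a minimum wage between the monopsony wage and p binds.

    The floor flattens the marginal cost of labour at w_min, which is
    still below p, so the firm hires everyone willing to work at w_min.
    """
    return w_min, labour_supply(w_min, a, b)

w_m, L_m = monopsony_outcome()       # monopsony wage and employment
w_f, L_f = outcome_with_floor(15.0)  # with a binding wage floor
```

With these invented numbers the floor raises both the wage (from 7.5 to 15) and employment (from 25 to 40 workers). That is the "not necessarily otherwise" in the passage: when the employer has wage-setting power, a minimum wage need not depress employment.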


‘Today [neoliberalism] is routinely reviled as a shorthand for the ideas that have produced growing economic inequality and precipitated our current populist backlash’ … Trump signing an order to take the US out of the TPP trade pact. Photograph: AFP/Getty

In economics, new models rarely supplant older models. The basic competitive-markets model dating back to Adam Smith has been modified over time by the inclusion, in rough historical order, of monopoly, externalities, scale economies, incomplete and asymmetric information, irrational behaviour and many other real-world features. But the older models remain as useful as ever. Understanding how real markets operate necessitates using different lenses at different times.

Perhaps maps offer the best analogy. Just like economic models, maps are highly stylised representations of reality. They are useful precisely because they abstract from many real-world details that would get in the way. But abstraction also implies that we need a different map depending on the nature of our journey. If we are travelling by bike, we need a map of bike trails. If we are to go on foot, we need a map of footpaths. If a new subway is constructed, we will need a subway map – but we wouldn’t throw out the older maps.

Economists tend to be very good at making maps, but not good enough at choosing the one most suited to the task at hand. When confronted with policy questions of the type our visiting economist faces, too many of them resort to “benchmark” models that favour the laissez-faire approach. Kneejerk solutions and hubris replace the richness and humility of the discussion in the seminar room. John Maynard Keynes once defined economics as the “science of thinking in terms of models, joined to the art of choosing models which are relevant”. Economists typically have trouble with the “art” part.

This, too, can be illustrated with a parable. A journalist calls an economics professor for his view on whether free trade is a good idea. The professor responds enthusiastically in the affirmative. The journalist then goes undercover as a student in the professor’s advanced graduate seminar on international trade. He poses the same question: is free trade good? This time the professor is stymied. “What do you mean by ‘good’?” he responds. “And good for whom?” The professor then launches into an extensive exegesis that will ultimately culminate in a heavily hedged statement: “So if the long list of conditions I have just described is satisfied, and assuming we can tax the beneficiaries to compensate the losers, freer trade has the potential to increase everyone’s wellbeing.” If he is in an expansive mood, the professor might add that the effect of free trade on an economy’s long-term growth rate is not clear either, and would depend on an altogether different set of requirements.

This professor is rather different from the one the journalist encountered previously. On the record, he exudes self-confidence, not reticence, about the appropriate policy. There is one and only one model, at least as far as the public conversation is concerned, and there is a single correct answer, regardless of context. Strangely, the professor deems the knowledge that he imparts to his advanced students to be inappropriate (or dangerous) for the general public. Why?

The roots of such behaviour lie deep in the culture of the economics profession. But one important motive is the zeal to display the profession’s crown jewels – market efficiency, the invisible hand, comparative advantage – in untarnished form, and to shield them from attack by self-interested barbarians, namely the protectionists. Unfortunately, these economists typically ignore the barbarians on the other side of the issue – financiers and multinational corporations whose motives are no purer and who are all too ready to hijack these ideas for their own benefit.

As a result, economists’ contributions to public debate are often biased in one direction, in favour of more trade, more finance and less government. That is why economists have developed a reputation as cheerleaders for neoliberalism, even if mainstream economics is very far from a paean to laissez-faire. The economists who let their enthusiasm for free markets run wild are in fact not being true to their own discipline.

How then should we think about globalisation in order to liberate it from the grip of neoliberal practices? We must begin by understanding the positive potential of global markets. Access to world markets in goods, technologies and capital has played an important role in virtually all of the economic miracles of our time. China is the most recent and powerful reminder of this historical truth, but it is not the only case. Before China, similar miracles were performed by South Korea, Taiwan, Japan and a few non-Asian countries such as Mauritius. All of these countries embraced globalisation rather than turn their backs on it, and they benefited handsomely.

Defenders of the existing economic order will quickly point to these examples when globalisation comes into question. What they will fail to say is that almost all of these countries joined the world economy by violating neoliberal strictures. South Korea and Taiwan, for instance, heavily subsidised their exporters, the former through the financial system and the latter through tax incentives. All of them eventually removed most of their import restrictions, long after economic growth had taken off.

But none, with the sole exception of Chile in the 1980s under Pinochet, followed the neoliberal recommendation of a rapid opening-up to imports. Chile’s neoliberal experiment eventually produced the worst economic crisis in all of Latin America. While the details differ across countries, in all cases governments played an active role in restructuring the economy and buffering it against a volatile external environment. Industrial policies, restrictions on capital flows and currency controls – all prohibited in the neoliberal playbook – were rampant.


Protest against Nafta in Mexico City in 2008: since the reforms of the mid-90s, the country’s economy has underperformed. Photograph: EPA

By contrast, countries that stuck closest to the neoliberal model of globalisation were sorely disappointed. Mexico provides a particularly sad example. Following a series of macroeconomic crises in the mid-1990s, Mexico embraced macroeconomic orthodoxy, extensively liberalised its economy, freed up the financial system, sharply reduced import restrictions and signed the North American Free Trade Agreement (Nafta). These policies did produce macroeconomic stability and a significant rise in foreign trade and internal investment. But where it counts – in overall productivity and economic growth – the experiment failed. Since undertaking the reforms, overall productivity in Mexico has stagnated, and the economy has underperformed even by the undemanding standards of Latin America.

These outcomes are not a surprise from the perspective of sound economics. They are yet another manifestation of the need for economic policies to be attuned to the failures to which markets are prone, and to be tailored to the specific circumstances of each country. No single blueprint fits all.

As Peters’s 1982 manifesto attests, the meaning of neoliberalism has changed considerably over time as the label has acquired harder-line connotations with respect to deregulation, financialisation and globalisation. But there is one thread that connects all versions of neoliberalism, and that is the emphasis on economic growth. Peters wrote in 1982 that the emphasis was warranted because growth is essential to all our social and political ends – community, democracy, prosperity. Entrepreneurship, private investment and removing obstacles that stand in the way (such as excessive regulation) were all instruments for achieving economic growth. If a similar neoliberal manifesto were penned today, it would no doubt make the same point.

Critics often point out that this emphasis on economics debases and sacrifices other important values such as equality, social inclusion, democratic deliberation and justice. Those political and social objectives obviously matter enormously, and in some contexts they matter the most. They cannot always, or even often, be achieved by means of technocratic economic policies; politics must play a central role.

Still, neoliberals are not wrong when they argue that our most cherished ideals are more likely to be attained when our economy is vibrant, strong and growing. Where they are wrong is in believing that there is a unique and universal recipe for improving economic performance, to which they have access. The fatal flaw of neoliberalism is that it does not even get the economics right. It must be rejected on its own terms for the simple reason that it is bad economics.

Monday, 13 November 2017

Africa has been failed by westernisation. It must cast off its subservience

Chigozie Obioma in The Guardian

One of the greatest ironies in the history of the collapse of any civilisation must be the initial interaction between Africans and Europeans. The Igbos in the east of Nigeria, for instance, initially saw the Europeans as madmen of strange appearance and ill-formed ideologies. On banking, the Igbos wondered how an adult in his right mind could hand over his possessions for others to keep for him. By the end of the 19th century, the “madman” had overturned their civilisation, and they had adopted his.

The irony is especially relevant in these times when, given the relative failures of most former western colonies, there have been renewed calls for recolonisation. In September, American professor Bruce Gilley wrote an essay arguing for the recolonisation of some states, replicating colonial governance of the past “as far as possible” and even building new colonies from scratch.

If the very foundations of his arguments are flawed, it is because he, like most people today, has come to accept that the only metric for measuring modernity is through the western lens. This is the heart of the problem.

Colonialism across most of Africa was so thorough – especially among the former British protectorates – that in its aftermath Africa was essentially hollowed out. The civilisations of the peoples, their various cultures and traditions, their religions, political philosophies and institutions, were eroded or even destroyed.

Today most of the nations in Africa should not even be called African nations, but western African nations. The language, political ideology, socio-economic structures, education, and everything that makes up a nation, even down to popular culture, do not originate from within these countries. African nations have a total dependency on foreign political philosophies and ideas, and their shifts and movements.

It is the feeblest position a state and its people can be in, because it is a position of chronic subservience. It also means that whatever becomes normalised in the west will eventually be adopted in, say, Uganda or Togo. 

This has resulted in Africa being slowly emptied of its essence, and becoming a relic, no different in substance from a statue or a museum.

Celebrations of Africa on the international scene mostly involve dancing, music, traditional fashion and other cultural artefacts – hardly ever showcasing African-originated economic ideas, social ideologies or intellectual theories. It is not that these do not exist, but the world has successfully convinced everyone – including Africans themselves – that everything African is inferior.

Central to this psychology is the proliferation of Africans being educated in the west. This trend has resulted in the rise of an army of western-influenced elites who continue the colonialism of their own people.

Imagine what can happen when an African nation with a high unemployment rate imbibes a gun culture. Consider the potential danger of a situation in Nigeria, where the Hausa man insists his culture is being appropriated by the Yoruba. Or the Christian Igbo embracing their identity, recruiting allies, and ostracising anyone who will not acquiesce in their cause.

But this is becoming Africa’s reality. Increasingly, our elites tell us that the way of the west is “modern” and “civilised”, echoing the early colonialists who dismissed our civilisations as “barbaric”, “archaic”, and “uncivilised” to install theirs. They tell us that our institutions are corrupt, that our societies are patriarchal, and that the African traditional religions are heathenish. As western supremacy entrenches itself in our psyche, we are developing a complex that embraces western ideas without considering whether or not they are compatible with our own political, social, economic and cultural system.

Although Americans may be rightly calling for “diversity”, given a history that excluded a major demographic group, black people, Nigeria’s struggle from inception has been how to unify its enormous diversity. It was the lack of that unity that resulted in the civil war of the late 1960s. This is the same for Angola, Rwanda and Uganda, to name just a few.

But this is of little concern to Africa’s elites. What matters is to find what the political currency is in the US or Europe, and to uncritically follow it. Whereas people in the west are de-emphasising patriotism and nationalism, Africans need these to build sustainable nations.



‘The most viable pathway would be for Africa’s elite to look within the vast political and ideological resources on which successful civilisations were built.’ Timbuktu in Mali. Photograph: Sean Smith for the Guardian

In fact, the lack of them, in favour of ethnic allegiance, has been the bane of most African nations, from Congo to Somalia: the result of the Berlin Conference of 1884, in which European leaders divvied up African territories among themselves, ignoring traditional ethnic borders.

In making a case about east Asia and citing the success of Singapore, Taiwan, Hong Kong and others, Chinese philosopher Zou Shipeng argued that the west’s claim to its culture as the only pathway to modernity creates unfair hegemony. Is it possible, he asked, to achieve modernity solely through Chinese culture?

The Middle Eastern nations are another example of cultures that have accepted material modernity but have not been westernised ideologically. They have retained their political systems which, given their theocratic cultural framework, seem best suited for these countries. Every time western nations have tried to disrupt those systems and install a western-style democracy, it has failed.

This was also the case with the first two centuries of European contact with west Africans. The Portuguese and the Dutch traded with many west African tribes from the mid-15th century without colonising them. For 200 years, the Igbos had “Dane guns”, mirrors and gins, among other goods, but held on to their own traditions and cultures. It could therefore be argued that the “modernism” orchestrated by western colonialism isn’t organic to the Africans. Yet proponents of recolonisation and the African elites fail to see this.

With the sudden, unexpected rise in rightwing populism across the west, it is challenging to decide what a viable future may look like. One would think African nations would take this opportunity to think for themselves, to come up with unique African systems.

However, rather than do this, the African elite class largely insists that Africa is not western enough, and is trying to drag the continent, still grappling with western modernism, into the west’s evolving postmodernist regime.

The most viable pathway would be for Africa’s elite to look within the vast political and ideological resources on which successful civilisations (the Zulu, the Igbo, the Malian dynasties of Timbuktu, the Oyo empire, etc) were built. In most Igbo states, for instance, there was an egalitarian system where an older member of a clan represented his people in the elders’ council. There were no kings or presidents. Perhaps there could be a way to adapt this unique political structure to replace the western one which has so far failed.

We need to look into these systems and extract coherent policies that can help form workable and uniquely African social and political systems. This is the only viable path to preventing the continent from fully becoming western Africa – and the only way to end the continent’s long-term political decay.

For many, free movement causes more pain – and Brexit seems to be the cure

Deborah Orr in The Guardian


The last 16 months have made one thing clear: it’s much easier to vote to leave the EU than it is to actually leave. Remainers such as myself now find it tempting to say: “I told you so.” This, broadly speaking, is because we’re a bunch of smug know-it-alls, who haven’t even properly asked ourselves why we failed so badly to get our point across last June.

The answer, of course, is that we were and are too busy being smug know-it-alls: certain before the referendum that the idealism of the EU was plain for everyone except the terminally thick and racist to see; and certain afterwards that surely at some point even the terminally thick and racist will start having buyers’ remorse.

The sheer tragicomedy of EU-UK negotiations is indeed getting some people so fed up with the whole farrago that a few Brexiteers are crossing the floor. But, mostly, people view the difficulty of leaving the EU as yet more proof that it’s a money-grabbing, navel-gazing, inert and self-serving bureaucracy, as respectful of democracy as Kim Jong-un and as responsive to the needs of actual people as a gigantic mudslide. An in-depth survey of Brexiteers in Wales last month confirmed pretty much exactly that.

Politicians do understand, on the whole, that the factor above all others that motivates white working-class Brexit voters is free movement, as again the Welsh survey attests. This is why Labour in particular is hamstrung. Backing remain would please its Guardian-reading supporters. But that would alienate many of its core voters. Whatever Jeremy Corbyn’s own views about the EU, the sensible strategy for the short-term is not to seem at all remain-oriented.

Short-term being the operative word. The big trouble with the idealism of free movement is that its intellectual underpinnings demand pain now for future gain. The idea is that people will crisscross the various member countries, working where there’s work to create economic growth, returning home with money, experience and ideas, to start businesses that will attract others in turn, until every country is as prosperous as its neighbour.

This transformation, if it happens, will take generations. But the architects of this grand plan – the experts, the economists, the “elite” – are not the people who feel any short-term pain.

It’s completely unrealistic to ask people to spend their lives wondering where the money is coming from to pay the next electricity bill, whether their children will ever get out of their expensive private accommodation, and whether their grandchildren will be on zero-hours contracts forever, all so that maybe 80 years from now the average living standard of a Lithuanian will be similar to that of a Welshman.

People need their lives to improve now, not to live in stress and worry because things might work out in the future. The theoretical utopians who support the EU are not those who are expected to feel solidarity with their Polish colleagues in the salad-bagging factory. Especially when those colleagues are working towards a different goal. It’s easier to work for low wages if these wages are higher than you would be getting back home; easier to save when you know that a deposit on a house back home with your family is an achievable goal; easier to go without when you know that it’s for a finite time.

Where in the EU do young, unskilled British people head to get such a start in life? Reciprocity doesn’t exist.

In the 1980s, builders went to Germany, as dramatised in Auf Wiedersehen, Pet. In Germany today, builders come from eastern Europe. Wealthy countries in the EU are rightly expected to be generous. But when your own country has not generously shared its wealth with you, it’s hard to accept that you’re the ones expected to carry the burden in this grand new wealth-sharing concept.

In his book Austerity Britain, historian David Kynaston quoted evidence from the Mass Observation project that the people who lived in the areas most devastated by the war were far less likely to be optimistic about the future than those who had got off lightly. Part of their ennui was the knowledge that change had been promised after the first world war, yet hadn’t come about.

The same goes for the areas that were economically devastated in the early stages of globalisation. The EU didn’t save them then; it isn’t saving them now. No amount of promises that the EU is the best hope of shelter from economic change in the future will persuade enough of the hard-up Brexiteers in that 52% vote.

If progressives want to change the minds of Brexiteers, waiting for them to see the error of their ways isn’t going to work. What people need is a quid pro quo that offers them tangible improvements in their lives right now. That, and only that, will keep Britain in the EU.

Sunday, 12 November 2017

Nudging can also be used for dark purposes

Tim Harford in The FT

“If you want people to do the right thing, make it easy.” That is the simplest possible summary of Nudge by Cass Sunstein and Richard Thaler. We are all fallible creatures, and so benevolent policymakers need to make sure that the path of least resistance goes to a happy destination. It is a simple but important idea, and deservedly influential: Mr Sunstein became a senior adviser to President Obama, while Mr Thaler is this year’s winner of the Nobel memorial prize in economics. 

Policy wonks have nudged people to sign up for organ donation, to increase their pension contributions — and even to insulate their homes by coupling home insulation with an attic-decluttering service. All we have to do is make it easy for people to do the right thing. 

But what if you want people to do the wrong thing? The answer: make that easy; or make the right thing difficult. Messrs Thaler and Sunstein are well aware of the risk of malign nudges, and have been searching for the right word to describe them. Mr Thaler likes “sludge” — obfuscatory language or procedures that accidentally or deliberately encourage inertia. Voter ID laws, he says, are a good example of sludge, calculated to softly disenfranchise. Meanwhile Mr Sunstein has written an entire book about the “ethics of influence”. 

And as we are starting to realise, Vladimir Putin is well aware of the opportunity that behavioural science presents, too. Rumours circulate that the Russian authorities are keen recruiters of young psychologists and behavioural economists; I have no proof of that, but it seems like a reasonable thing for the Russian government to do. I am willing to bet that not all of them are working on attic-decluttering. 

According to Richard Burr, chair of the US Senate intelligence committee, Russian troll accounts on Facebook managed to organise both a protest and a counter-protest in Houston, in May 2016. Americans are perfectly willing to face off against each other on the streets, but if you want it to happen more often, make it easy. 

A number of other memes, political advertisements and provocateur accounts — both left and rightwing — have since been identified as of Russian origin. Social media networks have unwittingly sold them air time; news sites have cited them; people have shared them, or spent effort refuting them. Nudge isn’t the word for this, but neither is sludge. What about “grudge”? 

The Russians are not alone in using grudge theory to manipulate public opinion. Three social scientists — Gary King, Jennifer Pan and Margaret Roberts — recently managed to infiltrate networks of shills in China, who are paid to post helpful messages on Chinese social media. (Their nickname is the “50 cent army”.) Unlike the Russian trolls, their aim has been to avoid engaging “in debate or argument of any kind . . . they seem to avoid controversial issues entirely”. The tactic is, rather, to keep changing the subject, especially at politically sensitive moments, by talking about the weather, sports — anything. If you want potential protesters to make cheery small talk instead, make it easy. 

Just as noble tools can be turned to wicked ends, so shady techniques can be used to do the work of the angels. For example, why not disrupt online markets for illegal drugs by leaving bad reviews for vendors? Research by social scientists Scott Duxbury and Dana Haynie suggests that because people rely on user reviews on illicit markets, law enforcement officers could attack those markets by faking negative reviews, thus undermining trust. 

The parallel with Mr Putin is alarmingly clear: it is possible to attack democracy and rational discourse by creating an information ecosystem where everyone yells at everyone else and nobody believes anything. 

But we should not give too much credit to Mr Putin. He did not create the information ecosystem of the western world; we did. The Russians just gave us a push, and probably not a very big push at that. Perhaps I should say they gave us a nudge. 

Social media do seem vulnerable to dark nudges from foreign powers. But more worrying is our vulnerability to smears, skews and superficiality without any outside intervention at all. Messrs Sunstein and Thaler ask policymakers to make it easy to do the right thing; what have we made it easy to do? 

It is easy to find a like-minded tribe. It is easy to share, retweet or “like” something we have not even read. It is easy to repeat false claims. It is easy to get angry or personal. 

It’s less easy to distinguish truth from lies, to clear time and attention to read something deep, and to reward an important article with something more than a digital thumbs up. But then, none of this is fundamental to the business model of many media companies — or of the social media networks that spread the news. 

Nudge, sludge or grudge, we can change this. And we should start by asking ourselves whether, when it comes to news, information and debate, we have made it difficult to do the right thing — and all too easy to stray.

Outrage and opposition are not the same

Tabish Khair in The Hindu



I cannot say this online, I am sure, but I do not believe in getting publicly outraged. This does not mean that I do not feel privately outraged at times. I do. When one hears of a woman being raped, one feels outraged. When one hears of the most powerful man on earth reportedly discussing nuclear war options, one feels outraged.

And yet, it is one thing to feel outraged and another to act from outrage or even cultivate that self-righteous feeling of outrage. Because outrage is not opposition. Actually, it is not even rage. It is an ‘outing’ of rage.

Rage versus outrage

Rage is a problematic word: its etymology connects it to madness, violence, passion and fierceness in battle. Its uses, if they can be justified, are hazardous, and pertain to extreme circumstances. In Greek mythology, the consequences of human or semi-divine rage tend to be disastrous, even when the act of rage is seen as justified. However, rage has one purpose in extreme circumstances: it can get things done.

Outrage is not like rage: it is a venting of rage. When we are outraged, we basically let off steam. This is more so online. Its primary purpose is to make us feel good about ourselves. Unlike rage, it might not even get anything done. Because once we get outraged and post a few things or espouse a list, our attention wavers, and soon we have another matter to get publicly outraged about.

Like rage, outrage often leads to hasty action. In India as well as in Europe, people got outraged at the rumour of some women putting spells on their cattle or their person, and proceeded to burn the women as witches. Racists in the American south are known to have become outraged at some real or imagined slight by African Americans and lynched them. The list of innocent people persecuted, killed, burned, or lynched because otherwise decent people got publicly outraged is pretty long.

Unfortunately, outrage is particularly adaptable to online culture, where the dominant ethos is that of self-indulgence rather than an engagement with the other. By getting outraged, we signal to ourselves and others that we have the right views. We might also, by the very level of our outrage, absolve ourselves from a close examination of the matter and an organised effort (with others) to tackle the matter. Outrages tend to lead to nothing at all — or to witch-hunts.

By moving on from one outrage to another, we might also make it more difficult to address the root causes of the injustice, if it exists, behind our outrage. Outrage is expressive, reactive, wordy, fleeting. Opposition requires physical action, thought, organisation and perseverance. It is a major mistake to confuse the two.

Opposition needs a considered evaluation of evidence and possibilities; outrage tends towards self-centred and sweeping pre-judgment, usually passed without deep thought to the matter or comprehensive collection of evidence. It is worth remarking that ‘prejudice’ basically means ‘prejudgment,’ from the Latin words prae and judicium.

The general flow of outrage is towards a kind of fascist violence: it assumes guilt unless the victim is proved innocent, and moves too fast for sufficient proof to be collected. Opposition is a democratic construct: it accepts that you are innocent unless proved guilty.

Dismissing opposition

Unfortunately, given our hyperventilating cybercultures, outrage has become synonymous with opposition. Apart from the problems outlined above, this has another serious drawback: in an atmosphere of frequent outrages, it is possible to dismiss legitimate opposition as outrage. This, as we know from places like India, Turkey and the U.S., is the usual policy of the parties in power.

Because all opposition is increasingly wrapped in verbal and digital forms of outrage, this is easy for people in power to do. Online postings, TV shows, etc. consistently assume the registers and pace of outrages, so that the pith of the matter is often lost in the smoke, and even necessary acts of opposition can be dismissed as just the hyperventilation of easily outraged groups.

It is sad that this has happened even in India, where Gandhiji set a very rigorous example of calm and collected opposition, even, I would say, a slow and forbearing opposition. He knew that any true opposition — he would have called it a just opposition — needs thought, time, slowness and perseverance. These are not characteristics that outrage respects.

I find it troublesome that we have entered a phase of public discourse where, on the one hand, outrages erupt one after another and then evaporate in the desert sands of usual practice, and where, on the other hand, genuine acts of opposition are dismissed by people in power as just fleeting outrages.

On the one side, there are people yelling at us to be outraged, without considering evidence, context or effective responses, and on the other side, there are people telling us that we are just acting outraged when actually we are opposing something that needs to be opposed. How does one negotiate a public space like that? Your answer is as good as mine. But I think slowing down just a bit before passing judgment and looking more deeply at matters might not be such bad ideas.

Saturday, 11 November 2017

How colonial violence came home: the ugly truth of the first world war

Pankaj Mishra in The Guardian



“Today on the Western Front,” the German sociologist Max Weber wrote in September 1917, there “stands a dross of African and Asiatic savages and all the world’s rabble of thieves and lumpens.” Weber was referring to the millions of Indian, African, Arab, Chinese and Vietnamese soldiers and labourers, who were then fighting with British and French forces in Europe, as well as in several ancillary theatres of the first world war.

Faced with manpower shortages, British imperialists had recruited up to 1.4 million Indian soldiers. France enlisted nearly 500,000 troops from its colonies in Africa and Indochina. Nearly 400,000 African Americans were also inducted into US forces. The first world war’s truly unknown soldiers are these non-white combatants. 

Ho Chi Minh, who spent much of the war in Europe, denounced what he saw as the press-ganging of subordinate peoples. Before the start of the Great War, Ho wrote, they were seen as “nothing but dirty Negroes … good for no more than pulling rickshaws”. But when Europe’s slaughter machines needed “human fodder”, they were called into service. Other anti-imperialists, such as Mohandas Gandhi and WEB Du Bois, vigorously supported the war aims of their white overlords, hoping to secure dignity for their compatriots in the aftermath. But they did not realise what Weber’s remarks revealed: that Europeans had quickly come to fear and hate physical proximity to their non-white subjects – their “new-caught sullen peoples”, as Kipling called colonised Asians and Africans in his 1899 poem The White Man’s Burden.

These colonial subjects remain marginal in popular histories of the war. They also go largely uncommemorated by the hallowed rituals of Remembrance Day. The ceremonial walk to the Cenotaph at Whitehall by all major British dignitaries, the two minutes of silence broken by the Last Post, the laying of poppy wreaths and the singing of the national anthem – all of these uphold the first world war as Europe’s stupendous act of self-harm. For the past century, the war has been remembered as a great rupture in modern western civilisation, an inexplicable catastrophe that highly civilised European powers sleepwalked into after the “long peace” of the 19th century – a catastrophe whose unresolved issues provoked yet another calamitous conflict between liberal democracy and authoritarianism, in which the former finally triumphed, returning Europe to its proper equilibrium.

With more than eight million dead and more than 21 million wounded, the war was the bloodiest in European history until that second conflagration on the continent ended in 1945. War memorials in Europe’s remotest villages, as well as the cemeteries of Verdun, the Marne, Passchendaele, and the Somme enshrine a heartbreakingly extensive experience of bereavement. In many books and films, the prewar years appear as an age of prosperity and contentment in Europe, with the summer of 1913 featuring as the last golden summer.

But today, as racism and xenophobia return to the centre of western politics, it is time to remember that the background to the first world war was decades of racist imperialism whose consequences still endure. It is something that is not remembered much, if at all, on Remembrance Day.

At the time of the first world war, all western powers upheld a racial hierarchy built around a shared project of territorial expansion. In 1917, the US president, Woodrow Wilson, baldly stated his intention, “to keep the white race strong against the yellow” and to preserve “white civilisation and its domination of the planet”. Eugenicist ideas of racial selection were everywhere in the mainstream, and the anxiety expressed in papers like the Daily Mail, which worried about white women coming into contact with “natives who are worse than brutes when their passions are aroused”, was widely shared across the west. Anti-miscegenation laws existed in most US states. In the years leading up to 1914, prohibitions on sexual relations between European women and black men (though not between European men and African women) were enforced across European colonies in Africa. The presence of the “dirty Negroes” in Europe after 1914 seemed to be violating a firm taboo.

Injured Indian soldiers being cared for by the Red Cross in England in March 1915. Photograph: De Agostini Picture Library/Biblioteca Ambrosiana

In May 1915, a scandal erupted when the Daily Mail printed a photograph of a British nurse standing behind a wounded Indian soldier. Army officials tried to withdraw white nurses from hospitals treating Indians, and disbarred the latter from leaving the hospital premises without a white male companion. The outrage when France deployed soldiers from Africa (a majority of them from the Maghreb) in its postwar occupation of Germany was particularly intense and more widespread. Germany had also fielded thousands of African soldiers while trying to hold on to its colonies in east Africa, but it had not used them in Europe, or indulged in what the German foreign minister (and former governor of Samoa), Wilhelm Solf, called “racially shameful use of coloureds”.

“These savages are a terrible danger,” a joint declaration of the German national assembly warned in 1920, to “German women”. Writing Mein Kampf in the 1920s, Adolf Hitler would describe African soldiers on German soil as a Jewish conspiracy aimed at toppling white people “from their cultural and political heights”. The Nazis, who were inspired by American innovations in racial hygiene, would in 1937 forcibly sterilise hundreds of children fathered by African soldiers. Fear and hatred of armed “niggers” (as Weber called them) on German soil was not confined to Germany, or the political right. The pope protested against their presence, and an editorial in the Daily Herald, a British socialist newspaper, in 1920 was titled “Black Scourge in Europe”.

This was the prevailing global racial order, built around an exclusionary notion of whiteness and buttressed by imperialism, pseudo-science and the ideology of social Darwinism. In our own time, the steady erosion of the inherited privileges of race has destabilised western identities and institutions – and it has unveiled racism as an enduringly potent political force, empowering volatile demagogues in the heart of the modern west.

Today, as white supremacists feverishly build transnational alliances, it becomes imperative to ask, as Du Bois did in 1910: “What is whiteness that one should so desire it?” As we remember the first global war, it must be remembered against the background of a project of western global domination – one that was shared by all of the war’s major antagonists. The first world war, in fact, marked the moment when the violent legacies of imperialism in Asia and Africa returned home, exploding into self-destructive carnage in Europe. And it seems ominously significant on this particular Remembrance Day: the potential for large-scale mayhem in the west today is greater than at any other time in its long peace since 1945.

When historians discuss the origins of the Great War, they usually focus on rigid alliances, military timetables, imperialist rivalries, arms races and German militarism. The war, they repeatedly tell us, was the seminal calamity of the 20th century – Europe’s original sin, which enabled even bigger eruptions of savagery such as the second world war and the Holocaust. An extensive literature on the war, literally tens of thousands of books and scholarly articles, largely dwells on the western front and the impact of the mutual butchery on Britain, France, and Germany – and significantly, on the metropolitan cores of these imperial powers rather than their peripheries. In this orthodox narrative, which is punctuated by the Russian Revolution and the Balfour declaration in 1917, the war begins with the “guns of August” in 1914, and exultantly patriotic crowds across Europe send soldiers off to a bloody stalemate in the trenches. Peace arrives with the Armistice of 11 November 1918, only to be tragically compromised by the Treaty of Versailles in 1919, which sets the stage for another world war.

In one predominant but highly ideological version of European history – popularised since the cold war – the world wars, together with fascism and communism, are simply monstrous aberrations in the universal advance of liberal democracy and freedom. In many ways, however, it is the decades after 1945 – when Europe, deprived of its colonies, emerged from the ruins of two cataclysmic wars – that increasingly seem exceptional. Amid a general exhaustion with militant and collectivist ideologies in western Europe, the virtues of democracy – above all, the respect for individual liberties – seemed clear. The practical advantages of a reworked social contract, and a welfare state, were also obvious. But neither these decades of relative stability, nor the collapse of communist regimes in 1989, were a reason to assume that human rights and democracy were rooted in European soil.

Instead of remembering the first world war in a way that flatters our contemporary prejudices, we should recall what Hannah Arendt pointed out in The Origins of Totalitarianism – one of the west’s first major reckonings with Europe’s grievous 20th-century experience of wars, racism and genocide. Arendt observes that it was Europeans who initially reordered “humanity into master and slave races” during their conquest and exploitation of much of Asia, Africa and America. This debasing hierarchy of races was established because the promise of equality and liberty at home required imperial expansion abroad in order to be even partially fulfilled. We tend to forget that imperialism, with its promise of land, food and raw materials, was widely seen in the late 19th century as crucial to national progress and prosperity. Racism was – and is – more than an ugly prejudice, something to be eradicated through legal and social proscription. It involved real attempts to solve, through exclusion and degradation, the problems of establishing political order, and pacifying the disaffected, in societies roiled by rapid social and economic change.

Senegalese soldiers serving in the French army on the western front in June 1917. Photograph: Galerie Bilderwelt/Getty Images

In the early 20th century, the popularity of social Darwinism had created a consensus that nations should be seen similarly to biological organisms, which risked extinction or decay if they failed to expel alien bodies and achieve “living space” for their own citizens. Pseudo-scientific theories of biological difference between races posited a world in which all races were engaged in an international struggle for wealth and power. Whiteness became “the new religion”, as Du Bois witnessed, offering security amid disorienting economic and technological shifts, and a promise of power and authority over a majority of the human population.

The resurgence of these supremacist views today in the west – alongside the far more widespread stigmatisation of entire populations as culturally incompatible with white western peoples – should suggest that the first world war was not, in fact, a profound rupture with Europe’s own history. Rather it was, as Liang Qichao, China’s foremost modern intellectual, was already insisting in 1918, a “mediating passage that connects the past and the future”.

The liturgies of Remembrance Day, and evocations of the beautiful long summer of 1913, deny both the grim reality that preceded the war and the way it has persisted into the 21st century. Our complex task during the war’s centenary is to identify the ways in which that past has infiltrated our present, and how it threatens to shape the future: how the terminal weakening of white civilisation’s domination, and the assertiveness of previously sullen peoples, has released some very old tendencies and traits in the west.

Nearly a century after the first world war ended, the experiences and perspectives of its non-European actors and observers remain largely obscure. Most accounts of the war uphold it as an essentially European affair: one in which the continent’s long peace is shattered by four years of carnage, and a long tradition of western rationalism is perverted.

Relatively little is known about how the war accelerated political struggles across Asia and Africa; how Arab and Turkish nationalists, Indian and Vietnamese anti-colonial activists found new opportunities in it; or how, while destroying old empires in Europe, the war turned Japan into a menacing imperialist power in Asia.

A broad account of the war that is attentive to political conflicts outside Europe can clarify today’s hyper-nationalism among many Asian and African ruling elites, most conspicuously the Chinese regime, which presents itself as the avenger of China’s century-long humiliation by the west.

Recent commemorations have made greater space for the non-European soldiers and battlefields of the first world war: altogether more than four million non-white men were mobilised into European and American armies, and fighting happened in places very remote from Europe – from Siberia and east Asia to the Middle East, sub-Saharan Africa, and even the South Pacific islands. In Mesopotamia, Indian soldiers formed a majority of Allied manpower throughout the war. Neither Britain’s occupation of Mesopotamia nor its successful campaign in Palestine would have occurred without Indian assistance. Sikh soldiers even helped the Japanese to evict Germans from their Chinese colony of Qingdao. 

Scholars have started to pay more attention to the nearly 140,000 Chinese and Vietnamese contract labourers hired by the British and French governments to maintain the war’s infrastructure, mostly digging trenches. We know more about how interwar Europe became host to a multitude of anticolonial movements; the east Asian expatriate community in Paris at one point included Zhou Enlai, later the premier of China, as well as Ho Chi Minh. Cruel mistreatment, in the form of segregation and slave labour, was the fate of many of these Asians and Africans in Europe. Deng Xiaoping, who arrived in France just after the war, later recalled “the humiliations” inflicted upon fellow Chinese by “the running dogs of capitalists”.

But in order to grasp the current homecoming of white supremacism in the west, we need an even deeper history – one that shows how whiteness became in the late 19th century the assurance of individual identity and dignity, as well as the basis of military and diplomatic alliances.

Such a history would show that the global racial order in the century preceding 1914 was one in which it was entirely natural for “uncivilised” peoples to be exterminated, terrorised, imprisoned, ostracised or radically re-engineered. Moreover, this entrenched system was not something incidental to the first world war, with no connections to the vicious way it was fought or to the brutalisation that made possible the horrors of the Holocaust. Rather, the extreme, lawless and often gratuitous violence of modern imperialism eventually boomeranged on its originators.

In this new history, Europe’s long peace is revealed as a time of unlimited wars in Asia, Africa and the Americas. These colonies emerge as the crucible where the sinister tactics of Europe’s brutal 20th-century wars – racial extermination, forced population transfers, contempt for civilian lives – were first forged. Contemporary historians of German colonialism (an expanding field of study) try to trace the Holocaust back to the mini-genocides Germans committed in their African colonies in the 1900s, where some key ideologies, such as Lebensraum, were also nurtured. But it is too easy to conclude, especially from an Anglo-American perspective, that Germany broke from the norms of civilisation to set a new standard of barbarity, strong-arming the rest of the world into an age of extremes. For there were deep continuities in the imperialist practices and racial assumptions of European and American powers.

Indeed, the mentalities of the western powers converged to a remarkable degree during the high noon of “whiteness” – what Du Bois, answering his own question about this highly desirable condition, memorably defined as “the ownership of the Earth for ever and ever”. For example, the German colonisation of south-west Africa, which was meant to solve the problem of overpopulation, was often assisted by the British, and all major western powers amicably sliced and shared the Chinese melon in the late 19th century. Any tensions that arose between those dividing the booty of Asia and Africa were defused largely peacefully, if at the expense of Asians and Africans.

Campaigners calling for the removal of a statue of British imperialist Cecil Rhodes (upper right) at Oriel College in Oxford. Photograph: Martin Godwin for the Guardian

This is because colonies had, by the late 19th century, come to be widely seen as indispensable relief-valves for domestic socio-economic pressures. Cecil Rhodes put the case for them with exemplary clarity in 1895 after an encounter with angry unemployed men in London’s East End. Imperialism, he declared, was a “solution for the social problem, ie in order to save the 40 million inhabitants of the United Kingdom from a bloody civil war, we colonial statesmen must acquire new lands to settle the surplus population, to provide new markets for the goods produced in the factories and mines”. In Rhodes’ view, “if you want to avoid civil war, you must become imperialists”.

Rhodes’ scramble for Africa’s gold fields helped trigger the second Boer war, during which the British, interning Afrikaner women and children, brought the term “concentration camp” into ordinary parlance. By the end of the war in 1902, it had become a “commonplace of history”, JA Hobson wrote, that “governments use national animosities, foreign wars and the glamour of empire-making in order to bemuse the popular mind and divert rising resentment against domestic abuses”.

With imperialism opening up a “panorama of vulgar pride and crude sensationalism”, ruling classes everywhere tried harder to “imperialise the nation”, as Arendt wrote. This project to “organise the nation for the looting of foreign territories and the permanent degradation of alien peoples” was quickly advanced through the newly established tabloid press. The Daily Mail, right from its inception in 1896, stoked vulgar pride in being white, British and superior to the brutish natives – just as it does today.

At the end of the war, Germany was stripped of its colonies and accused by the victorious imperial powers, entirely without irony, of ill-treating its natives in Africa. But such judgments, still made today to distinguish a “benign” British and American imperialism from the German, French, Dutch and Belgian versions, try to suppress the vigorous synergies of racist imperialism. Marlow, the narrator of Joseph Conrad’s Heart of Darkness (1899), is clear-sighted about them: “All Europe contributed to the making of Kurtz,” he says. And to the new-fangled modes of exterminating the brutes, he might have added.

In 1920, a year after condemning Germany for its crimes against Africans, the British devised aerial bombing as routine policy in their new Iraqi possession – the forerunner to today’s decade-long bombing and drone campaigns in west and south Asia. “The Arab and Kurd now know what real bombing means,” a 1924 report by a Royal Air Force officer put it. “They now know that within 45 minutes a full-sized village … can be practically wiped out and a third of its inhabitants killed or injured.” This officer was Arthur “Bomber” Harris, who in the second world war unleashed the firestorms of Hamburg and Dresden, and whose pioneering efforts in Iraq informed German theorising in the 1930s about der totale Krieg (total war).

It is often proposed that Europeans were indifferent to or absent-minded about their remote imperial possessions, and that only a few dyed-in-the-wool imperialists like Rhodes, Kipling and Lord Curzon cared enough about them. This makes racism seem like a minor problem that was aggravated by the arrival of Asian and African immigrants in post-1945 Europe. But the frenzy of jingoism with which Europe plunged into a bloodbath in 1914 speaks of a belligerent culture of imperial domination, a macho language of racial superiority, that had come to bolster national and individual self-esteem.

Italy actually joined Britain and France on the Allied side in 1915 in a fit of popular empire-mania (and promptly plunged into fascism after its imperialist cravings went unslaked). Italian writers and journalists, as well as politicians and businessmen, had lusted after imperial power and glory since the late 19th century. Italy had fervently scrambled for Africa, only to be ignominiously routed by Ethiopia in 1896. (Mussolini would avenge that in 1935 by dousing Ethiopians with poison gas.) In 1911, it saw an opportunity to detach Libya from the Ottoman empire. Coming after previous setbacks, its assault on the country, greenlighted by both Britain and France, was vicious and loudly cheered at home. News of the Italians’ atrocities, which included the first aerial bombing in history, radicalised many Muslims across Asia and Africa. But public opinion in Italy remained implacably behind the imperial gamble.

Germany’s own militarism, commonly blamed for causing Europe’s death spiral between 1914 and 1918, seems less extraordinary when we consider that from the 1880s, many Germans in politics, business and academia, and such powerful lobby groups as the Pan-German League (Max Weber was briefly a member), had exhorted their rulers to achieve the imperial status of Britain and France. Furthermore, all Germany’s military engagements from 1871 to 1914 occurred outside Europe. These included punitive expeditions in the African colonies and one ambitious foray in 1900 in China, where Germany joined seven other powers in a retaliatory expedition against young Chinese who had rebelled against western domination of the Middle Kingdom.

Dispatching German troops to Asia, the Kaiser presented their mission as racial vengeance: “Give no pardon and take no prisoners,” he said, urging the soldiers to make sure that “no Chinese will ever again even dare to look askance at a German”. The crushing of the “Yellow Peril” (a phrase coined in the 1890s) was more or less complete by the time the Germans arrived. Nevertheless, between October 1900 and spring 1901 the Germans launched dozens of raids in the Chinese countryside that became notorious for their intense brutality.

One of the volunteers for the disciplinary force was Lt Gen Lothar von Trotha, who had made his reputation in Africa by slaughtering natives and incinerating villages. He called his policy “terrorism”, adding that it “can only help” to subdue the natives. In China, he despoiled Ming graves and presided over a few killings, but his real work lay ahead, in German South-West Africa (contemporary Namibia), where an anti-colonial uprising broke out in January 1904. In October of that year, von Trotha ordered that members of the Herero community, including women and children, who had already been defeated militarily, were to be shot on sight, and that those escaping death were to be driven into the Omaheke Desert, where they would be left to die from exposure. An estimated 60,000-70,000 Herero people, out of a total of approximately 80,000, were eventually killed, and many more died in the desert from starvation. A second revolt against German rule in south-west Africa, by the Nama people, led to the demise, by 1908, of roughly half of their population.

Such proto-genocides became routine during the last years of European peace. Running the Congo Free State as his personal fief from 1885 to 1908, King Leopold II of Belgium reduced the local population by half, sending as many as eight million Africans to an early death. The American conquest of the Philippines between 1898 and 1902, to which Kipling dedicated The White Man’s Burden, took the lives of more than 200,000 civilians. The death toll perhaps seems less startling when one considers that 26 of the 30 US generals in the Philippines had fought in wars of annihilation against Native Americans at home. One of them, Brigadier General Jacob H Smith, explicitly stated in his order to the troops that “I want no prisoners. I wish you to kill and burn. The more you kill and burn the better it will please me”. In a Senate hearing on the atrocities in the Philippines, General Arthur MacArthur (father of Douglas) referred to the “magnificent Aryan peoples” he belonged to and the “unity of the race” he felt compelled to uphold.

The modern history of violence shows that ostensibly staunch foes have never been reluctant to borrow murderous ideas from one another. To take only one instance, the American elite’s ruthlessness with blacks and Native Americans greatly impressed the earliest generation of German liberal imperialists, decades before Hitler also came to admire the US’s unequivocally racist policies of nationality and immigration. The Nazis sought inspiration from Jim Crow legislation in the US south, which makes Charlottesville, Virginia, a fitting recent venue for the unfurling of swastika banners and chants of “blood and soil”.

In light of this shared history of racial violence, it seems odd that we continue to portray the first world war as a battle between democracy and authoritarianism, as a seminal and unexpected calamity. The Indian writer Aurobindo Ghose was one among many anticolonial thinkers who predicted, even before the outbreak of war, that “vaunting, aggressive, dominant Europe” was already under “a sentence of death”, awaiting “annihilation” – much as Liang Qichao could see, in 1918, that the war would prove to be a bridge connecting Europe’s past of imperial violence to its future of merciless fratricide.

These shrewd assessments were not Oriental wisdom or African clairvoyance. Many subordinate peoples simply realised, well before Arendt published The Origins of Totalitarianism in 1951, that peace in the metropolitan west depended too much on outsourcing war to the colonies.

The experience of mass death and destruction, suffered by most Europeans only after 1914, was first widely known in Asia and Africa, where land and resources were forcefully usurped, economic and cultural infrastructure systematically destroyed, and entire populations eliminated with the help of up-to-date bureaucracies and technologies. Europe’s equilibrium was parasitic for too long on disequilibrium elsewhere.

In the end, Asia and Africa could not remain a safely remote venue for Europe’s wars of aggrandisement in the late 19th and 20th century. Populations in Europe eventually suffered the great violence that had long been inflicted on Asians and Africans. As Arendt warned, violence administered for the sake of power “turns into a destructive principle that will not stop until there is nothing left to violate”.

In our own time, nothing better demonstrates this ruinous logic of lawless violence, which corrupts both public and private morality, than the heavily racialised war on terror. It presumes a sub-human enemy who must be “smoked out” at home and abroad – and it has licensed the use of torture and extrajudicial execution, even against western citizens.

But, as Arendt predicted, its failures have only produced an even greater dependence on violence, a proliferation of undeclared wars and new battlefields, a relentless assault on civil rights at home – and an exacerbated psychology of domination, presently manifest in Donald Trump’s threats to trash the nuclear deal with Iran and unleash on North Korea “fire and fury like the world has never seen”.

It was always an illusion to suppose that “civilised” peoples could remain immune, at home, to the destruction of morality and law in their wars against barbarians abroad. But that illusion, long cherished by the self-styled defenders of western civilisation, has now been shattered, with racist movements ascendant in Europe and the US, often applauded by the white supremacist in the White House, who is making sure there is nothing left to violate.

The white nationalists have junked the old rhetoric of liberal internationalism, the preferred language of the western political and media establishment for decades. Instead of claiming to make the world safe for democracy, they nakedly assert the cultural unity of the white race against an existential threat posed by swarthy foreigners, whether these are citizens, immigrants, refugees, asylum-seekers or terrorists.

But the global racial order that for centuries bestowed power, identity, security and status on its beneficiaries has finally begun to break down. Not even war with China, or ethnic cleansing in the west, will restore to whiteness its ownership of the Earth for ever and ever. Regaining imperial power and glory has already proven to be a treacherous escapist fantasy – devastating the Middle East and parts of Asia and Africa while bringing terrorism back to the streets of Europe and America – not to mention ushering Britain towards Brexit.

No rousing quasi-imperialist ventures abroad can mask the chasms of class and education, or divert the masses, at home. Consequently, the social problem appears insoluble; acrimoniously polarised societies seem to verge on the civil war that Rhodes feared; and, as Brexit and Trump show, the capacity for self-harm has grown ominously.

This is also why whiteness, first turned into a religion during the economic and social uncertainty that preceded the violence of 1914, is the world’s most dangerous cult today. Racial supremacy has been historically exercised through colonialism, slavery, segregation, ghettoisation, militarised border controls and mass incarceration. It has now entered its last and most desperate phase with Trump in power.

We can no longer discount the “terrible probability” James Baldwin once described: that the winners of history, “struggling to hold on to what they have stolen from their captives, and unable to look into their mirror, will precipitate a chaos throughout the world which, if it does not bring life on this planet to an end, will bring about a racial war such as the world has never seen”. Sane thinking would require, at the very least, an examination of the history – and stubborn persistence – of racist imperialism: a reckoning that Germany alone among western powers has attempted.

Certainly the risk of not confronting our true history has never been as clear as on this Remembrance Day. If we continue to evade it, historians a century from now may once again wonder why the west sleepwalked, after a long peace, into its biggest calamity yet.