
Sunday, 15 July 2018

The Death of Truth - How Trump and Modi came to power

Michiko Kakutani in The Guardian

Two of the most monstrous regimes in human history came to power in the 20th century, and both were predicated on the violation and despoiling of truth, on the knowledge that cynicism and weariness and fear can make people susceptible to the lies and false promises of leaders bent on unconditional power. As Hannah Arendt wrote in her 1951 book The Origins of Totalitarianism, “The ideal subject of totalitarian rule is not the convinced Nazi or the convinced communist, but people for whom the distinction between fact and fiction (ie the reality of experience) and the distinction between true and false (ie the standards of thought) no longer exist.”

Arendt’s words increasingly sound less like a dispatch from another century than a chilling description of the political and cultural landscape we inhabit today – a world in which fake news and lies are pumped out in industrial volume by Russian troll factories, emitted in an endless stream from the mouth and Twitter feed of the president of the United States, and sent flying across the world through social media accounts at lightning speed. Nationalism, tribalism, dislocation, fear of social change and the hatred of outsiders are on the rise again as people, locked in their partisan silos and filter bubbles, are losing a sense of shared reality and the ability to communicate across social and sectarian lines.

This is not to draw a direct analogy between today’s circumstances and the overwhelming horrors of the second world war era, but to look at some of the conditions and attitudes – what Margaret Atwood has called the “danger flags” in George Orwell’s Nineteen Eighty-Four and Animal Farm – that make a people susceptible to demagoguery and political manipulation, and nations easy prey for would-be autocrats. To examine how a disregard for facts, the displacement of reason by emotion, and the corrosion of language are diminishing the value of truth, and what that means for the world.


Trump made 2,140 false or misleading claims during his first year in office – an average of 5.9 a day


The term “truth decay” has joined the post-truth lexicon that includes such now familiar phrases as “fake news” and “alternative facts”. And it’s not just fake news either: it’s also fake science (manufactured by climate change deniers and anti-vaxxers, who oppose vaccination), fake history (promoted by Holocaust revisionists and white supremacists), fake Americans on Facebook (created by Russian trolls), and fake followers and “likes” on social media (generated by bots).

Donald Trump, the 45th president of the US, lies so prolifically and with such velocity that the Washington Post calculated he’d made 2,140 false or misleading claims during his first year in office – an average of 5.9 a day. His lies – about everything from the investigations into Russian interference in the election, to his popularity and achievements, to how much TV he watches – are only the brightest blinking red light among many warnings of his assault on democratic institutions and norms. He routinely assails the press, the justice system, the intelligence agencies, the electoral system and the civil servants who make the US government tick.

Nor is the assault on truth confined to America. Around the world, waves of populism and fundamentalism are elevating appeals to fear and anger over reasoned debate, eroding democratic institutions, and replacing expertise with the wisdom of the crowd. False claims about the UK’s financial relationship with the EU helped swing the vote in favour of Brexit, and Russia ramped up its sowing of dezinformatsiya in the runup to elections in France, Germany, the Netherlands and other countries in concerted propaganda efforts to discredit and destabilise democracies.

How did this happen? How did truth and reason become such endangered species, and what does the threat to them portend for our public discourse and the future of our politics and governance? 

It’s easy enough to see Trump as having ascended to office because of a unique, unrepeatable set of factors: a frustrated electorate still hurting from the backwash of the 2008 financial crash; Russian interference in the election and a deluge of pro-Trump fake news stories on social media; a highly polarising opponent who came to symbolise the Washington elite that populists decried; and an estimated $5bn‑worth of free campaign coverage from media outlets obsessed with the views and clicks that the former reality TV star generated.

If a novelist had concocted a villain like Trump – a larger-than-life, over-the-top avatar of narcissism, mendacity, ignorance, prejudice, boorishness, demagoguery and tyrannical impulses (not to mention someone who consumes as many as a dozen Diet Cokes a day) – she or he would likely be accused of extreme contrivance and implausibility. In fact, the president of the US often seems less like a persuasive character than some manic cartoon artist’s mashup of Ubu Roi, Triumph the Insult Comic Dog, and a character discarded by Molière. But the more clownish aspects of Trump the personality should not blind us to the monumentally serious consequences of his assault on truth and the rule of law, and the vulnerabilities he has exposed in our institutions and digital communications. It is unlikely that a candidate who had already been exposed during the campaign for his history of lying and deceptive business practices would have gained such popular support were portions of the public not blasé about truth-telling and were there not systemic problems with how people get their information and how they’ve come to think in increasingly partisan terms.


For decades, objectivity – or even the aim of ascertaining the best available truth – has been falling out of favour


With Trump, the personal is political, and in many respects he is less a comic-book anomaly than an extreme, bizarro-world apotheosis of many of the broader, intertwined attitudes undermining truth today, from the merging of news and politics with entertainment, to the toxic polarisation that’s overtaken American politics, to the growing populist contempt for expertise.

For decades now, objectivity – or even the idea that people can aspire toward ascertaining the best available truth – has been falling out of favour. Daniel Patrick Moynihan’s well-known observation that “Everyone is entitled to his own opinion, but not to his own facts” is more timely than ever: polarisation has grown so extreme that voters have a hard time even agreeing on the same facts. This has been exponentially accelerated by social media, which connects users with like-minded members and supplies them with customised news feeds that reinforce their preconceptions, allowing them to live in ever narrower silos.

For that matter, relativism has been ascendant since the culture wars began in the 1960s. Back then, it was embraced by the New Left, who were eager to expose the biases of western, bourgeois, male-dominated thinking; and by academics promoting the gospel of postmodernism, which argued that there are no universal truths, only smaller personal truths – perceptions shaped by the cultural and social forces of one’s day. Since then, relativistic arguments have been hijacked by the populist right.

Relativism, of course, synced perfectly with the narcissism and subjectivity that had been on the rise, from Tom Wolfe’s “Me Decade” 1970s, on through the selfie age of self-esteem. No surprise then that the “Rashomon effect” – the point of view that everything depends on your point of view – has permeated our culture, from popular novels such as Lauren Groff’s Fates and Furies to television series like The Affair, which hinge on the idea of competing realities.


 History is reimagined in Oliver Stone’s 1991 film JFK. Photograph: Allstar/Cinetext/Warner Bros

I’ve been reading and writing about many of these issues for nearly four decades, going back to the rise of deconstruction and battles over the literary canon on college campuses; debates over the fictionalised retelling of history in movies such as Oliver Stone’s JFK and Kathryn Bigelow’s Zero Dark Thirty; efforts made by both the Clinton and Bush administrations to avoid transparency and define reality on their own terms; Trump’s war on language and efforts to normalise the abnormal; and the impact that technology has had on how we process and share information.

In his 2007 book, The Cult of the Amateur, the Silicon Valley entrepreneur Andrew Keen warned that the internet not only had democratised information beyond people’s wildest imaginings but also was replacing genuine knowledge with “the wisdom of the crowd”, dangerously blurring the lines between fact and opinion, informed argument and blustering speculation. A decade later, the scholar Tom Nichols wrote in The Death of Expertise that a wilful hostility towards established knowledge had emerged on both the right and the left, with people aggressively arguing that “every opinion on any matter is as good as every other”. Ignorance was now fashionable.

The postmodernist argument that all truths are partial (and a function of one’s perspective) led to the related argument that there are many legitimate ways to understand or represent an event. This both encouraged a more egalitarian discourse and made it possible for the voices of the previously disfranchised to be heard. But it has also been exploited by those who want to make the case for offensive or debunked theories, or who want to equate things that cannot be equated. Creationists, for instance, called for teaching “intelligent design” alongside evolution in schools. “Teach both,” some argued. Others said, “Teach the controversy.”


Doubt is our product, since it is the best means of competing with the ‘body of fact’ that exists in the minds of the public – Tobacco industry executive memo, 1969


A variation on this “both sides” argument was employed by Trump when he tried to equate people demonstrating against white supremacy with the neo-Nazis who had converged in Charlottesville, Virginia, to protest the removal of Confederate statues. There were “some very fine people on both sides”, Trump declared. He also said, “We condemn in the strongest possible terms this egregious display of hatred, bigotry and violence on many sides, on many sides.”

Climate deniers, anti-vaxxers and other groups who don’t have science on their side bandy about phrases that wouldn’t be out of place in a college class on deconstruction – phrases such as “many sides,” “different perspectives”, “uncertainties”, “multiple ways of knowing.” As Naomi Oreskes and Erik M Conway demonstrated in their 2010 book Merchants of Doubt, rightwing thinktanks, the fossil fuel industry, and other corporate interests that are intent on discrediting science have employed a strategy first used by the tobacco industry to try to confuse the public about the dangers of smoking. “Doubt is our product,” read an infamous memo written by a tobacco industry executive in 1969, “since it is the best means of competing with the ‘body of fact’ that exists in the minds of the general public.”

The strategy, essentially, was this: dig up a handful of so-called professionals to refute established science or argue that more research is needed; turn these false arguments into talking points and repeat them over and over; and assail the reputations of the genuine scientists on the other side. If this sounds familiar, that’s because it’s a tactic that’s been used by Trump and his Republican allies to defend policies (on matters ranging from gun control to building a border wall) that run counter to both expert evaluation and national polls.


In January 2018, protests were held in 50 states urging US senators to support scientific evidence against Trump’s climate change policies. Photograph: Pacific Press/LightRocket via Getty Images

What Oreskes and Conway call the “tobacco strategy” was helped, they argued, by elements in the mainstream media that tended “to give minority views more credence than they deserve”. This false equivalence was the result of journalists confusing balance with truth-telling, wilful neutrality with accuracy; caving in to pressure from rightwing interest groups to present “both sides”; and the format of television news shows that feature debates between opposing viewpoints – even when one side represents an overwhelming consensus and the other is an almost complete outlier in the scientific community. For instance, a 2011 BBC Trust report found that the broadcaster’s science coverage paid “undue attention to marginal opinion” on the subject of manmade climate change. Or, as a headline in the Telegraph put it, “BBC staff told to stop inviting cranks on to science programmes”.

In a speech on press freedom, CNN’s chief international correspondent Christiane Amanpour addressed this issue in the context of media coverage of the 2016 presidential race, saying: “It appeared much of the media got itself into knots trying to differentiate between balance, objectivity, neutrality, and crucially, truth … I learned long ago, covering the ethnic cleansing and genocide in Bosnia, never to equate victim with aggressor, never to create a false moral or factual equivalence, because then you are an accomplice to the most unspeakable crimes and consequences. I believe in being truthful, not neutral. And I believe we must stop banalising the truth.”

As the west lurched through the cultural upheavals of the 1960s and 1970s and their aftermath, artists struggled with how to depict this fragmenting reality. Some writers like John Barth, Donald Barthelme and William Gass created self-conscious, postmodernist fictions that put more emphasis on form and language than on conventional storytelling. Others adopted a minimalistic approach, writing pared-down, narrowly focused stories emulating the fierce concision of Raymond Carver. And as the pursuit of broader truths became more and more unfashionable in academia, and as daily life came to feel increasingly unmoored, some writers chose to focus on the smallest, most personal truths: they wrote about themselves.

American reality had become so confounding, Philip Roth wrote in a 1961 essay, that it felt like “a kind of embarrassment to one’s own meager imagination”. This had resulted, he wrote, in the “voluntary withdrawal of interest by the writer of fiction from some of the grander social and political phenomena of our times”, and the retreat, in his own case, to the more knowable world of the self.


Real estate and realism … Bruce Willis in the 1990 film version of The Bonfire of the Vanities. Photograph: Allstar/WARNER BROS.

In a controversial 1989 essay, Tom Wolfe lamented these developments, mourning what he saw as the demise of old-fashioned realism in American fiction, and he urged novelists to “head out into this wild, bizarre, unpredictable, hog-stomping Baroque country of ours and reclaim it as literary property”. He tried this himself in novels such as The Bonfire of the Vanities and A Man in Full, using his skills as a reporter to help flesh out a spectrum of subcultures with Balzacian detail. But while Wolfe had been an influential advocate in the 1970s of the New Journalism (which put an emphasis on the voice and point of view of the reporter), his new manifesto didn’t win many converts in the literary world. Instead, writers as disparate as Louise Erdrich, David Mitchell, Don DeLillo, Julian Barnes, Chuck Palahniuk, Gillian Flynn and Groff would play with devices (such as multiple points of view, unreliable narrators and intertwining storylines) pioneered decades ago by innovators such as William Faulkner, Virginia Woolf, Ford Madox Ford and Vladimir Nabokov to try to capture the new Rashomon-like reality in which subjectivity rules and, in the infamous words of former president Bill Clinton, truth “depends on what the meaning of the word ‘is’ is”.

But what Roth called “the sheer fact of self, the vision of self as inviolable, powerful, and nervy, self as the only real thing in an unreal environment” would remain more comfortable territory for many writers. In fact, it would lead, at the turn of the millennium, to a remarkable flowering of memoir writing, including such classics as Mary Karr’s The Liars’ Club and Dave Eggers’s A Heartbreaking Work of Staggering Genius – books that established their authors as among the foremost voices of their generation. The memoir boom and the popularity of blogging would eventually culminate in Karl Ove Knausgaard’s six-volume autobiographical novel, My Struggle – filled with minutely detailed descriptions, drawn from the author’s own daily life.

Personal testimony also became fashionable on college campuses, as the concept of objective truth fell out of favour and empirical evidence gathered by traditional research came to be regarded with suspicion. Academic writers began prefacing scholarly papers with disquisitions on their own “positioning” – their race, religion, gender, background, personal experiences that might inform or skew or ratify their analysis.


Social networks give people news that is popular and trending rather than accurate or important

In a 2016 documentary titled HyperNormalisation, the filmmaker Adam Curtis created an expressionistic, montage-driven meditation on life in the post-truth era; the title was taken from a term coined by the anthropologist Alexei Yurchak to describe life in the final years of the Soviet Union, when people both understood the absurdity of the propaganda the government had been selling them for decades and had difficulty envisioning any alternative. In HyperNormalisation, which was released shortly before the 2016 US election, Curtis says in voiceover narration that people in the west had also stopped believing the stories politicians had been telling them for years, and Trump realised that “in the face of that, you could play with reality” and in the process “further undermine and weaken the old forms of power”.

Some Trump allies on the far right also seek to redefine reality on their own terms. Invoking the iconography of the movie The Matrix – in which the hero is given a choice between two pills, a red one (representing knowledge and the harsh truths of reality) and a blue one (representing soporific illusion and denial) – members of the “alt-right” and some aggrieved men’s rights groups talk about “red-pilling the normies”, which means converting people to their cause. In other words, selling their inside-out alternative reality, in which white people are suffering from persecution, multiculturalism poses a grave threat and men have been oppressed by women.

Alice Marwick and Rebecca Lewis, the authors of a study on online disinformation, argue that “once groups have been red-pilled on one issue, they’re likely to be open to other extremist ideas. Online cultures that used to be relatively nonpolitical are beginning to seethe with racially charged anger. Some sci-fi, fandom, and gaming communities – having accepted run-of-the-mill antifeminism – are beginning to espouse white-nationalist ideas. ‘Ironic’ Nazi iconography and hateful epithets are becoming serious expressions of antisemitism.”


Some Trump allies on the far right invoke The Matrix to sell their inside‑out alternative reality

One of the tactics used by the alt-right to spread its ideas online, Marwick and Lewis argue, is to initially dilute more extreme views as gateway ideas to court a wider audience; among some groups of young men, they write, “it’s a surprisingly short leap from rejecting political correctness to blaming women, immigrants, or Muslims for their problems.”

Many misogynist and white supremacist memes, in addition to a lot of fake news, originate or gain initial momentum on sites such as 4chan and Reddit – before accumulating enough buzz to make the leap to Facebook and Twitter, where they can attract more mainstream attention. Renee DiResta, who studies conspiracy theories on the web, argues that Reddit can be a useful testing ground for bad actors – including foreign governments such as Russia’s – to try out memes or fake stories to see how much traction they get. DiResta warned in the spring of 2016 that the algorithms of social networks – which give people news that is popular and trending, rather than accurate or important – are helping to promote conspiracy theories.


There is an 'asymmetry of passion' on social media: most people won’t devote hours reinforcing the obvious. Extremists are committed to ‘wake up the sheeple’

This sort of fringe content can both affect how people think and seep into public policy debates on matters such as vaccines, zoning laws and water fluoridation. Part of the problem is an “asymmetry of passion” on social media: while most people won’t devote hours to writing posts that reinforce the obvious, DiResta says, “passionate truthers and extremists produce copious amounts of content in their commitment to ‘wake up the sheeple’”.

Recommendation engines, she adds, help connect conspiracy theorists with one another to the point that “we are long past merely partisan filter bubbles and well into the realm of siloed communities that experience their own reality and operate with their own facts”. At this point, she concludes, “the internet doesn’t just reflect reality any more; it shapes it”.

Language is to humans, the writer James Carroll once observed, what water is to fish: “We swim in language. We think in language. We live in language.” This is why Orwell wrote that “political chaos is connected with the decay of language”, divorcing words from meaning and opening up a chasm between a leader’s real and declared aims. This is why the US and the world feel so disoriented by the stream of lies issued by the Trump White House and the president’s use of language to disseminate distrust and discord. And this is why authoritarian regimes throughout history have co‑opted everyday language in an effort to control how people communicate – exactly the way the Ministry of Truth in Nineteen Eighty-Four aims to deny the existence of external reality and safeguard Big Brother’s infallibility.

Orwell’s “Newspeak” is a fictional language, but it often mirrors and satirises the “wooden language” imposed by communist authorities in the Soviet Union and eastern Europe. Among the characteristics of “wooden language” that the French scholar Françoise Thom identified in a 1987 thesis were abstraction and the avoidance of the concrete; tautologies (“the theories of Marx are true because they are correct”); bad metaphors (“the fascist octopus has sung its swan song”); and Manichaeism that divides the world into things good and things evil (and nothing in between).


‘Trump has performed the disturbing Orwellian trick of using words to mean the exact opposite of what they really mean.’ ... John Hurt in the film adaptation of Nineteen Eighty-Four. Photograph: Allstar/MGM

Trump has performed the disturbing Orwellian trick (“WAR IS PEACE”, “FREEDOM IS SLAVERY”, “IGNORANCE IS STRENGTH”) of using words to mean the exact opposite of what they really mean. It’s not just his taking the term “fake news”, turning it inside out, and using it to try to discredit journalism that he finds threatening or unflattering. He has also called the investigation into Russian election interference “the single greatest witch-hunt in American political history”, when he is the one who has repeatedly attacked the press, the justice department, the FBI, the intelligence services and any institution he regards as hostile.

In fact, Trump has the perverse habit of accusing opponents of the very sins he is guilty of himself: “Lyin’ Ted”, “Crooked Hillary”, “Crazy Bernie”. He accused Clinton of being “a bigot who sees people of colour only as votes, not as human beings worthy of a better future”, and he has asserted that “there was tremendous collusion on behalf of the Russians and the Democrats”.

In Orwell’s language of Newspeak, a word such as “blackwhite” has “two mutually contradictory meanings”: “Applied to an opponent, it means the habit of impudently claiming that black is white, in contradiction of the plain facts. Applied to a Party member, it means a loyal willingness to say that black is white when Party discipline demands this.”




Trump's inauguration crowd: Sean Spicer's claims versus the evidence


This, too, has an unnerving echo in the behaviour of Trump White House officials and Republican members of Congress who lie on the president’s behalf and routinely make pronouncements that flout the evidence in front of people’s eyes. The administration, in fact, debuted with the White House press secretary, Sean Spicer, insisting that Trump’s inaugural crowds were the “largest audience” ever – an assertion that defied photographic evidence and was rated by the fact-checking blog PolitiFact a “Pants on Fire” lie. These sorts of lies, the journalist Masha Gessen has pointed out, are told for the same reason that Vladimir Putin lies: “to assert power over truth itself”.

Trump has continued his personal assault on the English language. His incoherence (his twisted syntax, his reversals, his insincerity, his bad faith and his inflammatory bombast) is emblematic of the chaos he creates and thrives on, as well as an essential instrument in his liar’s toolkit. His interviews, off‑teleprompter speeches and tweets are a startling jumble of insults, exclamations, boasts, digressions, non sequiturs, qualifications, exhortations and innuendos – a bully’s efforts to intimidate, gaslight, polarise and scapegoat.

Precise words, like facts, mean little to Trump, as interpreters, who struggle to translate his grammatical anarchy, can attest. Chuck Todd, the anchor of NBC’s Meet the Press, observed that after several of his appearances as a candidate Trump would lean back in his chair and ask the control booth to replay his segment on a monitor – without sound: “He wants to see what it all looked like. He will watch the whole thing on mute.”

Protesters react to white nationalist Richard Spencer as he speaks at a college campus in Florida in 2017. Spencer participated in the Charlottesville Unite the Right rally earlier that year. Photograph: Joe Raedle/Getty Images

Philip Roth said he could never have imagined that “the 21st-century catastrophe to befall the USA, the most debasing of disasters”, would appear in “the ominously ridiculous commedia dell’arte figure of the boastful buffoon”. Trump’s ridiculousness, his narcissistic ability to make everything about himself, the outrageousness of his lies, and the profundity of his ignorance can easily distract attention from the more lasting implications of his story: how easily Republicans in Congress enabled him, undermining the whole concept of checks and balances set in place by the founders; how a third of the country passively accepted his assaults on the constitution; how easily Russian disinformation took root in a culture where the teaching of history and civics had seriously atrophied.

The US’s founding generation spoke frequently of the “common good”. George Washington reminded citizens of their “common concerns” and “common interests” and the “common cause” they had all fought for in the revolution. And Thomas Jefferson spoke in his inaugural address of the young country uniting “in common efforts for the common good”. A common purpose and a shared sense of reality mattered because they bound the disparate states and regions together, and they remain essential for conducting a national conversation. Especially today in a country where Trump and Russian and hard-right trolls are working to incite the very factionalism Washington warned us about, trying to inflame divisions between people along racial, ethnic and religious lines.

There are no easy remedies, but it’s essential that citizens defy the cynicism and resignation that autocrats and power-hungry politicians depend on to subvert resistance. Without commonly agreed-on facts – not Republican facts and Democratic facts; not the alternative facts of today’s silo-world – there can be no rational debate over policies, no substantive means of evaluating candidates for political office, and no way to hold elected officials accountable to the people. Without truth, democracy is hobbled.

Friday, 24 October 2014

The truth about evil

John Gray in The Guardian

When Barack Obama vows to destroy Isis’s “brand of evil” and David Cameron declares that Isis is an “evil organisation” that must be obliterated, they are echoing Tony Blair’s judgment of Saddam Hussein: “But the man’s uniquely evil, isn’t he?” Blair made this observation in November 2002, four months before the invasion of Iraq, when he invited six experts to Downing Street to brief him on the likely consequences of the war. The experts warned that Iraq was a complicated place, riven by deep communal enmities, which Saddam had dominated for over 35 years. Destroying the regime would leave a vacuum; the country could be shaken by Sunni rebellion and might well descend into civil war. These dangers left the prime minister unmoved. What mattered was Saddam’s moral iniquity. The divided society over which he ruled was irrelevant. Get rid of the tyrant and his regime, and the forces of good would prevail.
If Saddam was uniquely evil 12 years ago, we have it on the authority of our leaders that Isis is uniquely evil today. Until it swept into Iraq a few months ago, the jihadist group was just one of several that had benefited from the campaign being waged by western governments and their authoritarian allies in the Gulf in support of the Syrian opposition’s struggle to overthrow Bashar al-Assad. Since then Isis has been denounced continuously and with increasing intensity; but there has been no change in the ruthless ferocity of the group, which has always practised what a radical Islamist theorist writing under the name Abu Bakr Naji described in an internet handbook in 2006 as “the management of savagery”.
Ever since it was spun off from al-Qaida some 10 years ago, Isis has made clear its commitment to beheading apostates and unbelievers, enslaving women and wiping out communities that will not submit to its ultra-fundamentalist interpretation of Islam. In its carefully crafted internet videos, it has advertised these crimes itself. There has never been any doubt that Isis practises methodical savagery as an integral part of its strategy of war. This did not prevent an abortive attempt on the part of the American and British governments in August of last year to give military support to the Syrian rebels – a move that could have left Isis the most powerful force in the country. Isis became the prime enemy of western governments only when it took advantage of the anarchy these same governments had created when they broke the state of Iraq with their grandiose scheme of regime change.
Against this background, it would be easy to conclude that talk of evil in international conflicts is no more than a cynical technique for shaping public perceptions. That would be a mistake. Blair’s secret – which is the key to much in contemporary politics – is not cynicism. A cynic is someone who knowingly acts against what he or she knows to be true. Too morally stunted to be capable of the mendacity of which he is often accused, Blair thinks and acts on the premise that whatever furthers the triumph of what he believes to be good must be true. Imagining that he can deliver the Middle East and the world from evil, he cannot help having a delusional view of the impact of his policies.

“But the man’s uniquely evil, isn’t he?” Tony Blair said of Saddam Hussein. Photograph: Reuters

Here Blair is at one with most western leaders. It’s not that they are obsessed with evil. Rather, they don’t really believe in evil as an enduring reality in human life. If their feverish rhetoric means anything, it is that evil can be vanquished. In believing this, those who govern us at the present time reject a central insight of western religion, which is found also in Greek tragic drama and the work of the Roman historians: destructive human conflict is rooted in flaws within human beings themselves. In this old-fashioned understanding, evil is a propensity to destructive and self-destructive behaviour that is humanly universal. The restraints of morality exist to curb this innate human frailty; but morality is a fragile artifice that regularly breaks down. Dealing with evil requires an acceptance that it never goes away.
No view of things could be more alien at the present time. Whatever their position on the political spectrum, almost all of those who govern us hold to some version of the melioristic liberalism that is the west’s default creed, which teaches that human civilisation is advancing – however falteringly – to a point at which the worst forms of human destructiveness can be left behind. According to this view, evil, if any such thing exists, is not an inbuilt human flaw, but a product of defective social institutions, which can over time be permanently improved.


Paradoxically, this belief in the evanescence of evil is what underlies the hysterical invocation of evil that has lately become so prominent. There are many bad and lamentable forces in the world today, but it is those that undermine the belief in human improvement that are demonised as “evil”. So what disturbs the west about Vladimir Putin, for example, is not so much the persecution of gay people over which he has presided, or the threat posed to Russia’s neighbours by his attempt to reassert its imperial power. It is the fact that he has no place in the liberal scheme of continuing human advance. As a result, the Russian leader can only be evil. When George W Bush looked into Putin’s eyes at a Moscow summit in May 2002, he reported, “I was able to get a sense of his soul”. When Joe Biden visited the Kremlin in 2011, he had a very different impression, telling Putin: “Mr Prime Minister, I’m looking into your eyes, and I don’t think you have a soul.” According to Biden, Putin smiled and replied, “We understand each other.” The religious language is telling: nine years earlier, Putin had been a pragmatic leader with whom the west could work; now he was a soulless devil.
It’s in the Middle East, however, that the prevailing liberal worldview has proved most consistently misguided. At bottom, it may be western leaders’ inability to think outside this melioristic creed that accounts for their failure to learn from experience. After more than a decade of intensive bombing, backed up by massive ground force, the Taliban continue to control much of Afghanistan and appear to be regaining ground as the American-led mission is run down. Libya – through which a beaming David Cameron processed in triumph only three years ago, after the use of western air power to help topple Gaddafi – is now an anarchic hell-hole that no western leader could safely visit. One might think such experiences would be enough to deter governments from further exercises in regime change. But our leaders cannot admit the narrow limits of their power. They cannot accept that by removing one kind of evil they may succeed only in bringing about another – anarchy instead of tyranny, Islamist popular theocracy instead of secular dictatorship. They need a narrative of continuing advance if they are to preserve their sense of being able to act meaningfully in the world, so they are driven again and again to re-enact their past failures.
Many view these western interventions as no more than exercises in geopolitics. But a type of moral infantilism is no less important in explaining the persisting folly of western governments. Though it is clear that Isis cannot be permanently weakened as long as the war against Assad continues, this fact is ignored – and not only because a western-brokered peace deal that left Assad in power would be opposed by the Gulf states that have sided with jihadist forces in Syria. More fundamentally, any such deal would mean giving legitimacy to a regime that western governments have condemned as more evil than any conceivable alternative. In Syria, the actual alternatives are the survival in some form of Assad’s secular despotism, a radical Islamist regime or continuing war and anarchy. In the liberal political culture that prevails in the west, a public choice among these options is impossible.
There are some who think the very idea of evil is an obsolete relic of religion. For most secular thinkers, what has been defined as evil in the past is the expression of social ills that can in principle be remedied. But these same thinkers very often invoke evil forces to account for humankind’s failure to advance. The secularisation of the modern moral vocabulary that many believed was under way has not occurred: public discourse about good and evil continues to be rooted in religion. Yet the idea of evil that is invoked is not one that features in the central religious traditions of the west. The belief that evil can be finally overcome has more in common with the dualistic heresies of ancient and medieval times than it does with any western religious orthodoxy.

* * *

A radically dualistic view of the world, in which good and evil are separate forces that have coexisted since the beginning of time, was held by the ancient Zoroastrians and Manicheans. These religions did not face the problem with which Christian apologists have struggled so painfully and for so long – how to reconcile the existence of an all-powerful and wholly good God with the fact of evil in the world. The worldview of George W Bush and Tony Blair is commonly described as Manichean, but this is unfair to the ancient religion. Mani, the third-century prophet who founded the faith, appears to have believed the outcome of the struggle was uncertain, whereas for Bush and Blair there could never be any doubt as to the ultimate triumph of good. In refusing to accept the permanency of evil they are no different from most western leaders.

Saint Augustine by Caravaggio. Photograph: The Guardian

The west owes its ideas of evil to Christianity, though whether these ideas would be recognised by Jesus – the dissident Jewish prophet from whose life and sayings St Paul conjured the Christian religion – is an open question. The personification of evil as a demonic presence is not a feature of biblical Judaism, where the figure of Satan appears chiefly as a messenger or accuser sent by God to challenge wrongdoers. Despite the claims of believers and advances in scholarship, not enough is known to pronounce with any confidence on what Jesus may himself have believed. What is clear is that Christianity has harboured a number of quite different understandings of evil.
A convert from Manicheism, St Augustine established a powerful orthodoxy in the fourth century when he tried to distance Christianity from dualism and maintained that evil was not an independent force coeval with good but came into the world when human beings misused the gift of free will. Reflecting Augustine’s own conflicts, the idea of original sin that he developed would play a part in the unhealthy preoccupation with sexuality that appears throughout most of Christianity’s history. Yet in placing the source of evil within human beings, Augustine’s account is more humane than myths in which evil is a sinister force that acts to subvert human goodness. Those who believe that evil can be eradicated tend to identify themselves with the good and attack anyone they believe stands in the way of its triumph.
Augustine had an immense influence, but dualistic views in which evil exists as an independent force have erupted repeatedly as heretical traditions within Christianity. The Cathar movement that developed in parts of Europe in the 13th century revived a Manichean cosmogony in which the world is the work not of a good God but instead of a malevolent angel or demi-urge. A rival heresy was promoted by the fourth century theologian Pelagius, an opponent of Augustine who denied original sin while strongly affirming free will, and believed that human beings could be good without divine intervention. More than any of the ancient Greek philosophers, Pelagius put an idea of human autonomy at the centre of his thinking. Though he is now almost forgotten, this heretical Christian theologian has a good claim to be seen as the true father of modern liberal humanism.
In its official forms, secular liberalism rejects the idea of evil. Many liberals would like to see the idea of evil replaced by a discourse of harm: we should talk instead about how people do damage to each other and themselves. But this view poses a problem of evil remarkably similar to that which has troubled Christian believers. If every human being is born a liberal – as these latter-day disciples of Pelagius appear to believe – why have so many, seemingly of their own free will, given their lives to regimes and movements that are essentially repressive, cruel and violent? Why do human beings knowingly harm others and themselves? Unable to account for these facts, liberals have resorted to a language of dark and evil forces much like that of dualistic religions.


The efforts of believers to explain why God permits abominable suffering and injustice have produced nothing that is convincing; but at least believers have admitted that the ways of the Deity are mysterious. Even though he ended up accepting the divine will, the questions that Job put to God were never answered. Despite all his efforts to find a solution, Augustine confessed that human reason was not equal to the task. In contrast, when secular liberals try to account for evil in rational terms, the result is a more primitive version of Manichean myth. When humankind proves resistant to improvement, it is because forces of darkness – wicked priests, demagogic politicians, predatory corporations and the like – are working to thwart the universal struggle for freedom and enlightenment. There is a lesson here. Sooner or later anyone who believes in innate human goodness is bound to reinvent the idea of evil in a cruder form. Aiming to exorcise evil from the modern mind, secular liberals have ended up constructing another version of demonology, in which anything that stands out against what is believed to be the rational course of human development is anathematised.
The view that evil is essentially banal, presented by Hannah Arendt in her book Eichmann in Jerusalem (1963), is another version of the modern evasion of evil. Arendt suggested that human beings commit atrocities from a kind of stupidity, falling into a condition of thoughtlessness in which they collude in practices that inflict atrocious suffering on other human beings. It was some such moral inertia, Arendt maintained, that enabled Eichmann to take a leading part in perpetrating the Holocaust. Arendt’s theory of the banality of evil tends to support the defence of his actions that Eichmann presented at his trial: he had no choice in doing what he did. She represented Eichmann as a colourless bureaucrat performing a well-defined function in an impersonal bureaucratic machine; but the Nazi state was in fact largely chaotic, with different institutions, departments of government and individuals competing for Hitler’s favour. Careful historical research of the kind that David Cesarani undertook in his book Eichmann: His Life and Crimes (2004) suggests that Eichmann was not a passive tool of the state, but chose to serve it. When he organised the deportation and mass murder of Jews, he wasn’t simply furthering his career in the Nazi hierarchy. What he did reflected his deep-seated antisemitism. Eichmann took part in the Holocaust because he wanted to do so. In this he was no different from many others, though his crimes were larger in scale.
No doubt something like the type of evil that Arendt identified is real enough. Large parts of the population in Germany went along with Nazi policies of racial persecution and genocide from motives that included social conformity and obedience to authority. The number of doctors, teachers and lawyers who refused to implement Nazi policies was vanishingly small. But again, this wasn’t only passive obedience. Until it became clear that Hitler’s war might be lost, Nazism was extremely popular. As the great American journalist William Shirer reported in his eyewitness account of the rise of Hitler, The Nightmare Years:
“Most Germans, so far as I could see, did not seem to mind that their personal freedom had been taken away, that so much of their splendid culture was being destroyed and replaced with a mindless barbarism, or that their life and work were being regimented to a degree never before experienced even by a people accustomed for generations to a great deal of regimentation … On the whole, people did not seem to feel that they were being cowed and held down by an unscrupulous tyranny. On the contrary, they appeared to support it with genuine enthusiasm.”
When large populations of human beings collude with repressive regimes it need not be from thoughtlessness or inertia. Liberal meliorists like to think that human life contains many things that are bad, some of which may never be entirely eliminated; but there is nothing that is intrinsically destructive or malevolent in human beings themselves – nothing, in other words, that corresponds to a traditional idea of evil. But another view is possible, and one that need make no call on theology.
What has been described as evil in the past can be understood as a natural tendency to animosity and destruction, co-existing in human beings alongside tendencies to sympathy and cooperation. This was the view put forward by Sigmund Freud in a celebrated exchange of letters with Albert Einstein in 1931-32. Einstein had asked: “Is it possible to control man’s mental evolution so as to make him proof against the psychosis of hate and destructiveness?” Freud replied that “there is no likelihood of our being able to suppress humanity’s aggressive tendencies”.
Freud suggested that human beings were ruled by impulses or instincts, eros and thanatos, impelling them towards life and creation or destruction and death. He cautioned against thinking that these forces embodied good and evil in any simple way. Whether they worked together or in opposition, both were necessary. Even so, Freud was clear that a major threat to anything that might be called a good life came from within human beings. The fragility of civilisation reflected the divided nature of the human animal itself.

The crowd at the 1936 Olympic Games in Berlin raise their hands in the Nazi salute in tribute to Hitler’s arrival at the stadium. Photograph: Bettmann/Corbis

One need not subscribe to Freud’s theory (which in the same letter he describes as a type of mythology) to think he was on to something here. Rather than psychoanalysis, it may be some version of evolutionary psychology that can best illuminate the human proclivity to hatred and destruction. The point is that destructive behaviour of this kind flows from inherent human flaws. Crucially, these defects are not only or even mainly intellectual. No advance in human knowledge can stop humans attacking and persecuting others. Poisonous ideologies like Nazi “scientific racism” justify such behaviour. But these ideologies are not just erroneous theories that can be discarded when their falsehood has been demonstrated. Ideas of similar kinds recur whenever societies are threatened by severe and continuing hardship. At present, antisemitism and ethnic nationalism, along with hatred of gay people, immigrants and other minorities, are re-emerging across much of the continent. Toxic ideologies express and reinforce responses to social conflict that are generically human.
Mass support for despotic regimes has many sources. Without the economic upheavals that ruined much of the German middle class, the Nazis might well have remained a fringe movement. Undoubtedly there were many who looked to the Nazi regime for protection against economic insecurity. But it is a mistake to suppose that when people turn to tyrants, they do so despite the crimes that tyrants commit. Large numbers have admired tyrannical regimes and actively endorsed their crimes. If Nazism had not existed, something like it would surely have been invented in the chaos of interwar Europe.

* * *

When the west aligned itself with the USSR in the second world war, it was choosing the lesser of two evils – both of them evils of a radical kind. This was the view of Winston Churchill, who famously said he would “sup with the devil” if doing so would help destroy “that evil man” Hitler. Churchill’s candid recognition of the nature of the choice he made is testimony to how shallow the discourse of evil has since become. Today, no western politician could admit to making such a decision.
In his profound study On Compromise and Rotten Compromises, the Israeli philosopher Avishai Margalit distinguishes between regimes that rest on cruelty and humiliation, as many have done throughout history, and those that go further by excluding some human beings altogether from moral concern. Describing the latter as radically evil, he argues that Nazi Germany falls into this category. The distinction Margalit draws is not a quantitative one based on the numbers of victims, but categorical: Nazi racism created an immutable hierarchy in which there could be no common moral bonds. Margalit goes on to argue – surely rightly – that in allying itself with the Soviet Union in the struggle against Nazism, the west was making a necessary and justified moral compromise. But this was not because the Nazis were the greater evil, he suggests. For all its oppression, the Soviet Union offered a vision of the future that included all of humankind. Viewing most of the species as less than human, Nazism rejected morality itself.
There should be no doubt that the Nazis are in a class by themselves. No other regime has launched a project of systematic extermination that is comparable. From the beginning of the Soviet system there were some camps from which it was difficult to emerge alive. Yet at no time was there anything in the Soviet gulag akin to the Nazi death camps that operated at Sobibor and Treblinka. Contrary to some in post-communist countries who try to deny the fact, the Holocaust remains a unique crime. Judged by Margalit’s formula, however, the Soviet Union was also implicated in radical evil. The Soviet state implemented a policy of exclusion from society of “former persons” – a group that included those who lived off unearned income, clergy of all religions and tsarist functionaries – who were denied civic rights, prohibited from seeking public office and restricted in their access to the rationing system. Many died of starvation or were consigned to camps where they perished from overwork, undernourishment and brutal treatment.
Considered as a broad moral category, what Margalit defines as radical evil is not uncommon. The colonial genocide of the Herero people in German South-West Africa (now Namibia) at the start of the 20th century was implemented against a background of ersatz-scientific racist ideology that denied the humanity of Africans. (The genocide included the use of Hereros as subjects of medical experiments, conducted by doctors some of whom returned to Germany to teach physicians later implicated in experiments on prisoners in Nazi camps.) The institution of slavery in antebellum America and South African apartheid rested on a similar denial. A refusal of moral standing to some of those they rule is a feature of societies of widely different varieties in many times and places. In one form or another, denying the shared humanity of others seems to be a universal human trait.

An Islamic State fighter in Raqqa, Syria. Photograph: Reuters

Describing Isis’s behaviour as “psychopathic”, as David Cameron has done, represents the group as being more humanly aberrant than the record allows. Aside from the fact that it publicises them on the internet, Isis’s atrocities are not greatly different from those that have been committed in many other situations of acute conflict. To cite only a few of the more recent examples, murder of hostages, mass killings and systematic rape have been used as methods of warfare in the former Yugoslavia, Chechnya, Rwanda, and the Congo.
A campaign of mass murder is never simply an expression of psychopathic aggression. In the case of Isis, the ideology of Wahhabism has played an important role. Ever since the 1920s, the rulers of the Saudi kingdom have promoted this 18th-century brand of highly repressive and exclusionary Sunni Islam as part of the project of legitimating the Saudi state. More recently, Saudi sponsorship of Wahhabi ideology has been a response to the threat posed by the rise of Shia Iran. If the ungoverned space in which Isis operates has been created by the west’s exercises in regime change, the group’s advances are also a byproduct of the struggle for hegemony between Iran and the Saudis. In such conditions of intense geopolitical rivalry there can be no effective government in Iraq, no end to the Syrian civil war and no meaningful regional coalition against the self-styled caliphate.
But the rise of Isis is also part of a war of religion. Nothing is more commonplace than the assertion that religion is a tool of power, which ruling elites use to control the people. No doubt that’s often true. But a contrary view is also true: politics may be a continuation of religion by other means. In Europe religion was a primary force in politics for many centuries. When religion seemed to be in retreat, it renewed itself in political creeds – Jacobinism, nationalism and varieties of totalitarianism – that were partly religious in nature. Something similar is happening in the Middle East. Fuelled by movements that combine radical fundamentalism with elements borrowed from secular ideologies such as Leninism and fascism, conflict between Shia and Sunni communities looks set to continue for generations to come. Even if Isis is defeated, it will not be the last movement of its kind. Along with war, religion is not declining, but continuously mutating into hybrid forms.

Iraqi Yazidis, who fled an Islamic State attack on Sinjar, gather to collect bottles of water at the Bajid Kandala camp in Kurdistan’s western Dohuk province. Photograph: Ahmad al-Rubaye/AFP/Getty Images

Western intervention in the Middle East has been guided by a view of the world that itself has some of the functions of religion. There is no factual basis for thinking that something like the democratic nation-state provides a model on which the region could be remade. States of this kind emerged in modern Europe, after much bloodshed, but their future is far from assured and they are not the goal or end-point of modern political development. From an empirical viewpoint, any endpoint can only be an act of faith. All that can be observed is a succession of political experiments whose outcomes are highly contingent. Launched in circumstances in which states constructed under the aegis of western colonialism have broken down under the impact of more recent western intervention, the gruesome tyranny established by Isis will go down in history as one of these experiments.
The weakness of faith-based liberalism is that it contains nothing that helps in the choices that must be made between different kinds and degrees of evil. Given the west’s role in bringing about the anarchy in which the Yazidis, the Kurds and other communities face a deadly threat, non-intervention is a morally compromised option. If sufficient resources are available – something that cannot be taken for granted – military action may be justified. But it is hard to see how there can be lasting peace in territories where there is no functioning state. Our leaders have helped create a situation that their view of the world claims cannot exist: an intractable conflict in which there are no good outcomes. 

Thursday, 11 July 2013

In praise of cynicism

It's claimed that at the age of 44 our cynicism starts to grow. But being cynical isn't necessarily a bad thing, argues Julian Baggini. It's at the heart of great satire and, perhaps more importantly, leads us to question what is wrong with the world – and strive to make it better
Test how cynical you are
Bunch of cynics … Hislop, Brockovich, Bernstein, Snowden, Woodward and Tucker. Photograph: guardian.co.uk
If there's one thing that makes me cynical, it's optimists. They are just far too cynical about cynicism. If only they could see that cynics can be happy, constructive, even fun to hang out with, they might learn a thing or two.
Perhaps this is because I'm 44, which, according to a new survey, is the age at which cynicism starts to rise. But this survey itself merely illustrates the importance of being cynical. The cynic, after all, is inclined to question people's motives and assume that they are acting self-servingly unless proven otherwise. Which is just as well, as it turns out the "study" in question is just another bit of corporate PR to promote a brand whose pseudo-scientific stunt I won't reward by naming. Once again, cynicism proves its worth as one of our best defences against spin and manipulation.
I often feel that "cynical" is a term of abuse hurled at people who are judged to be insufficiently "positive" by those who believe that negativity is the real cause of almost all the world's ills. This allows them to breezily sweep aside sceptical doubts without having to go to the bother of checking if they are well-grounded. In this way, for example, Edward Snowden's leaks about the NSA's surveillance practices have been dismissed because they contribute to "the corrosive spread of cynicism".
In December 1999, Tony Blair hailed the hugely disappointing Millennium Dome as "a triumph of confidence over cynicism". All those legitimate concerns about the expense and vacuity of the end result were brushed off as examples of sheer, wilful negativity.
A more balanced definition of a cynic, courtesy of the trusty Oxford English Dictionary, is someone who is "distrustful or incredulous of human goodness and sincerity", sceptical of human merit, often mocking or sarcastic. Now what's not to love about that?
Of course, cynicism is neither wholly good nor bad. It's easy to see how you can be too cynical, but it's also possible to be not cynical enough. Indeed, although the word itself is now largely pejorative, you'll find almost everyone revels in a certain amount of cynicism. It's the lifeblood of the satirical comedy of the likes of Ian Hislop, Mark Steel and Jeremy Hardy. Great fictional cynics such as Malcolm Tucker are born of cynicism about politics. It can provide the impulse for the most important investigative journalism. If Bob Woodward and Carl Bernstein had been more trustful and credulous of human goodness and sincerity, they would never have broken the Watergate story.
If we were all habitually trustful and credulous of human goodness and sincerity, then there would be no questioning of dubious foreign interventions, infringements of civil liberties or sharp business practices.
Perhaps the greatest slur against cynicism is that it nurtures a fatalistic pessimism, a belief that nothing can ever be improved. There are lazy forms of cynicism of which this is certainly true. But at its best, cynicism is a greater force for progress than optimism. The optimist underestimates how difficult it is to achieve real change, believing that anything is possible and it's possible now. Only by confronting head-on the reality that all progress is going to be obstructed by vested interests and corrupted by human venality can we create realistic programmes that actually have a chance of success. Progress is more of a challenge for the cynic but also more important and urgent, since for the optimist things aren't that bad and are bound to get better anyway.
This highlights the importance of distinguishing between thinking cynically and acting cynically. There is nothing good to be said for people who cynically deceive to further their own goals and get ahead of others. But that is not what a good cynic inevitably does. Whatever you make of Snowden, whistleblowers and campaigners such as Karen Silkwood and Erin Brockovich are both cynical about what they see and idealistic about what they can do about it. For many years, I too have tried to make sure that the cynicism in my outlook does not lead to cynicism in my behaviour.
That's not the only way in which a proper cynicism challenges the simplistic black-and-white of received opinion. The cynic would surely question the way in which the world is divided into optimists and pessimists. Cynicism has various dimensions, and just because some people take a dim view of human nature and some future probabilities, that does not mean they are hardcore pessimists who believe things can only get worse. Cynics refuse to be typecast as Jeremiahs. They are realists who know that the world is not the sun-kissed fantasy peddled by positive-thinking gurus and shysters.
Indeed, the greatest irony of all is that many of the people promoting optimism are unwittingly feeding a view of human nature that is cynical in the very worst sense. Take psychologist and neuroscientist Elaine Fox, who is on Horizon tonight talking about her book Rainy Brain, Sunny Brain. Like many, she traces our tendency to make positive or negative judgments back to our brains and the ways in which they have been cast by our DNA and shaped by our experience. Her upbeat conclusion is that by understanding the neural basis of personality and mood, we can change it and so increase our optimism, health and happiness.
The deeply cynical result of this apparently cheerful viewpoint is that it encourages us to see what we think and believe as products of brain chemistry, rather than as rational responses to the world as it is. Rather than focus on our reasons for being optimistic or pessimistic about, say, the environment, we focus instead on what in our brains is causing us to be optimistic or pessimistic. And that means we seek a resolution of our anxieties not by changing the world, but by changing our minds. If that's not taking a cynical view of human merit and potential, I don't know what is.
So far, I have avoided the easiest way to defend cynicism, which is to point to its illustrious pedigree in the ancient Hellenic school of philosophy from which it gets its name. But I would be cynical about that too. Words change their meanings, and so you cannot dignify the cynicism of now by associating it with its distant ancestor.
Nonetheless, there are lessons for modern cynicism from the likes of Diogenes and Crates. What they show is that a proper cynicism is not a matter of personality but intellectual attitude. Their goal was to blow away the fog and confusion and see reality with lucidity and clarity. The contemporary cynic desires the same. The questioning and doubt are not ends in themselves but a means of cutting through the crap and seeing things as they really are.
The ancient Cynics also advocated asceticism and self-sufficiency. There is something of this too in their modern-day counterparts, who are aware that we waste too much of our time and money on things we don't need, but that others require us to buy to make them rich. People who live rigorously by this cynicism are often seen as grumpy killjoys. To be light and joyful today means spending freely, without guilt, on whatever looks as if it will bring us pleasure. That merely shows how deeply our desires have been infected by the power of markets. It is the cynic who actually lives more lightly, unburdened by the pressure to always have more, not relying on purchases to provide happiness and contentment.
Finally, the Cynics were notorious for rejecting all social norms. Diogenes is said to have masturbated in public, while Crates lived on the streets, with only a tattered cloak. Whether anyone is advised to follow these specific examples is questionable, but it is surely true that we do not see enough challenging of tired conventions today. Isn't it astonishing, for example, how, once elected, MPs continue the daft traditions of jeering, guffawing and addressing their colleagues by ridiculous circumlocutory terms such as "the right honourable member"? It comes to something when the most controversial defiance of convention by a politician in recent years was Gordon Brown's refusal to wear a dinner jacket and bow tie. People would perhaps be less cynical about politicians if the politicians themselves would be more cynical.
Perhaps the biggest myth about cynicism is that it deepens with age. I think what really happens is that experience painfully rips away layers of scales from our eyes, and so we do indeed become more cynical about many of the things we naively accepted when younger. But the result of this is to make us see more sharply the difference between what really matters and all the dross and nonsense that clutters up life. So as cynicism about many – perhaps most – things rises, so too does our appreciation and affection for what is good and true. Cynicism leads to more tender feelings towards what is truly lovable. Similarly, doubting the reality of much-professed sincerity is a way of showing that you respect and value the rare and precious real deal.
It's time, therefore, to reclaim cynicism for the forces of light and truth. Forget about the tired old dichotomies of positive and negative, optimistic and pessimistic. We can't make things better unless we see quite how bad they are. We can't do our best unless we guard against our worst. And it's only by being distrustful that we can distinguish between the trustworthy and the unreliable. To do all this we need intelligent cynicism, which is not so much a blanket negativity, but a searchlight for the truly positive.


--------


The test: how cynical are you?

Do you think your pet only wants you for food? Or that Clare Balding is an ambitious back-stabber?

Clare Balding: self-obsessed back-stabber? Photograph: Richard Ansett
There are many ways to express one's cynicism – Diogenes the Cynic, for example, slept in a jar – but the true cynic knows he must be more cynical than anyone else, surprised by nothing but the boundless naivety of those around him. Use this handy checklist to see if you qualify: if you agree with seven or more of the following statements you may count yourself a super-cynic. Not that it means anything. I mean, who cares, right? We're all gonna die alone.
1 You believe that mankind has failed to achieve anything of interest or note since the moon landings were faked.
2 You can look upon the grinning face of George Osborne and still declare that all politicians are as bad as each other.
3 You feel the current cultural debate is missing nothing other than the widespread dissemination of your low opinion of Channel 4's output.
4 You remain convinced that they just throw all the recycling away at the other end.
5 You believe that the editor of the Daily Mail has a dangerously rose-tinted view of human nature.
6 You have a "tendency" to put "inverted commas" "around" "everything".
7 It is your firm conviction that professional wrestling is completely staged, with the outcome of every match determined beforehand, just like professional cricket and professional tennis.
8 You think your dog is only in it for the food.
9 You used to vote Tory out of naked self-interest, but it didn't work so now you don't vote.
10 You stopped watching the Olympics because you could no longer stand the sight of the ambitious, self-obsessed, back-stabbing Clare Balding.