
Wednesday 4 March 2015

Cricket’s great data debate: art v science

Andy Bull in The Guardian

In July 2007, after a history reckoned to stretch back almost 4,000 years, the game of draughts was finally solved. After two decades of work, a team of computer scientists at the University of Alberta finished sifting through the 500 billion billion possible positions on the board. Their computer programme, Chinook, was now unbeatable. So long as neither player made a mistake, every game it played was guaranteed to end in a draw. Later that same summer, Peter Moores was appointed as head coach of the England cricket team. Moores was one of the new breed of coaches: a numbers man, and a disciple of Michael Lewis’s much-abused book Moneyball. He even gave a copy to his batting coach, Andy Flower. Moores was so keen on advanced computer analysis that he used it as the sole basis for some of his decisions – the decision to recall Ryan Sidebottom to the side, for instance.

When Flower took over the team, he hired Nathan Leamon, a qualified coach and a former maths teacher, as the team’s analyst. The players nicknamed Leamon “Numbers”. He was extraordinarily meticulous. He used Hawk-Eye to draw up spreadsheets of every single ball delivered in Test cricket in the preceding five years. He ran match simulations – accurate to within 5% – to help England determine their strategies and their team selections. For the bowlers, he broke the pitch down into 20 blocks, each of them 100cm by 15cm, and told them which ones they should hit to best exploit the weaknesses Hawk-Eye had revealed in the opposing batsmen. Bowlers should aim to hit that particular block at least twice an over. Do that, Leamon told them, and it “markedly increases the chance of success”.
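For the technically minded, Leamon’s block scheme amounts to a simple binning exercise. Below is a minimal sketch in Python of how such a pitch map might work; the coordinate system, sample deliveries and target block are all invented for illustration, not taken from England’s actual analysis.

```python
# Toy illustration of Leamon-style pitch mapping: bin each delivery's
# landing point into a 100cm x 15cm block, then count hits per block.
# All names, coordinates and data here are hypothetical.
from collections import Counter

BLOCK_LENGTH_M = 1.00  # 100cm, measured down the pitch
BLOCK_WIDTH_M = 0.15   # 15cm, measured across the pitch

def block_for(x_m, y_m):
    """Return the (row, col) block a delivery lands in.

    x_m: distance down the pitch from the batsman's stumps, in metres.
    y_m: lateral offset from the line of middle stump, in metres.
    """
    return (int(x_m // BLOCK_LENGTH_M), int(y_m // BLOCK_WIDTH_M))

# One (invented) over of six deliveries: (x, y) landing points.
over = [(6.2, 0.10), (6.8, 0.12), (5.9, 0.20),
        (7.4, 0.05), (6.4, 0.11), (6.6, 0.13)]

hits = Counter(block_for(x, y) for x, y in over)
target = (6, 0)  # the block flagged for this batsman

# Leamon's rule of thumb: hit the target block at least twice an over.
print(f"Deliveries in target block this over: {hits[target]}")
```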

England, it was said, were making better use of the computer analysis than any other team in the world. And it was working. They won the World T20, the Ashes home and away, and became, for a time, the No1 team in all three formats of the game. Leamon’s work was picked out as one of the reasons why. And yet now that they’re losing, that very same approach is being singled out as one of the things they are doing wrong. You can see why. After England’s nine-wicket defeat to Sri Lanka, Eoin Morgan said: “Going in at the halfway stage I think we got 310, probably 25 or 30 above par, and again, stats back that up: par is 275, 280.” It was, Morgan thought, the bowlers who were to blame for the loss. They had delivered too many bad balls. He said he didn’t yet know why. “Over the next couple of days, we will get the Hawk-Eye stuff back and the proof will be in that.”

On Tuesday morning, Kevin Pietersen tweeted that England are “too interested in stats”. He was echoing Graeme Swann’s comments from last summer. “I’ve sat in these meetings for the last five years,” Swann said. “It was a statistics-based game. There was this crazy stat where if we get 239 – this was before the fielding restrictions changed a bit so it would be more now, I assume – we will win 72% of matches. The whole game was built upon having this many runs after this many overs, this many partnerships, doing this in the middle, working at 4.5 an over.” Swann said he was left shaking his head.
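The 239-run figure Swann describes is, in essence, a conditional frequency read off historical results. Here is a minimal sketch of how such a number could be derived, using entirely invented match records (the real analysis would draw on thousands of ODIs):

```python
# Toy version of the "score 239+ and win 72% of matches" stat:
# estimate the batting side's win rate when its first-innings total
# reaches a threshold. The records below are invented for illustration.
records = [
    # (first_innings_total, batting_side_won)
    (239, True), (251, True), (243, False), (260, True),
    (238, False), (245, True), (270, True), (241, False),
]

def win_rate_above(threshold, matches):
    """Empirical win rate over matches where the total met the threshold."""
    relevant = [won for total, won in matches if total >= threshold]
    return sum(relevant) / len(relevant) if relevant else None

print(f"Win rate when scoring 239+: {win_rate_above(239, records):.0%}")
```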

Two respected players, both speaking from fresh first-hand experience, agree that England have become too reliant on computer analysis to tell them what to do. But balance that against the irritation old pros in all sports feel about big data. Just last week the great blowhard of the NBA, Charles Barkley, unleashed this tirade: “All these guys who run organisations who talk about analytics, they all have one thing in common – they’re a bunch of guys who have never played the game, and they never got the girls in high school, and they just want to get in the game.” Analytics, Barkley added, were “just some crap that some people who were really smart made up to try and get in the game”.

Barkley was shot down in flames. As Bryan Curtis summed it up in his wrap over on Grantland, commentators argued that Barkley’s rant was “unintelligible” and “wholly useless”, that he was a “dinosaur” who “didn’t even realise that the war is over”, and that “the nerds make the decisions”. In England, though, where we’ve been slower to adopt analytics, the consensus seems to be that Swann and Pietersen are on to something. England’s over-reliance on the numbers has become a theme in the coverage of the team, particularly among ex-players. You can hear it when they bemoan, among other things, England’s reluctance to bowl yorkers at the stumps. That’s a tactic that has worked for years, one that has been honed by hard experience. But England’s analysis has told them that slow bouncers and full balls sent wide of off-stump are harder to score off.

The thing is, in an age when all teams are using computer analysis, a tactic isn’t good or bad because it looks that way, or because it is different to what has been done before. It is simply good if it works and bad if it doesn’t. The received wisdom is being challenged, and that’s a good thing. At the same time, cricket isn’t checkers. It can’t be solved by computer. It’s not a question of intuition versus analysis, or art v science, as David Hopps put it in a recent piece on Cricinfo. The laptop is just another tool in the box, useless unless the players understand the value of the information it provides, and no more valuable than their own ability to adapt and improvise during a match. If Swann and Pietersen are right, then England are wrong. At the same time, the lessons Leamon taught the team undoubtedly played a valuable part in their earlier success, something the sceptics seem to have forgotten.

Tuesday 3 March 2015

The economic case for legalising cannabis


The public wants it and it would be good for the economy. Why has the law not been changed?

Paul Birch in The Telegraph

Channel 4’s Drugs Live programme promises to examine what cannabis does to the brain. Many of us have already seen the clips of Jon Snow struggling after a massive dose of high-strength marijuana (the equivalent of forcing a teetotaller to down a bottle of vodka and then asking him how he feels).

But beyond the effects of cannabis on the brain, isn’t it time for a wider discussion on the potential effects of safe, regulated cannabis consumption on society?

How much is cannabis worth these days? According to the Institute for Social and Economic Research, up to £900m could be raised annually through taxation of a regulated cannabis market.

Meanwhile, £361m is currently spent every year on policing and treating users of illegally traded and consumed cannabis.

It seems a lot to spend on punishing people for an activity that most of us barely believe should be a crime any more. And that’s even before one factors in the potential benefit legalisation and regulation of cannabis could have for the UK exchequer: on these estimates, the combined swing – £900m raised plus £361m no longer spent – would be in the region of £1.26bn a year.

Then there is the job creation potential. In Colorado, which legalised marijuana at the beginning of 2014, 10,000 people now work in the marijuana industry: growing and harvesting crops, working in dispensaries, and making and selling equipment. Crime has fallen too: in the first three months after legalisation, Denver saw a 14.6 per cent drop in overall crime, with violent crime down 2.4 per cent and assaults down 3.7 per cent.

This reduction brought further savings and allowed stretched police forces to concentrate on more serious issues. Meanwhile, cannabis use by young people actually decreased, an uncomfortable fact for prohibitionists who argue that legalisation would simply encourage more teens to take up cannabis.

In an age when every penny of government spending is fought for, the demonstrated potential savings and revenues at the very least deserve serious investigation. Revenue raised from a regulated cannabis trade could be directed towards education on safe use of cannabis.

That’s why the next government – regardless of who leads it – should set up a Royal Commission into drug legislation.

Why a Royal Commission? Because I firmly believe this is a way forward for our fractured politics. A non-partisan commission can help politicians take hold of an issue and look at the evidence without fear of being blindsided by attacks from the other side. Parties can agree to participate, evidence can be heard, everyday people can submit and read facts, opinions and analysis: it’s a real opportunity to create the “evidence-based policy” to which every party claims to aspire.

Major party leaders are reluctant to grasp the nettle of drug legislation. It’s understandable, given the current association of drugs with criminality. Half of people in the UK think cannabis contributes to street crime. But this association is inevitable as long as cannabis itself is illegal. Only a dispassionate discussion on the merits of cannabis legalisation and regulation can break that link.

Cista is standing for election on this issue because we believe the practical evidence has reached tipping point. Legalisation and regulation of cannabis can benefit the economy, lift the burden on the criminal justice system, encourage education about healthy, informed choices, and help recreational and medicinal cannabis users to enjoy a clean, safe product without being forced to engage with the underworld. Cannabis in itself is not the problem: our current law is. And we’re all paying the price.

What scares the new atheists: The vocal fervour of today’s missionary atheism conceals a panic that religion is not only refusing to decline – but in fact flourishing

John Gray in The Guardian

In 1929, the Thinker’s Library, a series established by the Rationalist Press Association to advance secular thinking and counter the influence of religion in Britain, published an English translation of the German biologist Ernst Haeckel’s 1899 book The Riddle of the Universe. Celebrated as “the German Darwin”, Haeckel was one of the most influential public intellectuals of the late nineteenth and early twentieth century; The Riddle of the Universe sold half a million copies in Germany alone, and was translated into dozens of other languages. Hostile to Jewish and Christian traditions, Haeckel devised his own “religion of science” called Monism, which incorporated an anthropology that divided the human species into a hierarchy of racial groups. Though he died in 1919, before the Nazi Party had been founded, his ideas, and widespread influence in Germany, unquestionably helped to create an intellectual climate in which policies of racial slavery and genocide were able to claim a basis in science.

The Thinker’s Library also featured works by Julian Huxley, grandson of TH Huxley, the Victorian biologist who was known as “Darwin’s bulldog” for his fierce defence of evolutionary theory. A proponent of “evolutionary humanism”, which he described as “religion without revelation”, Julian Huxley shared some of Haeckel’s views, including advocacy of eugenics. In 1931, Huxley wrote that there was “a certain amount of evidence that the negro is an earlier product of human evolution than the Mongolian or the European, and as such might be expected to have advanced less, both in body and mind”. Statements of this kind were then commonplace: there were many in the secular intelligentsia – including HG Wells, also a contributor to the Thinker’s Library – who looked forward to a time when “backward” peoples would be remade in a western mould or else vanish from the world.

But by the late 1930s, these views were becoming suspect: already in 1935, Huxley admitted that the concept of race was “hardly definable in scientific terms”. While he never renounced eugenics, little was heard from him on the subject after the second world war. The science that pronounced western people superior was bogus – but what shifted Huxley’s views wasn’t any scientific revelation: it was the rise of Nazism, which revealed what had been done under the aegis of Haeckel-style racism.

It has often been observed that Christianity follows changing moral fashions, all the while believing that it stands apart from the world. The same might be said, with more justice, of the prevalent version of atheism. If an earlier generation of unbelievers shared the racial prejudices of their time and elevated them to the status of scientific truths, evangelical atheists do the same with the liberal values to which western societies subscribe today – while looking with contempt upon “backward” cultures that have not abandoned religion. The racial theories promoted by atheists in the past have been consigned to the memory hole – and today’s most influential atheists would no more endorse racist biology than they would be seen following the guidance of an astrologer. But they have not renounced the conviction that human values must be based in science; now it is liberal values which receive that accolade. There are disputes, sometimes bitter, over how to define and interpret those values, but their supremacy is hardly ever questioned. For 21st century atheist missionaries, being liberal and scientific in outlook are one and the same.

It’s a reassuringly simple equation. In fact there are no reliable connections – whether in logic or history – between atheism, science and liberal values. When organised as a movement and backed by the power of the state, atheist ideologies have been an integral part of despotic regimes that also claimed to be based in science, such as the former Soviet Union. Many rival moralities and political systems – most of them, to date, illiberal – have attempted to assert a basis in science. All have been fraudulent and ephemeral. Yet the attempt continues in atheist movements today, which claim that liberal values can be scientifically validated and are therefore humanly universal.

Fortunately, this type of atheism isn’t the only one that has ever existed. There have been many modern atheisms, some of them more cogent and more intellectually liberating than the type that makes so much noise today. Campaigning atheism is a missionary enterprise, aiming to convert humankind to a particular version of unbelief; but not all atheists have been interested in propagating a new gospel, and some have been friendly to traditional faiths.

Evangelical atheists today view liberal values as part of an emerging global civilisation; but not all atheists, even when they have been committed liberals, have shared this comforting conviction. Atheism comes in many irreducibly different forms, among which the variety being promoted at the present time looks strikingly banal and parochial.

In itself, atheism is an entirely negative position. In pagan Rome, “atheist” (from the Greek atheos) meant anyone who refused to worship the established pantheon of deities. The term was applied to Christians, who not only refused to worship the gods of the pantheon but demanded exclusive worship of their own god. Many non-western religions contain no conception of a creator-god – Buddhism and Taoism, in some of their forms, are atheist religions of this kind – and many religions have had no interest in proselytising. In modern western contexts, however, atheism and rejection of monotheism are practically interchangeable. Roughly speaking, an atheist is anyone who has no use for the concept of God – the idea of a divine mind, which has created humankind and embodies in a perfect form the values that human beings cherish and strive to realise. Many who are atheists in this sense (including myself) regard the evangelical atheism that has emerged over the past few decades with bemusement. Why make a fuss over an idea that has no sense for you? There are untold multitudes who have no interest in waging war on beliefs that mean nothing to them. Throughout history, many have been happy to live their lives without bothering about ultimate questions. This sort of atheism is one of the perennial responses to the experience of being human.

As an organised movement, atheism is never non-committal in this way. It always goes with an alternative belief-system – typically, a set of ideas that serves to show the modern west is the high point of human development. In Europe from the late 19th century until the second world war, this was a version of evolutionary theory that marked out western peoples as being the most highly evolved. Around the time Haeckel was promoting his racial theories, a different theory of western superiority was developed by Marx. While condemning liberal societies and prophesying their doom, Marx viewed them as the high point of human development to date. (This is why he praised British colonialism in India as an essentially progressive development.) If Marx had serious reservations about Darwinism – and he did – it was because Darwin’s theory did not frame evolution as a progressive process.

The predominant varieties of atheist thinking, in the 19th and early 20th centuries, aimed to show that the secular west is the model for a universal civilisation. The missionary atheism of the present time is a replay of this theme; but the west is in retreat today, and beneath the fervour with which this atheism assaults religion there is an unmistakable mood of fear and anxiety. To a significant extent, the new atheism is the expression of a liberal moral panic.


Illustration by Christoph Hitz

Sam Harris, the American neuroscientist and author of The End of Faith: Religion, Terror and the Future of Reason (2004) and The Moral Landscape: How Science Can Determine Moral Values (2010), who was arguably the first of the “new atheists”, illustrates this point. Following many earlier atheist ideologues, he wants a “scientific morality”; but whereas earlier exponents of this sort of atheism used science to prop up values everyone would now agree were illiberal, Harris takes for granted that what he calls a “science of good and evil” cannot be other than liberal in content. (Not everyone will agree with Harris’s account of liberal values, which appears to sanction the practice of torture: “Given what many believe are the exigencies of our war on terrorism,” he wrote in 2004, “the practice of torture, in certain circumstances, would seem to be not only permissible but necessary.”)

Harris’s militancy in asserting these values seems to be largely a reaction to Islamist terrorism. For secular liberals of his generation, the shock of the 11 September attacks went beyond the atrocious loss of life they entailed. The effect of the attacks was to place a question mark over the belief that their values were spreading – slowly, and at times fitfully, but in the long run irresistibly – throughout the world. As society became ever more reliant on science, they had assumed, religion would inexorably decline. No doubt the process would be bumpy, and pockets of irrationality would linger on the margins of modern life; but religion would dwindle away as a factor in human conflict. The road would be long and winding. But the grand march of secular reason would continue, with more and more societies joining the modern west in marginalising religion. Someday, religious belief would be no more important than personal hobbies or ethnic cuisines.

Today, it’s clear that no grand march is under way. The rise of violent jihadism is only the most obvious example of a rejection of secular life. Jihadist thinking comes in numerous varieties, mixing strands from 20th century ideologies, such as Nazism and Leninism, with elements deriving from the 18th century Wahhabist Islamic fundamentalist movement. What all Islamist movements have in common is a categorical rejection of any secular realm. But the ongoing reversal in secularisation is not a peculiarly Islamic phenomenon.

The resurgence of religion is a worldwide development. Russian Orthodoxy is stronger than it has been for over a century, while China is the scene of a reawakening of its indigenous faiths and of underground movements that could make it the largest Christian country in the world by the end of this century. Despite tentative shifts in opinion that have been hailed as evidence it is becoming less pious, the US remains massively and pervasively religious – it’s inconceivable that a professed unbeliever could become president, for example.

For secular thinkers, the continuing vitality of religion calls into question the belief that history underpins their values. To be sure, there is disagreement as to the nature of these values. But pretty well all secular thinkers now take for granted that modern societies must in the end converge on some version of liberalism. Never well founded, this assumption is today clearly unreasonable. So, not for the first time, secular thinkers look to science for a foundation for their values.

It’s probably just as well that the current generation of atheists seems to know so little of the longer history of atheist movements. When they assert that science can bridge fact and value, they overlook the many incompatible value-systems that have been defended in this way. There is no more reason to think science can determine human values today than there was at the time of Haeckel or Huxley. None of the divergent values that atheists have from time to time promoted has any essential connection with atheism, or with science. How could any increase in scientific knowledge validate values such as human equality and personal autonomy? The source of these values is not science. In fact, as the most widely-read atheist thinker of all time argued, these quintessential liberal values have their origins in monotheism.

* * *

The new atheists rarely mention Friedrich Nietzsche, and when they do it is usually to dismiss him. This can’t be because Nietzsche’s ideas are said to have inspired the Nazi cult of racial inequality – an unlikely tale, given that the Nazis claimed their racism was based in science. The reason Nietzsche has been excluded from the mainstream of contemporary atheist thinking is that he exposed the problem atheism has with morality. It’s not that atheists can’t be moral – the subject of so many mawkish debates. The question is which morality an atheist should serve.

It’s a familiar question in continental Europe, where a number of thinkers have explored the prospects of a “difficult atheism” that doesn’t take liberal values for granted. It can’t be said that anything much has come from this effort. Georges Bataille’s postmodern project of “atheology” didn’t produce the godless religion he originally intended, or any coherent type of moral thinking. But at least Bataille, and other thinkers like him, understood that when monotheism has been left behind morality can’t go on as before. Among other things, the universal claims of liberal morality become highly questionable.


Illustration by Christoph Hitz

It’s impossible to read much contemporary polemic against religion without the impression that for the “new atheists” the world would be a better place if Jewish and Christian monotheism had never existed. If only the world wasn’t plagued by these troublesome God-botherers, they are always lamenting, liberal values would be so much more secure. Awkwardly for these atheists, Nietzsche understood that modern liberalism was a secular incarnation of these religious traditions. As a classical scholar, he recognised that a mystical Greek faith in reason had shaped the cultural matrix from which modern liberalism emerged. Some ancient Stoics defended the ideal of a cosmopolitan society; but this was based in the belief that humans share in the Logos, an immortal principle of rationality that was later absorbed into the conception of God with which we are familiar. Nietzsche was clear that the chief sources of liberalism were in Jewish and Christian theism: that is why he was so bitterly hostile to these religions. He was an atheist in large part because he rejected liberal values.

To be sure, evangelical unbelievers adamantly deny that liberalism needs any support from theism. If they are philosophers, they will wheel out their rusty intellectual equipment and assert that those who think liberalism relies on ideas and beliefs inherited from religion are guilty of a genetic fallacy. Canonical liberal thinkers such as John Locke and Immanuel Kant may have been steeped in theism; but ideas are not falsified because they originate in errors. The far-reaching claims these thinkers have made for liberal values can be detached from their theistic beginnings; a liberal morality that applies to all human beings can be formulated without any mention of religion. Or so we are continually being told. The trouble is that it’s hard to make any sense of the idea of a universal morality without invoking an understanding of what it is to be human that has been borrowed from theism. The belief that the human species is a moral agent struggling to realise its inherent possibilities – the narrative of redemption that sustains secular humanists everywhere – is a hollowed-out version of a theistic myth. The idea that the human species is striving to achieve any purpose or goal – a universal state of freedom or justice, say – presupposes a pre-Darwinian, teleological way of thinking that has no place in science. Empirically speaking, there is no such collective human agent, only different human beings with conflicting goals and values. If you think of morality in scientific terms, as part of the behaviour of the human animal, you find that humans don’t live according to iterations of a single universal code. Instead, they have fashioned many ways of life. A plurality of moralities is as natural for the human animal as the variety of languages.

At this point, the dread spectre of relativism tends to be raised. Doesn’t talk of plural moralities mean there can be no truth in ethics? Well, anyone who wants their values secured by something beyond the capricious human world had better join an old-fashioned religion. If you set aside any view of humankind that is borrowed from monotheism, you have to deal with human beings as you find them, with their perpetually warring values.

This isn’t the relativism celebrated by postmodernists, which holds that human values are merely cultural constructions. Humans are like other animals in having a definite nature, which shapes their experiences whether they like it or not. No one benefits from being tortured or persecuted on account of their religion or sexuality. Being chronically poor is rarely, if ever, a positive experience. Being at risk of violent death is bad for human beings whatever their culture. Such truisms could be multiplied. Universal human values can be understood as something like moral facts, marking out goods and evils that are generically human. Using these universal values, it may be possible to define a minimum standard of civilised life that every society should meet; but this minimum won’t be the liberal values of the present time turned into universal principles.

Universal values don’t add up to a universal morality. Such values are very often conflicting, and different societies resolve these conflicts in divergent ways. The Ottoman empire, during some of its history, was a haven of toleration for religious communities who were persecuted in Europe; but this pluralism did not extend to enabling individuals to move from one community to another, or to form new communities of choice, as would be required by a liberal ideal of personal autonomy. The Hapsburg empire was based on rejecting the liberal principle of national self-determination; but – possibly for that very reason – it was more protective of minorities than most of the states that succeeded it. Protecting universal values without honouring what are now seen as core liberal ideals, these archaic imperial regimes were more civilised than a great many states that exist today.

For many, regimes of this kind are imperfect examples of what all human beings secretly want – a world in which no one is unfree. The conviction that tyranny and persecution are aberrations in human affairs is at the heart of the liberal philosophy that prevails today. But this conviction is supported by faith more than evidence. Throughout history there have been large numbers who have been happy to relinquish their freedom as long as those they hate – gay people, Jews, immigrants and other minorities, for example – are deprived of freedom as well. Many have been ready to support tyranny and oppression. Billions of human beings have been hostile to liberal values, and there is no reason for thinking matters will be any different in future.

An older generation of liberal thinkers accepted this fact. As the late Stuart Hampshire put it:
“It is not only possible, but, on present evidence, probable that most conceptions of the good, and most ways of life, which are typical of commercial, liberal, industrialised societies will often seem altogether hateful to substantial minorities within these societies and even more hateful to most of the populations within traditional societies … As a liberal by philosophical conviction, I think I ought to expect to be hated, and to be found superficial and contemptible, by a large part of mankind.”

Today this is a forbidden thought. How could all of humankind not want to be as we imagine ourselves to be? To suggest that large numbers hate and despise values such as toleration and personal autonomy is, for many people nowadays, an intolerable slur on the species. This is, in fact, the quintessential illusion of the ruling liberalism: the belief that all human beings are born freedom-loving and peaceful and become anything else only as a result of oppressive conditioning. But there is no hidden liberal struggling to escape from within the killers of the Islamic State and Boko Haram, any more than there was in the torturers who served the Pol Pot regime. To be sure, these are extreme cases. But in the larger sweep of history, faith-based violence and persecution, secular and religious, are hardly uncommon – and they have been widely supported. It is peaceful coexistence and the practice of toleration that are exceptional.
* * *

Considering the alternatives that are on offer, liberal societies are well worth defending. But there is no reason for thinking these societies are the beginning of a species-wide secular civilisation of the kind of which evangelical atheists dream.

In ancient Greece and Rome, religion was not separate from the rest of human activity. Christianity was less tolerant than these pagan societies, but without it the secular societies of modern times would hardly have been possible. By adopting the distinction between what is owed to Caesar and what to God, Paul and Augustine – who turned the teaching of Jesus into a universal creed – opened the way for societies in which religion was no longer coextensive with life. Secular regimes come in many shapes, some liberal, others tyrannical. Some aim for a separation of church and state as in the US and France, while others – such as the Ataturkist regime that until recently ruled in Turkey – assert state control over religion. Whatever its form, a secular state is no guarantee of a secular culture. Britain has an established church, but despite that fact – or more likely because of it – religion has a smaller role in politics than in America and is less publicly divisive than it is in France.
Illustration by Christoph Hitz

There is no sign anywhere of religion fading away, but by no means all atheists have thought the disappearance of religion possible or desirable. Some of the most prominent – including the early 19th-century poet and philosopher Giacomo Leopardi, the philosopher Arthur Schopenhauer, the Austro-Hungarian philosopher and novelist Fritz Mauthner (who published a four-volume history of atheism in the early 1920s) and Sigmund Freud – were atheists who accepted the human value of religion. One thing these atheists had in common was a refreshing indifference to questions of belief. Mauthner – who is remembered today chiefly because of a dismissive one-line mention in Wittgenstein’s Tractatus – suggested that belief and unbelief were both expressions of a superstitious faith in language. For him, “humanity” was an apparition which melts away along with the departing Deity. Atheism was an experiment in living without taking human concepts as realities. Intriguingly, Mauthner saw parallels between this radical atheism and the tradition of negative theology in which nothing can be affirmed of God, and described the heretical medieval Christian mystic Meister Eckhart as being an atheist in this sense.

Above all, these unevangelical atheists accepted that religion is definitively human. Though not all human beings may attach great importance to them, every society contains practices that are recognisably religious. Why should religion be universal in this way? For atheist missionaries this is a decidedly awkward question. Invariably they claim to be followers of Darwin. Yet they never ask what evolutionary function this species-wide phenomenon serves. There is an irresolvable contradiction between viewing religion naturalistically – as a human adaptation to living in the world – and condemning it as a tissue of error and illusion. What if the upshot of scientific inquiry is that a need for illusion is built into the human mind? If religions are natural for humans and give value to their lives, why spend your life trying to persuade others to give them up?

The answer that will be given is that religion is implicated in many human evils. Of course this is true. Among other things, Christianity brought with it a type of sexual repression unknown in pagan times. Other religions have their own distinctive flaws. But the fault is not with religion, any more than science is to blame for the proliferation of weapons of mass destruction or medicine and psychology for the refinement of techniques of torture. The fault is in the intractable human animal. Like religion at its worst, contemporary atheism feeds the fantasy that human life can be remade by a conversion experience – in this case, conversion to unbelief.

Evangelical atheists at the present time are missionaries for their own values. If an earlier generation promoted the racial prejudices of their time as scientific truths, ours aims to give the illusions of contemporary liberalism a similar basis in science. It’s possible to envision different varieties of atheism developing – atheisms more like those of Freud, which didn’t replace God with a flattering image of humanity. But atheisms of this kind are unlikely to be popular. More than anything else, our unbelievers seek relief from the panic that grips them when they realise their values are rejected by much of humankind. What today’s freethinkers want is freedom from doubt, and the prevailing version of atheism is well suited to give it to them.

To beat austerity, Greece must break free from the euro

Costas Lapavitsas in The Guardian
The agreement signed between Greece and the EU after three weeks of lively negotiations is a compromise reached under economic duress. Its only merit for Greece is that it has kept the Syriza government alive and able to fight another day. That day is not far off. Greece will have to negotiate a long-term financing agreement in June, and has substantial debt repayments to make in July and August. In the coming four months the government will have to get its act together to negotiate those hurdles and implement its radical programme. The European left has a stake in Greek success, if it is to beat back the forces of austerity that are currently strangling the continent.

In February the Greek negotiating team fell into a trap of two parts. The first was the reliance of Greek banks on the European Central Bank for liquidity, without which they would stop functioning. Mario Draghi, president of the European Central Bank, ratcheted up the pressure by tightening the terms of liquidity provision. Worried by developments, depositors withdrew funds; towards the end of negotiations Greek banks were losing a billion euros of liquidity a day.




The second was the Greek state’s need for finance to service debts and pay wages. As negotiations proceeded, funds became tighter. The EU, led by Germany, cynically waited until the pressure on Greek banks had reached fever pitch. By the evening of Friday 20 February the Syriza government had to accept a deal or face chaotic financial conditions the following week, for which it was not prepared at all.

The resulting deal has extended the loan agreement, giving Greece four months of guaranteed finance, subject to regular review by the “institutions”, ie the European Commission, the ECB and the IMF. The country was forced to declare that it will meet all obligations to its creditors “fully and timely”.

Furthermore, it will aim to achieve “appropriate” primary surpluses; desist from unilateral actions that would “negatively impact fiscal targets”; and undertake “reforms” that run counter to Syriza pledges to lower taxes, raise the minimum wage, reverse privatisations, and relieve the humanitarian crisis.

In short, the Syriza government has paid a high price to remain alive. Things will be made even harder by the parlous state of the Greek economy. Growth in 2014 was a measly 0.7%, while GDP actually contracted during the last quarter. Industrial output fell by a further 3.8% in December, and even retail sales declined by 3.7%, despite Christmas. The most worrying indication, however, is the fall in prices by 2.8% in January. This is an economy in a deflationary spiral with little or no drive left to it. Against this background, insisting on austerity and primary balances is vindictive madness.

The coming four months will be a period of constant struggle for Syriza. There is little doubt that the government will face major difficulties in passing the April review conducted by the “institutions” to secure the release of much-needed funds. Indeed, so grave is the fiscal situation that events might unravel even faster. Tax income is collapsing, partly because the economy is frozen and partly because people are withholding payment in the expectation of relief from the extraordinary tax burden imposed over the last few years. The public purse will come under considerable strain as early as March, when there are sizeable debt repayments to be made.

But even assuming that the government successfully navigates these straits, in June Greece will have to re-enter negotiations with the EU for a long-term financing agreement. The February trap is still very much there, and ready to be sprung again.

What should we as Syriza do and how could the left across Europe help? The most vital step is to realise that the strategy of hoping to achieve radical change within the institutional framework of the common currency has come to an end. The strategy has given us electoral success by promising to release the Greek people from austerity without having to endure a major falling-out with the eurozone. Unfortunately, events have shown beyond doubt that this is impossible, and it is time that we acknowledged reality.

For Syriza to avoid collapse or total surrender, we must be truly radical. Our strength lies exclusively in the tremendous popular support we still enjoy. The government should rapidly implement measures relieving working people from the tremendous pressures of the last few years: forbid house foreclosures, write off domestic debt, reconnect families to the electricity network, raise the minimum wage, stop privatisations. This is the programme we were elected on. Fiscal targets and monitoring by the “institutions” should take a back seat in our calculations, if we are to maintain our popular support.
At the same time, our government must approach the looming June negotiations with a very different frame of mind from February. The eurozone cannot be reformed and it will not become a “friendly” monetary union that supports working people. Greece must bring a full array of options to the table, and it must be prepared for extraordinary liquidity measures in the knowledge that all eventualities could be managed, if its people were ready. After all, the EU has already wrought disaster on the country.

Syriza could gain succour from the European left, but only if the left shakes off its own illusions and begins to propose sensible policies that might at last rid Europe of the absurdity that the common currency has become. There might then be a chance of properly lifting austerity across the continent. Time is indeed very short for all of us.

We’re desperate to believe in something. But bringing God into economics is risky

Eliza Filby in The Guardian

With just over two months to go until polling day, it is becoming clear that the most interesting ideas are emanating from those not seeking election. The Anglican bishops have issued a pastoral letter which, despite being mauled by leading Conservatives, legitimately aims to move the debate beyond the old market-v-state model towards a new vision, one that incorporates themes of civil society, interdependency, human dignity and the common good.

Meanwhile, leading Conservative thinkers Tim Montgomerie and Stephan Shakespeare have launched their “good right” initiative, which hopes to succeed where David Cameron has so obviously failed: to detoxify the Conservative brand. Making the Tories electable again is certainly the aim, but at its core is an even more ambitious endeavour: to re-establish the moral credibility of the free market. To this chorus of extra-parliamentary voices we might also add “blue Labour” Maurice Glasman and “red Tory” Phillip Blond and, for that matter, Russell Brand. Even if their ideas are unlikely to feature in forthcoming party manifestos, a movement is clearly afoot. This disparate group may differ on the remedy but share a diagnosis: the neoliberal revolution is politically and morally defunct. One way or another, they are all dancing on Thatcher’s grave.

But to those seeking a new moral vision for Britain, Thatcherism itself offers a cautionary tale. It was, much like now, a response to widespread disillusionment and a redundant political consensus. Like the “good right”, Conservatives in the 70s also sought to disconnect the association of collectivism with virtue and reinstate the moral integrity of the “invisible hand”. Margaret Thatcher would eventually cast herself as the shepherd leading the British people out of the dark days of decline towards the path of economic and social enlightenment. Ultimately, however, it was a story of false idols and unintended consequences – one where the mix of God, economics and single-minded vision proved to be toxic. The paradox of Thatcherism is that, like all political ideologies, there was a complete discrepancy between its aims and outcomes.

“Economics is the method; the object is to change the soul,” Margaret Thatcher declared in 1981, revealing the way in which Thatcherism for her was always about transforming values rather than simply GDP. A strong religious basis to her outlook stemmed from her father – the greengrocer, councillor and Wesleyan lay preacher, Alf Roberts.

If we were sourcing the origins of Thatcherism, we wouldn’t find them in the pages of Hayek’s Road to Serfdom or Milton Friedman’s monetarist theory but in Roberts’ sermon notes, now housed in Thatcher’s personal archive at Churchill College, Cambridge. Contained in them is the theological basis of Thatcherism: an individualistic interpretation of the Bible, a nod to the spiritual dangers of avarice, the Protestant work ethic, praise of the godly virtues of thrift and self-reliance and, finally, a divine justification for individual liberty and the free market. In short, Thatcherism always owed more to Methodism than to monetarism.

Thatcher herself had been a preacher before she entered politics, and even though she transferred this missionary energy from pulpit to podium, her religious values remained crucial. On becoming Conservative leader, she saw it as her chief mission to discredit the assumed moral superiority of socialism and reconnect the broken link between Protestant and capitalist values in Britain. Preaching from the pulpit on several occasions – most famously to the Church of Scotland’s General Assembly in 1988 – Thatcher unashamedly asserted the Biblical case for the sovereignty of individual liberty and the ‘invisible hand’. Thatcher’s pledge, of course, was that greater wealth would not encourage selfishness but neighbourliness. With more money in our pocket and less dependency on the state, we would be free to exercise our moral virtue and perform our duty as Good Samaritans.

We would not walk by on the other side, nor would we need state-imposed traffic lights to guide us there.

In the end, though, even she was prepared to admit she had failed in her crusade. When asked by Frank Field what her greatest regret in office was, she replied: “I cut taxes and I thought we would get a giving society, and we haven’t.” She was right. A survey conducted by the Charities Aid Foundation in 1989 revealed that those who gave the most to causes were not from the prosperous south but were disproportionately located in those areas that benefited least from the Thatcher boom.




Thatcher’s naivety was perhaps her greatest flaw: her understanding of capitalism, for example, was provincial rather than global; the image of market transaction in her mind was Alf Roberts behind the counter of his grocery shop, not the yuppie on the trading floor. It is little wonder, then, that she could not understand the world she had created, in which the nation’s homes and household budgets were entwined with a global financial services sector that made up an ever-growing percentage of Britain’s GDP – largely internationally owned, in the hands of speculators concerned with short-term gain, and distant from the deals and lives they were gambling on. In private Thatcher used to rage against bankers and their bonuses. Why, she would cry, did they not follow the example of the army, which in her view was the model demonstration of responsibility to one’s fellow man?

As someone reared in a home where profligacy was a vice and thrift a virtue, Thatcher could not fathom why so many Britons struggled with debt. Yet paradoxically it was her government that did most to encourage it. What might be termed the “democratisation of debt” – in the form of credit and store cards, personal loans and, of course, mortgages – fundamentally reordered the nation’s psyche and our attitudes towards money and the state. In short, we transferred our dependency from the government to the money-lenders. The notion of deferred gratification or thrift – that is, saving for something before consuming it – became an alien concept for Britain’s “grab now, pay later” society. Total consumer credit more than doubled, while the number of credit cards nearly tripled in the 1980s and would spiral to unimaginable levels over the next two decades. This culture of credit also trickled down the social scale: as the government squeezed the benefits system, low-income households turned to credit companies that asked few questions. In 1980, 22% of households were using credit; by 1989 that had more than trebled to 69%, with an estimated 50% of those loans going on essentials. As the New Economics Foundation’s 2002 report into debt recognised, this led to the absurd situation whereby “what the taxpayer was providing in terms of benefits, the lender was often taking away – with interest”. It is doubtful that even Thatcher considered Britain’s record personal debt as part of her plan of “setting the people free”.

Thatcherism laid the foundations for a culture in which individualism and self-reliance could thrive, but ultimately it created a culture in which only selfishness and excess were rewarded. Thatcher liked to quote John Wesley’s mantra, “Earn all you can, save all you can and give all you can,” and yet it was only ever the first instruction that was sufficiently encouraged. While Cameron and Osborne have spoken at length about paying off the ‘nation’s credit card’, they have consciously avoided entreating individuals to pay off their own. Tellingly, it is now a vote-winner to talk of governmental thrift but political suicide to talk of personal thrift. That is the true legacy of Thatcherite economics.


When Thatcher uttered those now immortal words that there was “no such thing as society”, it was not a negative or flippant statement but a naive rallying cry for individual moral responsibility. Perhaps the flaw in her thinking was not that she did not believe in society but that she had too much faith in man.

Thatcher seemed to have forgotten the key doctrine in both Conservative philosophy and the Bible: the Fall. Thatcherism was a challenge to individual moral virtue, yet in Thatcher’s Eden, when given the choice, we – of course – ate the fruit. Where critics tend to go wrong in their assessment of Thatcher is that they do not consider that there was any moral, only economic, thinking behind it; where Thatcher’s admirers go wrong is that they do not admit that there was a fundamental discrepancy between her aims and outcomes.

It is, of course, wrong to heap all the blame on Thatcher. This culture was encouraged and this behaviour continued unabated under New Labour. Much like a gangster’s wife who enjoys the lifestyle but does not question how her husband gets his money, Blair and Brown were content to pocket a significant share of the profits to fund their schools and hospitals.

By 2008 the world seemed on the precipice of something fundamental, but one of the remarkable features of the last seven years is how little has changed. Perhaps Thatcher’s great mistake was that, as Alfred Sherman said, “she saw life in primary colours”.

So there is credibility and value in dreaming up an alternative where Thatcher insisted that there was none. Given the contemporary disillusionment with capitalism, voters are still in desperate need of something to believe in. What the neoliberal experiment of the last 30 years teaches us is not that religion and politics do not mix, but that the politics of certainty is where danger lies.

Love Jihad - 'Attractive jihadists can lure UK girls to extremism'

BBC News, 3 March 2015
"Attractive" jihadist fighters can be "eye candy" to lure in British Muslim girls, a former extremist has said.
Ayesha - a false name to protect her identity - told BBC Newsnight she was taught to see the UK as "our enemy".
She now rejects that ideology, but said her ex-allies would regard the militant known as "Jihadi John" as an "idol".
Three schoolgirls recently left the UK, apparently to join militants in Syria - leading to questions over why British girls would make that choice.
Ayesha, from the Midlands, is now in her early 20s and said she was first contacted by extremists when she was a student aged 16 or 17.
She said a man sent her a Facebook message saying she was "very attractive" and telling her: "Now's the time to cover that beauty because you're so precious."
Ayesha said the message was "bordering on harassment" but it was the "best way I could have been targeted" because it played on her religious beliefs and told her she would "end up in hell" if she did not obey.
'Exciting'
And she said there was glamour as well as fear in what she saw.
"As a teenager I wanted to get my piece of eye candy and I'd take a good look, and all the YouTube videos, for some reason, they [the militants] were all really, really attractive.
"It was glamorous in the sense it was like 'oh wow, I can get someone who practises the same religion as me, who's not necessarily from my ethnicity and that's exciting'."
She added: "It was like, get with him before he dies.
"And then when he dies as a martyr you'll join him in heaven."
Ayesha was radicalised before the rise of Islamic State (IS), which has taken control of parts of Iraq and Syria, and was attracted by al-Qaeda and al-Shabab.
'Don't trust Britain'
"In some of the sermons we were encouraged that we shouldn't identify ourselves as British," she said.
Ayesha said she was told to view Britain as a "kuffar [non-Muslim] nation" that had killed many Muslims and was "our enemy".
"You don't trust the state, you don't trust the police, you don't send your children to state schools," she said.
She said she was told to view British women as "disgusting" and "practically like men".
But Ayesha said she eventually rejected these ideas.
She said the two main things that drove her away from the ideology were that it did "no justice to women" and that it said followers "have to go and kill someone that's non-Muslim".
Ayesha said her old associates would praise Mohammed Emwazi - known as "Jihadi John" - the British IS militant who has apparently featured in videos showing the beheading of several Western hostages.
"They'd definitely consider him a role model," she said.
"He is someone they would be really proud of."

Sunday 1 March 2015

14 Things To Know Before You Start Meditating

Sasha Bronner in The Huffington Post

New to meditating? It can be confusing. Not new to meditating? It can still be confusing.
The practice of meditation is said to have been around for thousands of years -- and yet, in the last few years, especially in America, it seems that everyone knows at least one person who has taken on the ancient art of de-stressing.
Because it has been around for so long and because there are many different types of meditation, there are some essential truths you should know before you too take the dive into meditation or mindfulness (or both). Take a look at the suggestions below.
1. You don't need a mantra (but you can have one if you want). 
It has become common for people to confuse mantra with the idea of an intention or specific words to live by. A motto. But the actual word "mantra" means something quite different. Man means mind and tra means vehicle. A mantra is a mind-vehicle. Mantras can be used in meditation as a tool to help your mind enter (or stay in) your meditation practice.
Other types of meditation use things like sound, counting breaths or even just the breath itself as a similar tool. Another way to think about a mantra is like an anchor. It anchors your mind as you meditate and can be what you come back to when your thoughts (inevitably) wander.
2. Don’t expect your brain to go blank.
One of the biggest misconceptions about meditation is that your mind is supposed to go blank and that you reach a super-Zen state of consciousness. This is typically not true. It's important to keep in mind that you don’t have to try to clear thoughts from your brain during meditation.
The "nature of the mind to move from one thought to another is in fact the very basis of meditation," says Deepak Chopra, a meditation expert and founder of the Chopra Center for Wellbeing. "We don’t eliminate the tendency of the mind to jump from one thought to another. That’s not possible anyway." Depending on the type of meditation you learn, there are tools for gently bringing your focus back to your meditation practice. Alternatively, some types of meditation actually emphasize being present and mindful to thoughts as they arise as part of the practice.
3. You do not have to sit cross-legged or hold your hands in any particular position. 
You can sit in any position that is comfortable to you. Most people sit upright in a chair or on a cushion. Your hands can fall gently in your lap or at your sides. It is best not to lie down unless you’re doing a body scan meditation or meditation for sleep.
4. Having said that, it’s also okay if you do fall asleep. 
It’s very common to doze off during meditation and some believe that the brief sleep you get is actually very restorative. It’s not the goal, but if it’s a byproduct of your meditation, that is OK. Other practices offer tricks for staying alert (check out No. 19 in these tips from Headspace), like sitting upright in a chair. In our experience, the relaxation that can come from meditation is a wonderful thing -- and if that means a mini-snooze, so be it.
5. There are many ways to learn.
With meditation becoming so available to the masses, you can learn how to meditate alone, in a group, on a retreat, with your phone or even by listening to guided meditations online. Everyone has a different learning style and there are plenty of options out there to fit individual needs. Read our suggestions for how to get started.
6. You can meditate for a distinct purpose or for general wellness.
Some meditation exercises are aimed at one goal, like helping to ease anxiety or helping people who have trouble sleeping. One popular mindfulness meditation technique, loving-kindness meditation, promotes the positive act of wishing ourselves or others happiness. However, if you don't have a specific goal in mind, you can still reap the benefits of the practice.
8. It can also physically change your brain.
Researchers have not only looked at the brains of meditators and non-meditators to study the differences, but they have also started looking at a group of brains before and after eight weeks of mindfulness meditation. The results are remarkable. Scientists noted everything from "changes in grey matter volume to reduced activity in the 'me' centers of the brain to enhanced connectivity between brain regions," Forbes reported earlier this year.
Those who participated in an eight-week mindfulness program also showed signs of a shrinking of the amygdala (the brain’s "fight or flight" center) as well as a thickening of the pre-frontal cortex, which handles brain functions like concentration and awareness.
Researchers also looked at brain imaging on long-term, experienced meditators. Many, when not in a state of meditation, had brain image results that looked more like the images of a regular person's brain while meditating. In other words, the experienced meditator's brain is remarkably different than the non-meditator's brain.
9. Oprah meditates.
So do Paul McCartney, Jerry Seinfeld, Howard Stern, Lena Dunham, Barbara Walters, Arianna Huffington and Kobe Bryant. Oprah teams up with Deepak Chopra for 21-day online meditation experiences that anyone can join, anywhere. The program is free and the next one begins in March 2015.
10. It’s more mainstream than you might think.
Think meditation is still a new-age concept? Think again. GQ magazine wrote its own guide to Transcendental Meditation. Time’s February 2014 cover story was devoted to "the mindful revolution" and many big companies, such as Google, Apple, Nike and HBO, have started promoting meditation at work with free classes and new meditation rooms.
11. Mindfulness and meditation are not the same thing.
The two are talked about in conjunction often because one form of meditation is called mindfulness meditation. Mindfulness is defined most loosely as cultivating a present awareness in your everyday life. One way to do this is through meditation -- but not all meditation practices necessarily focus on mindfulness.
Mindfulness meditation is referred to most often when experts talk about the health benefits of meditation. Anderson Cooper recently did a special on his experience practicing mindfulness with expert Jon Kabat-Zinn for "60 Minutes."
12. Don’t believe yourself when you say you don’t have time to meditate.
While some formal meditation practices call for 20 minutes, twice a day, many other meditation exercises can be as short as five or 10 minutes. We easily spend that amount of time flipping through Netflix or liking things on Instagram. For some, it’s setting the morning alarm 10 minutes earlier or getting off email a few minutes before dinner to practice.
Another way to think about incorporating meditation into your daily routine is to liken it to brushing your teeth. You might not do it at the exact same time each morning, but you always make sure you brush your teeth before you leave the house for the day. For those who start to see the benefits of daily meditation, it becomes a non-negotiable part of their routine.
13. You may not think you’re “doing it right” the first time you meditate.
Or the second or the third. That’s OK. It’s an exercise that you practice just like sit-ups or push-ups at the gym. You don’t expect a six-pack after one day of exercise, so think of meditation the same way.
14. Take a step back.
Many meditation teachers encourage you to assess your progress by noticing how you feel in between meditations -- not while you’re sitting down practicing one. It’s not uncommon to feel bored, distracted, frustrated or even discouraged some days while meditating. Hopefully you also have days of feeling energized, calm, happy and at peace. Instead of judging each meditation, try to think about how you feel throughout the week. Less stressed, less road rage, sleeping a little bit better? Sounds like it's working.