
Sunday 10 July 2011

Transcendental Meditation: Were the hippies right all along?


For years, it has been ridiculed as a 1960s embarrassment. Now Transcendental Meditation is back in a big way. So were those hippies on to something all along?
By Laura Tennant
Sunday, 10 July 2011, The Independent
Remember M-People's 1995 Top 10 hit instructing you to "search for the hero inside yourself"? A decade-and-a-half on, it seems that things have changed – these days, it's not so much a hero as a guru that many of us are hoping to internalise. For strange as it may sound, among those of us who seek to surf the zeitgeist, the most fashionable thinker of 2011 may turn out to be Maharishi Mahesh Yogi, the founder of the Transcendental Meditation movement – and the fact that he passed to a better place in 2008 doesn't appear to have discouraged us one bit.
TM, as its followers call it, is rapidly moving from kooky margin to respectable mainstream thanks largely to a burgeoning body of scientific research which indicates that regular meditators can expect to enjoy striking reductions in heart attack, stroke and early mortality (as much as 47 per cent, according to one study). And the apparent benefits don't stop there: according to a pilot study just published in the US journal Military Medicine, veterans of the Iraq and Afghanistan wars showed a 50 per cent reduction in their symptoms of post-traumatic stress disorder after eight weeks of TM.
Meanwhile, educational establishments which introduce a "quiet time programme" – as did Visitacion Valley Middle School in San Francisco – report drops in fights and suspensions, increased attendance and improvements in exam results. In this country, the Maharishi School in Ormskirk, Lancashire, gets glowing reports from Ofsted and achieves exceptional academic results.
An estimated four million people now practise TM globally – 20 minutes twice daily, as per the Maharishi's prescription – many of them for decades, and there are some famous, and rather surprising, names on the list. Clint Eastwood, for example, has been doing it for 40 years, a fact he vouchsafed via video link at a fund-raising dinner for the David Lynch Foundation, an organisation set up by the film-maker to teach TM to school children, soldiers suffering post-traumatic stress, the homeless and convicted prisoners. Other celebrity adherents include Paul McCartney, Russell Brand, Martin Scorsese, Ringo Starr, Mary Tyler Moore, Laura Dern and Moby.
TM reaches far into the rational and sceptical world, too; the American philosopher Daniel Dennett does it, as does Dr Jonathan Rowson, head of the Social Brain project at the Royal Society for the Encouragement of Arts, Manufactures and Commerce (RSA) and a chess grandmaster (more from them later). Now Dr Norman Rosenthal, a psychiatrist with 30 years' clinical experience, has written a book, Transcendence: Healing and Transformation through Transcendental Meditation, which gathers all the available evidence for TM and urges healthcare professionals to offer it to patients suffering from mental illnesses ranging from mild depression to bipolar disorder.
While the research on the health benefits of TM is fascinating, there's another, more compelling, reason why meditation is in the air just now. Done consistently, it seems to offer some sort of corrective to modernity, a respite from anxiety and the ability to really, truly relax, without chemical assistance; a break from our constant, restless and often doomed aspirations to be thinner, richer and more popular on Facebook; the welcome discovery that happiness is to be found not in retail therapy, but within.
Those spiritual cravings explain why Rosenthal's book is now riding high at number 14 on America's Publishers Weekly non-fiction list. And according to TM UK's official representative, David Hughes, there's a similar surge of interest on this side of the Atlantic; figures are vague, but he reports that "there's definitely an ongoing increase month by month" to the estimated 200,000 people who have learnt TM in the UK since 1960.
I first began to ponder the notion of meditation while writing a piece on solitude. While aloneness might not be a state that comes naturally to most humans, without it, mental-health experts believe, it is impossible to be creative or even really to know oneself. It was the sheerest coincidence that on the day I contacted TM's UK website they were preparing for Dr Rosenthal's press conference.
My own adventures in TM began soon after – but first, a little history for readers too young to remember TM's 1960s "first wave". Many of those who do recall the arrival of the Maharishi Mahesh Yogi in Britain in 1967 understandably feel that TM has been discredited beyond hope of rehabilitation by years of embarrassing rumours and implausible claims. Long before his death, the Maharishi's leadership of the movement had been associated with an unseemly desire to cash in on his celebrity followers – including, most famously, The Beatles (like McCartney, George Harrison continued to meditate every day until he died) – and the accumulation of a substantial personal fortune (in 1998, the movement's property assets were valued at $3.5bn). Sexual impropriety was also alleged; The Beatles were said to have fallen out with the Maharishi at least partly because of his attempted seduction of Mia Farrow, or possibly her sister Prudence, at his ashram in India.
Generations of Oxford undergraduates have joked about nearby Mentmore Towers, the Buckinghamshire mansion where the Maharishi installed 100 young men in 1979 to practise continuous, advanced-level TM (they've since been retired). The inherently comical idea of yogic flying (actually yogic hopping) has always strained credibility, as has the Maharishi's claim that if 1 per cent of the globe's population practised TM, the flow of "good vibrations" would bring about a universal state of "bliss consciousness".
Then there was the Natural Law Party, the "political arm" of the TM movement, extant from 1992 to 1999 and set up, according to David Hughes, to "get the message across" about TM and also, bizarrely, the dangers of GM food. The party was a resounding flop – testament, perhaps, to the British mistrust of mysticism and religiosity in politics.
TM also infuriates many militant atheists in a way that "mindfulness meditation", which draws on the Buddhist tradition, does not. Sam Harris is a neuroscientist and the author of books including The End of Faith and The Moral Landscape and a blog, On Spiritual Truths. In a recent piece for The Huffington Post entitled "How to Meditate", he remarks that: "Even an organisation like Transcendental Meditation, which has spent decades self-consciously adapting itself for use by non-Hindus, can't overcome the fact that its students must be given a Sanskrit mantra as the foundation of the practice. Ancient incantations present an impediment to many a discerning mind (as does the fact that TM displays several, odious signs of being a cult)."
Against these objections should be set the fact that people who start meditating tend to keep at it, often for the rest of their lives – a phenomenon suggesting that its benefits, while slow and cumulative, are palpable. The aforementioned Dr Rowson, who was British chess champion from 2004 to 2006, has been practising TM for 14 years. "I'd say that TM is physiologically very powerful, and spiritually a bit shallow," he says. "There are few things better for giving you a feeling of serenity, energy and balance. But I don't think it gives you any particular insight into your own mind."
It seems that scientific research backs his experience. The bestselling Dr Rosenthal came to public prominence through his work on seasonal affective disorder at the National Institute of Mental Health in Maryland, where he also pioneered the use of light therapy to treat it. His interest in TM was piqued when one of his bipolar patients described how practising TM alongside his regular medication had helped him move from "keeping his head above water" to feeling "really happy 90 per cent of the time".
Dr Rosenthal began to examine the large body of scientific research into the effects of TM on long-term users, and also to collect anecdotal evidence from meditators. His book Transcendence is the result, though as he acknowledges in his introduction, "Some of you may find this preview of the benefits of TM – this seemingly simple technique – exaggerated and hard to believe. I don't blame you." He draws on 340 peer-reviewed research articles to back his argument that TM can not only reduce the incidence of cardiovascular disease, but also assist in treating addiction, post-traumatic stress disorder, ADHD and depression, not to mention helping high-functioning individuals achieve greater "self-actualisation".
Listening to Rosenthal talk, I was impressed by his medical experience and academic credentials. Yet TM's ability to reduce one's risk of heart disease interested me less than its effects on mental wellbeing and creativity. Maslow's famous hierarchy of needs described "self-actualisation" as the thing humans seek when their six basic needs for food, safety, physical shelter, love, sex and a sense of belonging have been met. Like many other evolved and somewhat spoilt beneficiaries of the affluent West, I too wanted to self-actualise, and I hoped TM could help me do it.
Acquiring the skill isn't difficult, but it does require time and money. Fees are charged on a sliding scale according to income – courses start at £190 for children and rise to £590. Initiates attend four sessions, and are given a Sanskrit mantra, which is repeated soundlessly in one's head while meditating. The objective, according to TM's website, is that "the mind effortlessly transcends mental activity and experiences pure consciousness at the source of thought, while the body experiences a unique state of restfulness".
The first thing I noticed was that repeating the "sound vibration" of my mantra took me to a place which was neither wakefulness, sleeping nor dreaming. Over the course of subsequent sessions I've regularly become detached from my physical self and dipped in and out of this "fourth state" of consciousness. Allowing sometimes painful thoughts and feelings to come to the surface has brought tears to my eyes, but I've also reached important decisions.
A month into my practice, I have not so far experienced "bliss", a condition beyond time and space in which one is not "ebulliently happy", as Rosenthal puts it, but "calm and alert"; a state, he explains, in which one realises that "just to be is a blessing". But I'm prepared to believe the effects are gradual and I'm struck by the fact that I no longer resent the necessary investment of time.
The effectiveness of this daily "yoga for the mind", as the meditator and fashion designer Amy Molyneux calls it, is the reason, I think, that thousands of people can ignore the Maharishi's theory in favour of his practice. But depending on your point of view, TM's spiritual aspects remain problematic. When the Maharishi School was granted "free school" status, for example, allowing it to scrap its annual £7,600 fees and receive Government funding, hackles were raised in more determinedly sceptical quarters.
Should we be concerned that a school infused with the TM philosophy is getting Government funding? To find out whether the organisation merited the accusations of "cultishness" levelled at it, I spoke to Suzanne Newcombe, a research officer for Inform, the charity run by the London School of Economics to provide information about new religious movements or "cults". "We've had a certain number of complaints from members of the public about the fee structure," she told me. "And occasionally relatives may be anxious about people who commit their lives to the movement. But we're not overly concerned about adults making decisions for themselves which don't hurt anyone else."
According to David Hughes, TM is a not-for-profit, charitable and educational foundation which, once it has paid its teachers and covered its costs, ploughs its revenue back into outreach programmes in the developing world. It is certainly not shy about proselytising; but if its impact on public health is as great as Dr Rosenthal believes, one could argue it has a moral responsibility to spread its message. As for me, I'm seriously considering introducing my children to a stress- and anxiety-busting daily ritual that seems to do no harm and may well do a great deal of good.

Saturday 9 July 2011

Fiction takes you to places that life can't

Philip Hensher in The Independent:

It takes a novelist, not a psychologist, to explain why people sometimes behave out of character
Saturday, 9 July 2011
 
What's it like to die? There's no answer to this cheerful question, or there shouldn't be.

People have told us what it's like nearly to die, to come back from the brink. The external process of death has been gone over in great detail. But no one has definitively returned from the other side, to tell us what it's like to feel the last breath leaving your body. We don't know anything about it.

Or rather, we shouldn't know anything about it. In 1886, Tolstoy published a short story called "The Death of Ivan Ilych", which follows a fairly unremarkable man to the complete extinction of life. After reading that, you feel you know what death will be like: "Suddenly some force struck him in the chest and side, making it still harder to breathe, and he fell through the hole and there at the bottom was a light. What had happened to him was like the sensation one sometimes experiences in a railway carriage when one thinks one is going backwards while one is really going forwards and suddenly becomes aware of the real direction." How could Tolstoy possibly know that? You will read any number of academic studies of the processes of death without coming near the novelist's instinctive understanding.

A wonderful Canadian academic and psychologist, Keith Oatley, has carried out some research on readers and non-readers of fiction, and has questioned this widespread assumption. Speaking to the Today programme this week, he shared his conclusion that habitual readers of novels were much better at coping with social situations and with a wide range of human beings. The usual image of the thick-lensed bookworm who can't cope with people – Philip Larkin's character who says "when getting my nose in a book/cured most things short of school" – is far from reality.

Well, all of us Dewey-botherers knew that. I guess from day one, I had a general sense that novels were going to introduce me to more sorts of people than life would. There was Mummy and Daddy and my big sister; there was Mr and Mrs Griffiths next door, and there were the Skittles at the end of the garden. On the other hand, if you opened a book, there was Dorothy and her friends the lion and the tinman and a boy called Tip, later transformed into Princess Ozma of Oz.

Later on, there were girls who went away to a super school called Malory Towers, not very much like anyone I knew; there were robots and Boy Detectives and a talking spider called Charlotte (who died) and a foul-tempered talking pudding and a larrikin koala, some rather intimidating children called Bastable and a boy called Philip Pirrip.

Whenever I hear someone say "I don't read novels – I prefer to read about the truth," I wonder about their notion of "the truth". The conviction that reading fiction is a dispensable part of a rich, full life is a widely held one. Members of my own family, to this day, will say to me if they find me engrossed in a thriller, "If you're not doing anything...".

The saddest expression of this attitude must be Quentin Crisp's famous landlady, who was always commenting on his actions. If she came across him having his lunch, she would say "Eating." If she saw him sewing a button on, she would say "Mending." Once, she found him reading a novel. She looked at him, and said "Waiting."

I don't suppose any reader complains for a moment that his life is failing to introduce him to as interesting a collection of people as he will find in 10 minutes in the nearest bookshop. On the other hand, real life has a way of intruding itself. You can't live your life entirely within the pages of a novel, as much as some of us attempt to. And when real life starts to expand beyond the small domestic circle, then your reading of novels is going to prepare you for what life can hold. India is not completely strange if you have read Narayan; nor is old age after Elizabeth Taylor's Mrs Palfrey at the Claremont.

Fiction won't tell you the whole story, but it will take you to places that life won't – Sicilian ducal houses, 13th-century convents, cities in Calvino that never existed. And sometimes, with a shock of recognition, you meet in real life a friend from a book. I have a dear old German friend who, the very first time I met him, made me think "Snufkin". He really was Tove Jansson's charismatic, silent, solitary wanderer to the life. I wouldn't have known what to make of him without those magical novels.

How do novelists do it? They throw themselves into lives very unlike their own; their imaginative reconstructions are apt to be as convincing as reports back from experience. Tolstoy knows what it is like to die; Stephen Crane tells us what war is like in The Red Badge of Courage, having experienced battle only after writing it. Conrad undoubtedly knew what it was like to endure a stupendous tropical storm. Thousands of sailors went through events like the ones described in Typhoon, but only one had the imaginative sympathy to write it down.

As Martin Amis has said, we still have no real idea what it is like to go into space. No one who has done so has had the ability to write well about the experience. Whatever systematic analysis is undertaken of a human experience, still the novelist's human spread seems the most substantial, authentic, accurate account.
Psychologists can offer explanations of behaviour, but they can't explain why people sometimes act out of character, or against their own interests. Even so subtle an analyst of behaviour as Erving Goffman, say, would struggle to account for the moment at the end of Vanity Fair where Becky Sharp hands Amelia Osborne the letter, destroying her own interests. And yet we know it to be true in the deepest sense.

The writer Marc Abrahams has shared an amusing encounter with a psychologist, who told him: "Whenever any group of really good research psychologists gets together socially, after a few drinks they always – and I do mean always – talk about why novelists are so much better at it than we are."

It's true. No psychologist is as good a psychologist as Graham Greene, let alone Tolstoy. And it's also true that no social life contains the range and interest of a shelf of novels. We love our friends: human beings fascinate us endlessly; and to teach us how they work, there are always novels. I've never met anyone remotely like Emma Bovary, Miss Flite, or Belinda, the madcap genius of the Fourth Form at Malory Towers. But one day, they'll come along, and when they do, I'll recognise them instantly.

Tuesday 28 June 2011

Dravid and the art of defence


India's No. 3 is a living testament to the belief that you need application and will more than talent to succeed in sport
Sanjay Manjrekar
June 28, 2011
 

[Image: Rahul Dravid pulls on his way to 62, ACT XI v Indians, 1st day, Canberra, 10 January 2008. For a defensive batsman, Dravid is extraordinarily skilled at pulling the short ball © Getty Images]
The pitch at Sabina Park was challenging and the Test match was in the balance, but Rahul Dravid would agree that a more experienced bowling attack would have tested him more. Dravid's 151 Tests against the 69 of the West Indian bowlers combined was always going to be a mismatch. But while this was not one of his best hundreds by any stretch of the imagination, it was an important one nevertheless, given the stage his career is at. And it allows us to dwell a bit on the Dravid success story as he completes 15 years in international cricket.
To start with, success does not come as easily to Dravid as it seems to do to others: you get the feeling that he has had to work at it a little more.
I believe Dravid can be a more realistic batting role model for young Indian batsmen than a Tendulkar, Sehwag or VVS Laxman, for Dravid is the least gifted on that list. While Tendulkar is a prodigious, rare talent, Dravid's basic talent can be found in many, but what he has made of it is the rare, almost unbelievable, Dravid story. That you don't need great talent to succeed as a sportsman is reinforced by Dravid's achievements over the last 15 years. And that he is now an all-time Indian batting great highlights his speciality: his ability to over-achieve. Indeed, he would probably have performed beyond his talent in any profession of his choosing. Indian cricket is fortunate that he chose it.
For a batsman of his nature and skills, that he ended up playing 339 one-day internationals, and still contributes to his IPL team in Twenty20, shows his strength of mind. It is a mindset that sets almost unreasonably high goals for his talents to achieve and then wills the body on to achieve them.
Dravid is a defensive batsman who has made it in a cricket world that fashions and breeds attacking batsmen. If he had played in the '70s and '80s, life would have been easier for him. Those were times when a leave got nods of approval and admiration from the spectators.
Dravid has played the bulk of his cricket in an era when defensive batting is considered almost a handicap. This is why it is rare to see a defensive batsman come through the modern system. Young batsmen with a defensive batting mindset choose to turn themselves into attacking players, for becoming a defensive player in modern cricket is not considered a smart choice.
Not to say that Dravid has been all defensive, though. He has one shot that is uncommon in a defensive Indian batsman: the pull. It is a superb instinctive stroke against fast bowling, and it is a stroke Dravid has had from the outset; a shot that has bailed him out of many tight situations in Tests.
When I saw him at the start of his career, I must confess Dravid's attitude concerned me. As young cricketers, we were often reminded to not think too much - and also sometimes reprimanded by our coaches and senior team-mates for doing so. Being a thinker in cricket, it is argued, makes you complicate a game that is played best when it is kept simple. I thought Dravid was doing precisely that: thinking too much about his game, his flaws and so on. I once saw him shadow-playing a false shot that had got him out. No problem with that, everyone does it. Just that Dravid was rehearsing the shot at a dinner table in a restaurant! This trait in him made me wonder whether this man, who we all knew by then was going to be the next No. 3 for India, was going to over-think the game and throw it all away. He reminded me a bit of myself.







Somewhere down the line, much to everyone's relief, I think Dravid managed to strike the right balance. He seemed to tone down the focus on his mistakes, and the obsession over his game and his technique, and started obsessing over success instead. Judging from all the success he has had over the years, I would like to think that Dravid, after his initial years, may have lightened up on his game. Perhaps he looks a lot more studious and intense on television to us than he actually is out there.
Dravid has to be the most well-read Indian cricketer I have come across, and it's not just books about cricket or sports he reads. I was surprised to discover that he had read Freedom at Midnight, about the partition of India, when he was 24. Trust me, this is very rare for a cricketer at that age. You won't find a more informed current cricketer than him - one who is well aware of how the world outside cricket operates.
Most of us cricketers develop some understanding of the world only well after we have quit the game. Until then, though experts of the game, we remain naïve about lots of things. I think this awareness of the outside world has helped Dravid put his pursuit of excellence in the game of his choice in perspective. At some point in his career he may have come to accept that cricket is just a sport and not a matter of life and death - even if he seemed prepared to work at it like it was.
Life isn't that easy, as I have said, for a defensive batsman in this age, when saving runs rather than taking wickets is the general approach of teams. A defensive batsman's forte is his ability to defend the good balls and hit the loose ones for four. But with bowlers these days often looking to curb batsmen with very defensive fields, batting becomes a bit of a struggle for players like Dravid.
It is a struggle he is content with, though. He has not committed the folly of being embarrassed about grinding when everyone around him is attacking and bringing the crowd to their feet. He is quite happy batting on 20 when his partner has raced to 60 in the same time. Once he is past 50, he seems to get into this "mental freeze" state, where it does not matter to him if he is stuck on 80 or 90 for an hour; he resists the temptation to do anything different to quickly get to the next stage of the innings. It is a temptation that many defensive batsmen succumb to after hours at the crease, when the patience starts to wear, and there is the temptation to hit over the infield, for example, to get a hundred. Dravid knows this is something that Sehwag can get away with, not him.
He has resisted that impulse and has developed the mind (the mind, again) to enjoy the simple task of meeting ball with bat, even if it does not result in runs, and he does this even when close to a Test hundred. The hundred does come eventually, and after it does, the same discipline continues - in that innings and the next one. A discipline that has now got him 12,215 runs in Test cricket.

Sunday 26 June 2011

Talent. Graft. Bottle?

Musa Okwonga: The annual Wimbledon conundrum

The Independent
It's nerve; it's grit; it's the key ingredient that makes a true champion. As Andy Murray aims to break his Grand Slam duck, our writer gets to the root of what every winner needs
Sunday, 26 June 2011
Bottle. It's an odd word to describe the spirit that all athletes need when faced with unprecedented pressure, but it somehow seems to have stuck. There are several conflicting and convoluted suggestions as to its origin: the most recurrent is that "bottle" is derived from Cockney rhyming slang, "bottle and glass". If you've got plenty of "bottle and glass", so the slang goes, then that means that you've got plenty of "arse" when you're confronted with a career-defining test.
Bottle isn't like muscle: it's not visible to the naked eye. At first glance, most of the world's leading sportsmen and sportswomen look routinely impressive: fit, focused, intimidatingly intense. It's only when they're stepping towards that penalty spot or standing at that free-throw line that we get to peer beneath the veneer – to glance at the self-doubt that threatens to engulf them. And engulf them it does, time and again. Just look at Jana Novotna in the 1993 Wimbledon singles final, when she had a game point to go up 5-1 in the final set against Steffi Graf. Until that moment, we didn't know that Novotna would fold; maybe she didn't know, either. But a few games later, she was sobbing on the shoulder of the Duchess of Kent as Graf took the title.
We don't have to look beyond our shores to find ample examples of those who've bottled it. In football, there's the familiar litany of losses to Germany; to name but one, the 1990 World Cup, where England's Stuart Pearce hit his spot-kick into the goalkeeper's midriff and Chris Waddle sliced his high over the crossbar. More recently, in golf, the sight of Rory McIlroy's surrender at Augusta in the 2011 Masters was especially spectacular. Leading by four shots heading into the final round, and holding a one-shot advantage as he moved into the back nine, he then dropped six shots in three holes, finishing 10 shots behind the leader Charl Schwartzel and recording an eight-over-par score of 80.
However, McIlroy's reaction to his meltdown said much about his character, and about the nature of bottle. "Well that wasn't the plan!" he tweeted. "But you have to lose before you can win. This day will make me stronger in the end." Once he had experienced terror, and rapidly understood that the only factor holding him back was his own trepidation, he had laid the foundation for his eventual success. It's no coincidence that in his next major tournament, the 2011 US Open, he triumphed by eight shots.
McIlroy's astonishing response to his collapse shows that we can be unnecessarily harsh when we dismiss an athlete as a "bottler", as someone who'll never hold it together when it counts. For his entire cricket career, the England batsman Graeme Hick was accused of being a "flat-track bully", someone who was proficient against domestic teams but who lacked courage at international level. Hick's batting average in all matches, including a highest score of 405 not out, was 52.23, as against a Test average of 31.32. The history books therefore record a verdict of frailty at the highest level. A more striking case still is Mark Ramprakash, regarded as one of the finest technicians ever to have played the game, but whose performances for England fell far short of those for his counties of Surrey and Middlesex. To date, Ramprakash has over 100 first-class centuries, one of only 25 men to achieve that feat: his batting average in first-class matches stands at 54.59, while his Test career ended with an average of 27.32.
The statistics suggest that, in the cases of Hick and Ramprakash, their bottle was irreparably broken. Both can rightly point to the promise that they showed at Test level, having excelled on foreign soil: Hick can refer to his innings of 178 against India's spinners in Bombay, and Ramprakash can hold up his 154 against West Indian quicks in Barbados. But ultimately, the words of Mike Atherton, written in 2008 in The Times about Ramprakash, ring true for both of them. "Sport is neither just nor unjust," he opined; "it simply reflects time and again an absolute truth. Ramprakash was tried and tested many times in international cricket and more often than not he was found wanting."
Sian Beilock, an associate professor of psychology at the University of Chicago, writes in Choke: The Secret to Performing under Pressure: "The more people practise under pressure, the less likely they will be to react negatively when the stress is on. This certainly seems to be true for professional golfers like Tiger Woods. To help Woods learn to block out distractions during critical times on the course, his father, Earl Woods, would drop golf bags, roll balls across Tiger's line of sight, and jingle change in his pocket. Getting Woods used to performing under stress helped him learn to focus and excel on the green."
This excellent practice served Woods well on his way to 14 major championships. But, as Beilock notes, there is no amount of rehearsal that can prepare you for pressure of unforeseen magnitude, such as Woods experienced after multiple revelations about his troubled private life.
If we know that bottle is so hard to have, then why are we so hard on those who don't have it? It's not as if we teach bottle in UK schools. You won't find classes in self-confidence in our curriculum or, as pop star Cher Lloyd has more recently dubbed it, "Swagger Jagger". No, we're too busy teaching humility to our athletes. As a nation, we are superb silver medallists. We smile politely on the podium and shake the winner's hand, when we should be snarling and tearing it off. And while bottle is not the same as arrogance, the two are closely related, both relying on a dogged belief in one's own ability, often in the face of reason.
Most British athletes who are regarded as bottlers are nothing of the sort. Instead, they are people who have risen far above their sporting station, who have gone beyond all reasonable expectation of their talent. Take Tim Henman, who went to six Grand Slam semi-finals, and who was at times a firm test for the all-time greatness of Pete Sampras. Take Andy Murray, who has finished as the runner-up in three Grand Slam finals, and who has the misfortune to be playing in the same era as the all-time greatness of Roger Federer and Rafael Nadal. Neither of these men is a failure. They're very, very, very good at tennis, and their only crime is to have fallen short of the milestone of sporting immortality.
If you're a world-class athlete, it's best not to care too much what the public thinks. If you're too dominant, the public can't relate to you and finds you boring. If you come second too often, it despairs of you. Your victories must be conspicuously hard-won. There must be graft alongside the grace, bottle alongside the brilliance. We want you to sweat every bit as much as you Swagger Jagger.
If you can master all of that, then we'll truly take you to our hearts. And it can't look too pretty. Tiger Woods's most memorable major victory was not winning the 1997 Masters aged only 21, but the 2008 US Open, with only one good leg. Dame Kelly Holmes is loved not so much because she was a double Olympic gold medallist at 800m and 1500m, but because we saw her strive for years, and, in those final races, for every last inch of her success.
When athletes crumple to defeat in such public spheres, they may lose titles, but they win our affection. That's why, when Rory McIlroy stepped off the 18th green at Congressional, he was not just the 2011 US Open Champion. He was something vastly more: he was our champion.
Musa Okwonga is author of 'A Cultured Left Foot' and 'Will You Manage?'

Friday 17 June 2011

What is talent in sport?

Is it just natural ability or the consistency that comes from perseverance?

Harsha Bhogle

June 17, 2011




My father believed - as was the norm in respectable middle-class families in years gone by - that it was important for his children to be good at mathematics. If your child was good at mathematics, you had imparted the right education and fulfilled one of your primary duties as a parent.

He often quoted to us what his friend, a respected professor of the subject, used to say: "There should be no problem that you encounter in an examination for the first time." It meant you had to work so hard that you had, conceivably, attempted and vanquished every situation that could find its way into an exam paper. It raises the question: if you did achieve 150 out of 150 in an exam (which my wife very nearly did once, much to my awe), was it because you were extraordinarily intuitive or because you had worked harder than the others, so that you didn't "encounter any problem in an exam" for the first time?

In other words, is getting a "centum" (a peculiarly Tam Bram expression) a matter of genius or a matter of perseverance? It is an issue that many intelligent authors around the world have been debating for a while, and one that is at the heart of sport. Would anybody who solved a certain number of sums get full marks? Would two people, each of whom put in 10,000 hours (Malcolm Gladwell's threshold for achievement) produce identical results? Or are some people innately gifted, allowing them to cross that threshold sooner?

We pose that question a great deal in cricket when we argue about talent. Players who play certain shots - the perfectly balanced on-drive for example - are labelled "talented" and put into a separate category. They acquire a halo, and in a near-equal situation they tend to get picked first. "Talent" becomes this key they flash to gain entry. And yet it is worth asking what talent really is.

Is it the ability to play the on-drive or, more critically, the ability to play that on-drive consistently? It is a critical difference. Consistency brings in an element of perseverance that you do not normally bracket with talent.

Let me explain. I have often, while watching Rohit Sharma bat, said "wow" out loud. I probably said it because I saw him play a shot I did not expect him to. Or maybe it was a shot very few players were able to play. Just as often, I find myself going "ugh" with frustration at him. It is probably because, having had the opportunity to go "wow", I now expected him to play the same shot again. And so, without explicitly stating it, I am invoking the assumption of consistency to assess talent. The old professor of mathematics would have said, "Play the shot so often that it is no longer a new shot when you play it."

It was while debating this in my mind that I became aware of why Sachin Tendulkar paid such high compliments to Gary Kirsten for throwing him balls. Tendulkar wanted to perfect a shot and needed someone to throw him enough balls to attain that perfection, so that when he attempted it in a match he wasn't doing it for the first time. And in a recent conversation he said he was at his best when he was in the "subconscious", not distracted by the "conscious", and able to play by instinct - which he had perfected through practice.

Now we often call Tendulkar a genius, and yet, as we see, the talent that we believe comes dazzling through is, in essence, the product of many hours of perseverance. Is Tendulkar, then, the supreme example of my father's friend's theory of doing well at maths? And assuming for a moment that is true, shouldn't we be honouring perseverance because that is what it seems "talent" really is?

And so it follows that when we complain that all talented players don't get to where they should, we are in effect saying that they didn't practise hard enough to be consistent. Maybe it means we should all use the word "talent" more sparingly; not bestow it on a player until ability has been married to hard work long enough to achieve consistency.

This is also the starting premise of a new book I hope to continue reading - Bounce by the former table tennis champion Matthew Syed. I am delighted by its opening pages, one of which said "talent is overrated". It is something I have long believed.

Friday 10 June 2011

The song No Charge reminds us that Britain used to be less greedy

Those who believe the myth that 1970s Britain was 'the sick man of Europe' forget how progressive the decade was

Neil Clark
guardian.co.uk, Friday 10 June 2011 10.30 BST



It's regarded by some as one of the slushiest No 1 records of all time. It's exactly 35 years ago this week that No Charge, sung by the Canadian artist JJ Barrie, got to No 1 in the British pop charts – and thanks to the wonders of BBC4, who are repeating Top of the Pops shows from 1976 on a weekly basis, we'll all be able to see it performed on our television screens next Monday.

Some won't be looking forward to it too much – in his Guardian article of a week ago, Alexis Petridis claimed that 1976 was the worst year for pop music ever.

But leaving aside debates about musical merit, what watching the repeats of Top of the Pops and other programmes from the same era on channels such as Yesterday, ITV3 and ITV4 shows us is what a less commercialised age the pre-Thatcherite 1970s were.

No Charge might be considered over-sentimental by some, but it is also a powerful critique of the mentality of putting a dollar sign on things we should be doing for free.

It's extremely unlikely that such a song would be released in the uber-capitalist Britain of today, let alone get to No 1. But in the progressive, left-leaning mid-1970s, it was always likely to be a hit.

Thanks to the glories of the "market economy", many things which were free, or at least very cheap, 35 years ago cost a small fortune today. In 1976 you didn't have to book months in advance to find a reasonable train fare from London to Liverpool; you just turned up on the day. Utility bills were not something to be feared in the days when publicly owned bodies, not profit-hungry private companies, provided your electricity, gas and water.

Students going on to higher education did not have to worry about building up huge debts in order to pursue their studies. Neither did old people have to worry about selling their homes in order to finance going into care. And in those pre-Sky days, all the best sports – including live coverage of England's summer Test match series – could be watched on television for the very modest cost of the licence fee.

In short, in the social democratic Britain of the 1970s, No Charge was not just the name of a No 1 hit record, it summed up the ethos of the era – an era in which the interests of people came before corporate profits.

This aspect of the 1970s is often lost in accounts of the period. The dominant neoliberal narrative casts 1970s Britain as the "sick man of Europe" – a country rescued from the horrors of collectivism by the great saviour Margaret Thatcher. But even the liberal left have bought into large parts of this rightwing myth, and have failed to stick up for the 1970s as much as they should. The fact that Britain went to the IMF in the autumn of 1976 is taken as proof that the postwar settlement had failed – even Denis Healey, chancellor at the time, has admitted: "We didn't really need the money at all."

Watching television programmes of the 1970s reminds us of the anti-capitalist values which were once mainstream. The year that No Charge got to No 1 saw the television debut of James Mitchell's drama series, When the Boat Comes In, which tells the story of trade union activist and strike organiser Jack Ford. The Onedin Line, currently being re-shown on the Yesterday channel, highlighted the greed of unscrupulous ship-owners and the terrible conditions that sailors had to endure in the 19th century. Upstairs Downstairs, another 70s classic being repeated on ITV3, showed how those "downstairs" saw their position improve in the 20th century. In Poldark, the title character takes the side of the poor against the greedy landowner and banker George Warleggan.

Since the days when those programmes were screened, we've seen the money-grabbing values of the City and Wall Street permeate all aspects of our lives. Who would have thought that water – which falls out of the sky for free – would become a tradable commodity, or that care homes would be owned by City investors?

While in the summer of 1976 we were listening to No Charge and enjoying the lowest levels of inequality in our history, in the grossly unequal Britain of June 2011, we're tuning into The Apprentice. The proto-Thatcherite little boy in No Charge – who wants to bill his mum $5 for "mowin' the lawn" and $1 for "takin' out the trash" – rightly gets corrected: today he'd probably be lauded as a brilliant up-and-coming entrepreneur.

Neoliberals want us to believe that "market forces" are the only show in town. But watching 1970s television programmes gives us a window into a world where things were different. It's not possible to turn the clock back to 1976, but we can make the title of JJ Barrie's No 1 hit record the slogan for a better and less commercialised Britain.

Is Monogamy Obsolete? New Books Challenge Our Ideas of Fidelity

by Jessica Bennett
June 9, 2011 | 12:59am

Anthony Weiner may insist his marriage isn't over, but we've seen this situation play out before. Wives leave husbands, the public condemns the cheating—and, inevitably, six months later, we learn about another scandal. Jessica Bennett on why we need to rethink our notions of fidelity.

As the urban legend goes, the woman is so desperate for a proposal that she cuts out magazine ads of diamond rings and wears them on her finger. In another tale, a girl marks up her calendar with “DID NOT PROPOSE” for each day her boyfriend puts off the looming question. If you judge by the number of Bridezilla shows on television—as well as the thousands of women who’ve made Lori Gottlieb’s Marry Him! a bestseller—it’s easy to assume that Americans are just dying to say "I Do."

The reality, of course, is that "I Do" is often followed by "I cheated." And it requires little more than the flip of the remote to find out all the gory details. Call girls. Prostitution. Sexting. A love child. Inevitably, we see wives leave husbands, and public condemnation—and watch it happen all over again six months later. The stories have become so common we could argue for doing away with marriage altogether—and many have. "Is it obsolete?" wondered The Atlantic. "It's unnecessary," proclaimed Newsweek. Now new Census data reveal that, for the first time, married couples are no longer the majority. As one sociologist told me recently, speaking at a conference on polyamory: "The system simply isn't working."

But Pamela Haag, the author of Marriage Confidential, isn't so quick to call the whole thing off. Marriage is changing, she contends. But rather than giving up on it, why not simply redefine it in a way that works for each of us? Haag cites research showing that 65 percent of women—and a whopping 80 percent of men—say they’d cheat if they knew they wouldn’t get caught. She spends time with couples whose relationships she deems “Oreo marriages”—traditional on the outside, but secretly transgressive on the inside. She describes “parenting marriages,” centered around the kids; the “life partner," who is perhaps more like a best friend than a romantic partner. And, most interestingly, she talks to couples who are working infidelity into their unions, instead of struggling to keep it out. Marriage, she says, isn't dying—it's just changing. "It’s just getting revised for this century," she says.

Many of these couples are what Haag calls the “new monogamists.” She interviews women who hack into their husbands’ emails, those who stray emotionally with online partners they may never meet, as well as those who are OK with it all, employing codes like “the 50-mile rule” (affairs allowed beyond 50 miles of the home) or marriage “sabbaticals” for those who really want a break. Like Weiner, many learn of their partners' indiscretions online. Others employ “don’t ask don’t tell” rules. Still others find out, and simply don't care. “The big romantic standard has always been one strike and you’re out,” says Haag. “But I really think that’s opening up."


It all sounds terribly transgressive—or unromantic. Except that these families aren't freaks or outcasts, they're starting to become the norm. (See: Is Polyamory America's Next Sexual Revolution?) Haag notes that as many as 4 million married Americans consider themselves swingers—and the number of swing clubs in this country has doubled over the past 10 years. Over the past three years, books like Open by journalist Jenny Block, Opening Up by sex columnist Tristan Taormino, and support from celebrities like Tilda Swinton and Warren Buffett have put open marriage on the map. (When asked, in 2009, how he made his open marriage work, Buffett replied coolly, "you have to be secure.")

“Humans aren’t monogamous, we need to get over that,” says Ken Haslam, a retired anesthesiologist who curates a library at the Kinsey Institute. “We fool around. We do! And if you don’t fool around, you want to fool around.”

There are now online forums for practicing polyamorists, a magazine called Loving More that has 15,000 subscribers and, perhaps somewhat surprisingly, the results of a 14,000-person Oprah.com survey—in which 21 percent of people said they have an open marriage. All of that got Haag thinking: Should we stop calling infidelity a problem, and think of it as the future? "Marital nonmonogamy may be to the 21st century what premarital sex was to the 20th," she writes—"a behavior that shifts gradually from proscribed and limited, to tolerated and increasingly common."

She wouldn’t be the first to suggest it: Researchers have long wondered whether monogamy is outdated. (Helen Fisher, who studies the nature of love, believes humans aren’t meant to be together forever—but in short-term, monogamous relationships of three or four years.) Even as far back as the 1950s, Kinsey was noting that 26 percent of married women admitted to having an affair by age 40, and an additional 20 percent had engaged in petting without intercourse, despite the assumption being that it’s men who most often cheat. More surprisingly, 71 percent of the women in this group reported no difficulties with their marriage—even though half said their husbands either knew or suspected there was something going on. "Humans aren't monogamous, we need to get over that," says Ken Haslam, a retired anesthesiologist who curates a library at the Kinsey Institute. "We fool around. We do! And if you don't fool around, you want to fool around."

And yet monogamy is still the deeply ingrained—or delusional—rule to living happily ever after, and our views toward infidelity are comically naïve. "We cheat—and we also roundly disapprove of cheating," Haag writes—to the extent that we find the action more reprehensible than human cloning (really). It's the ultimate hypocrisy—lodged into every corner of our social existence, leading to the downfall of politicians, executives, religious clerics, athletes… the list goes on. It depends on what survey you examine, but more than half of Americans cheat, and yet 70 to 85 percent of adults think cheating is wrong. "We are fooling ourselves if we think people are as against cheating as they say they are,” says Jenny Block. “Jude Law cheated on Sienna Miller, for God's sake. JFK cheated on Jackie. Have we learned nothing from these scandals?”

Surely everyone in a relationship wrestles at some point with an eternal question: Can one person really satisfy every need? What we’ve learned, it turns out, is that the answer may be no. But if you believe Haag, that doesn’t mean the end of marriage—it simply means a revision of our norms. “Giving ourselves the license and permission to evolve marriage is perhaps the unique challenge of our time,” she writes. In other words: Weiner may indeed be an ass. But, as Haag puts it, perhaps we can have our cake and eat it, too. Let's just be honest about our marital motives.

Thursday 26 May 2011

Who's in control? Not just governments, that's for sure



Andreas Whittam Smith in The Independent


How did the media on the one hand and the financial markets on the other build themselves up as such great forces in society?

Thursday, 26 May 2011

We have seen two big powers in action this week. They are not countries. But they can take on governments and win. They are the media and the financial markets. While British newspapers were forcing the UK Government to rethink the use of injunctions issued by courts to protect privacy, the financial markets were maintaining almost unbearable pressure on the currencies of the weaker members of the eurozone.
It was surely resentment at the power of the media that led a gang of masked men to vandalise reporters' cars outside the home of Ryan Giggs on Tuesday. An injunction taken out by Mr Giggs to prevent press coverage of an alleged extramarital affair had been dramatically revealed in the House of Commons on Monday after details had been made freely available on Twitter.
As to the power of the financial markets, my colleague Hamish McRae noted yesterday that the Greek government "may have the electorate's mandate but it does not set policy. That is being determined... in Brussels, Berlin, Frankfurt and Washington. Power has gone". In turn the politicians and civil servants in such cities look to the financial markets to discover what actions are required. Take another case, Spain. As the Financial Times commented: "Watching Spain's agony as it tries to escape the clutches of the eurozone's expanding sovereign debt crisis is like being a spectator at a particularly cruel gladiatorial fight. Whenever the weaker contestant skilfully sidesteps an assault by his opponent, he is promptly confronted with a still more ferocious attack."
How did the media and the financial markets build themselves up as great powers? The most significant date in plotting the growing influence of national newspapers in Britain was 17 November 1969 when Rupert Murdoch launched the Sun as a tabloid. Thirteen years later Associated Newspapers created the Mail On Sunday. In little over a decade, therefore, the market for scandalous news had been substantially expanded. Until then the Daily Mirror and the News Of The World had dominated this area.
This was an era when everything began to change, as much in the financial markets as in the behaviour of the media. Governments strictly controlled exchange rates, for instance, until the early 1970s. When US President Richard Nixon closed the so-called "gold window" on 15 August 1971, ending free exchange between US dollars and gold, he brought to a close a 25-year period during which the world's leading currencies, including sterling, had been fixed in terms of the dollar. Speculation against them had been almost impossible.
From then onwards they could "float", and when a particular currency declined in value against its neighbours', the government concerned began to feel the pressure. In 1979, one of the first decisions of Mrs Thatcher's newly formed government was to abolish UK exchange control. It was a welcome act of self-confident liberation, but it also, in accordance with the law of unintended consequences, handed a weapon to currency speculators, who would use it ruthlessly in 1992 to drive Britain out of the European Exchange Rate Mechanism on a day known forever afterwards as "Black Wednesday".
Repeated oil shocks since the 1970s have also contributed to the power of financial markets. Essentially a higher oil price takes spending power out of the pockets of consumers and places it into the treasuries of countries, mainly in the Middle East, who have no means of spending their new wealth – other than by investing it back into the financial markets of the West. By this mechanism the financial markets have become larger and larger in relation to national economies. Since the first oil shock in 1973 when the price of oil shot up to $10 a barrel – it's now $100 – there have been at least a dozen oil spikes, each time magnifying the size of the financial markets as the unspent surplus was invested in securities.
During the same period, the power of the media has also continued to increase. In the UK, the politicians partly brought this on themselves. From the early 1990s they began a process of non-stop electioneering. So great are the penalties for losing power – the splitting of the party into warring groups, the lengthy period in exile – that party leaders feel they must do what it takes to regain or retain office.
In relation to the press, Tony Blair described what was needed: "Our news today is instant, hostile to subtlety or qualification... To avoid misinterpretation, strip down a policy or opinion to one key clear line before the media does it for you. Think in headlines." Then when Labour came to power in 1997, the Government Information Service was taught the same rule. Alastair Campbell told Whitehall press officers a few months after the election: "Decide your headlines. Sell your story and if you disagree with what is being written, argue your case." But the more the political parties sought control, the more aggressively the press struck back.
Add to this the dramatic expansion of unregulated digital media. The first email was sent in 1971 (the two computers were sitting next to each other!). The first web browsers became available in the early 1990s. The first social networking site saw the light of day in 1994. MySpace was created in 2003, Facebook in 2004 and Twitter in 2006.
I don't describe the rise of the media and the financial markets to positions of great power to argue that something should be done about them, though they are both, in their different ways, crude and rough. I particularly dislike the untrammelled greed of bankers, though doubtless they equally hate the untrammelled inquisitiveness of journalists. Where the power of media and finance is most objectionable, however, is in their ability to deter governments from protecting us from their worst excesses.
In the United States the banks, for instance, use their formidable lobbying skills and resources in Congress to deter lawmakers from curbing their abuses and this phenomenon in turn has the effect of holding back regulation in other markets around the world. In Britain, so far as the media is concerned, there is a strong case for a law on privacy, but I doubt whether any Cabinet would have the courage to propose such a measure. Of course even democratically elected governments can be frightening bodies, but so are their most formidable opponents, finance and media.

Thursday 14 April 2011

Iceland broke the rules and got away with it

Now Ireland and Portugal wish they too had got tough with the markets


Aditya Chakrabortty
The Guardian, Tuesday 12 April 2011


Remember Iceland? In the autumn of 2008, it became the first national casualty of the financial meltdown; the first rich country in more than three decades to take an IMF bailout. Commentators declared it the Icarus economy, which had finally come crashing back down to earth. It became both parable and laughing stock. What's the difference between Iceland and Ireland, joked traders – one letter and a few months.

You don't hear much about the insolvent island any more – apart from occasions such as this weekend, when Icelandic voters were asked to repay the £3.5bn owing on collapsed bank Icesave, and replied with a firm "Nei".

Unnoticed it may be, but Reykjavik now serves as a very different kind of parable, of how to minimise the misery of financial collapse by ignoring economic orthodoxy. And in those other broke European economies – from Dublin to Athens to Lisbon – politicians and voters are starting to pay attention. After its three biggest banks – 85% of the country's financial system – failed in the same week, Iceland did two remarkable things. First, it let the banks go under: foreign financiers who had lent to Reykjavik institutions at their own risk didn't get a single krona back. Second, officials imposed capital controls, making it harder for hot-money merchants to pull their cash out of the country.

These policies were not just controversial; they represented a two-fingered salute to the polite society of academics and policy-makers who normally lay down the laws on economic disaster management.

Compare Iceland's policies with those followed by another tiny country in the North Atlantic, which also has a banking industry much bigger than its national economy. When the credit crunch came to Dublin, the government decided to underwrite the entire banking industry – including tens of billions of euros of loans made by foreign investors. That landed Ireland with a debt worth something like €80,000 for every household – a debt that effectively bankrupted the country.

"A reverse Robin Hood – taking money from the poor and giving to the rich," is how Anne Sibert, a member of the Central Bank of Iceland's monetary policy committee, describes the Irish policy. But Dublin was merely following the old free-market tradition that rules governments should never break faith with financiers.

Yet looking at the two countries now, it's hard to say that Ireland has prospered by being orthodox, or that Iceland has suffered an especially terrible punishment for not sticking to the Way of the Markets.

Indeed, the evidence seems to point the opposite way: Iceland has come through in better condition than anyone in 2008 dared hope. The worst of its recession is over, even though it's still too early to talk about sustained growth, and the unemployment rate (7.5%) is just over half that of Ireland (13.6%). Remarkably, after the krona lost more than half its face value, inflation is also coming down quite sharply. And without having to pay back foreign creditors, the government's finances are also in better shape. In Ireland, on the other hand, the government has just injected more money into its banking sector – the fifth time it has had to do so.

Now, this is a picture that needs more qualifications than a brain surgeon. For a start, you wouldn't wish Iceland's fate on any economy. Huge spending cuts are still to kick in, and a lot more pain is in store. Thor Gylfason, an economist at the University of Iceland, reckons it will take another seven to 10 years before his country recovers from one of the worst economic disasters in recent history. This will be a long, slow haul.

But landed with an almost unbearable burden, Iceland has made the load easier on itself – and it has done so by getting tough with foreign speculators who lent money to the country at their own risk. In Dublin, on the other hand, as Irish MP Stephen Donnelly puts it, "the entire Irish people were made collateral for the banking system" – and its economic performance has not been remarkably better. More than that, there is a basic point about fairness: in Ireland, keeping the markets on side was deemed to be more important than keeping people in jobs – in Iceland, the priorities have been reversed.

Donnelly says that the Icelandic example is beginning to attract interest in the Dáil and in the media. An Icelandic politician was recently interviewed by Vincent Browne, the Irish equivalent of Jeremy Paxman. In the bust countries of southern Europe they're also starting to take notice. Last week, on the day that Portugal finally admitted it would need a bailout from Brussels, I was talking to Joana Gorjão Henriques, a journalist from Lisbon. She told me that her contacts were posting stories about Iceland on Facebook, and that newspaper columnists were using Iceland's case as an example that Portugal, Greece and Ireland should follow – form an alliance and say to the EU that they won't pay the debt.

There are echoes here of the Asian financial crisis of the late 90s. Then Malaysia's prime minister Mahathir Mohamad brought in capital controls to shore up a battered financial system – and he was pilloried from Washington to Wall Street. Nobel laureates in economics predicted imminent catastrophe for Malaysia; the International Monetary Fund effectively told Mohamad off. But the year after, Malaysia began a strong economic recovery, and now the IMF issues papers on the usefulness of capital controls.

Iceland was a country wrecked by implementing free-market dogma crudely and quickly; it may yet become another kind of lesson, showing how an economy can ignore free-market dogma – and come out far better than its critics predicted.

Spin and the art of attack

Being an aggressive spinner is not about bowling flat and fast. Quite the opposite, and you only have to watch veteran bowlers operate in Twenty20 to see that

Aakash Chopra

April 14, 2011

Grounds are getting smaller, wickets flatter, bats thicker. And just to make it tougher for bowlers, the format of the game has got shorter. As if the rule book didn't already favour batsmen enough, these "innovations" have stacked the odds against bowlers higher still. But while it has ostensibly become tougher for bowlers to succeed, good ones will always have their say.

Who are these "good bowlers", though? In the pre-Twenty20 era these were men who could simply bowl quick, for a batsman needed special skills to get on top of someone bowling at 145kph. It was widely believed that the shorter the format, the smaller the role of a spinner. In fact, the only way for a spinner to survive in Twenty20 was to bowl quick and flat, or so it was believed for the longest time.

But a look at the spinners in action in the current IPL is enough to tell you an entirely different story. Spinners who're bowling slower in the air are ruling the roost.

Is it only about bowling slow, or is there more to it? Let's take a look at what's making these bowlers ever so successful.

A big heart
While fast bowlers are the stingy kind, who hate runs being scored off them, spinners are more often than not advised to be generous and to always be prepared to get hit. Bishan Bedi would tell his wards that a straight six is always hit off a good ball and one should never feel bad about it. Having a big heart doesn't mean that you should stop caring about getting hit; rather that you shouldn't chicken out and start bowling flatter.

Suraj Randiv could easily have bowled flatter when he was hit for two consecutive sixes by Manoj Tiwary in the first match, but he showed the heart to flight another delivery, and got the better of his opponent. You rarely see Daniel Vettori or Shane Warne take a step back whenever they come under the hammer. Instead of thinking of ways to restrict damage, they try to plot a dismissal.

Slow down the pace
Most young spinners don't realise that the quicker one bowls, the easier it gets for the batsman, who doesn't have to move his feet to get to the pitch of the ball and smother the spin. Against the quicker ball he can do fairly well while staying inside the crease, and small errors of judgement aren't fatal either, as long as he is playing straight.

The slower the delivery, the tougher it is to generate power to clear the fence, but there are no such issues if the bowler is sending down darts. In fact, even rotating the strike is a lot easier if the bowler is bowling quicker, for you need great hands to manoeuvre the slower delivery.

Yes, it is perhaps easier to find the blockhole if you're bowling quick, but you're not really going to get under the bat and bowl the batsman, since you don't have that sort of pace.

Also, if you bowl quick, the chances of getting turn off the surface (unless it is really abrasive) are minimal. You must flight the ball and bowl slow to allow the ball to grip the pitch and get purchase.

Attack the batsman
Mushtaq Ahmed tells young spinners that they need to have the attitude of fast bowlers to attack batsmen.

Attacking the batsman is often misinterpreted as bowling quick. That's what the fast men do; you hit them for a six and you're almost guaranteed a bouncer next ball. For a spinner, attacking has a different meaning - going slower and enticing the batsman.

Bowling slow must not be confused with giving the ball more air. The trajectory can still remain flat while bowling slow. Some batsmen are quick to put on their dancing shoes the moment the ball rises above their eye level in the stance, so it's important to keep the ball below eye level and yet not bowl quick. Vettori does it with consummate ease and reaps rewards. He rarely bowls quick; he relies on changing his pace and length to deceive the batsman. And if the batsman is rooted to the crease and is reluctant to use his feet, you can flight the ball.

Add variety
The best way to not just survive but thrive as a spinner is to keep evolving.

Anil Kumble not only slowed down his pace but also added a googly to his armoury in the latter half of his career. Muttiah Muralitharan added another dimension to his bowling when he started bowling the doosra. Suraj Randiv stands tall at the crease and extracts a bit of extra bounce. Ravi Ashwin has mastered the carrom ball.

Instead of learning to undercut the ball (which is obviously a lot easier to pick up), it's worth developing a doosra, a carrom ball or some other variation.

Young kids must understand that when you bowl flatter-faster deliveries, the ball behaves somewhat like a hard ball does on a concrete surface, skidding off the pitch. Slower balls, on the other hand, act like tennis balls, with far more bounce.

Up-and-coming spinners need to set their priorities right. They can either bend the front knee to reduce their height and take the arm away from the ear to bowl darts, or learn from the likes of Warne, Vettori and Murali how to succeed in all formats, provided the basics are right.

Saturday 26 March 2011

Teach history warts and all

By Christopher Caldwell

Published: March 25 2011 23:10

“Time to head off!” wisecracks the hooded executioner on the cover of Even More Terrible Tudors, one of the popular titles in the Horrible Histories series. “I’ve got a mammoth brain!” grunts a caveman on the cover of The Savage Stone Age, holding up the dripping organ in question, while his family, sitting in the background, cooks the rest of the mammoth. History-minded schoolboys buy these books – written or co-written by the Englishman Terry Deary and aimed at presenting “history with the nasty bits left in” – by the dozens.

The idea that the history of one’s own country should be as exhilarating to young readers as, say, cars exploding or ladies in bathing suits is a peculiarly British one. When Michael Gove, education secretary, told Conservatives at their party conference last October that the narrative of children’s history courses could stand to be a bit snappier, he started an argument that has riled British historians ever since. If people are uninspired by the country’s past, Mr Gove says, “we will not properly value the liberties of the present”. Mr Gove is nationalistic to say so, but he is right. If defending one’s rights requires knowing where they came from, then learning one’s own history is indispensable.

The argument is over how best to breathe life into a mass of facts and dates. For Mr Gove, the missing element is a strong narrative, built of real protagonists facing big challenges. The government enlisted as its history adviser Simon Schama of Columbia University (and the Financial Times), who has found a way to make European and British history enthralling, both in books and on screen. Mr Gove and Mr Schama have their detractors, however. The University of California historian of Britain, James Vernon, believes teaching works best “not by turning schoolchildren into Britons but by enabling them to analyse the present and to think critically”. Richard Evans, the Cambridge historian of Germany, is not hostile to the narrative lines dear to Mr Gove and Mr Schama, but warns us against getting swept up in them. In a recent London Review of Books essay, he urges scepticism towards sources and warns students “not to accept passively every fact and argument they are presented with”.

This is the point on which Mr Schama and Mr Evans are most likely to agree. Mr Schama, too, has described history as a force for challenging orthodoxies, as the “greatest, least sentimental, least politically correct tutor of tolerance”. And yet, this may be the point on which classroom teachers have their deepest doubts. The intellectual independence that Messrs Schama and Evans extol characterises only a minority of published historians – why should we expect it from A-level students? Should we even want it? There are, after all, problems with teaching scepticism. The questioning of authority is indispensable and often heroic, but one needs a certain “feel” for a subject matter before one can carry it out. Until that point, scepticism is little more than a truculent contrarianism and a waste of other students’ time. It is most tellingly applied to the things one knows best. Where ignorance and scepticism meet, a course on British history becomes a course on running Britain down.

One wonders whether this is not Mr Gove and Mr Schama's real gripe. Mr Evans accuses them of "confusing history with memory". But maybe memory is what young people need to be taught before they can be taught actual history. An example of this memory/history distinction comes from Black History Month, as it is taught in US grade schools. Children spend every February either learning or rehashing the achievements of African-Americans – always in a morale-boosting way. As history, such courses have little to recommend them. To treat the deeds of the 19th-century abolitionist Sojourner Truth in greater detail than those of George Washington, which is the inevitable end-result of a dedicated month, is to perpetrate a distortion.

But as memory, Black History Month has been a striking success. Children, and not just black children, quite like it. The reasons are paradoxical. Probably no pedagogical innovation was ever carried out for reasons more political, but Black History Month is the least politically correct corner of the grade-school history curriculum. You always know who the good guys are in Black History Month and their struggles are taught with an old-fashioned, un-nuanced moralism that makes Our Island Story look like Hamlet. The results are plain to see. In 2008, education professors from Stanford and the University of Maryland released a survey of 2,000 11th- and 12th-graders (high-school leavers) who had been asked to name the 10 most significant Americans, excepting presidents. Three mainstays of Black History Month – Martin Luther King, the anti-segregationist protester Rosa Parks and the escaped slave Harriet Tubman – ranked one, two and three, well ahead of Benjamin Franklin, Thomas Edison and Henry Ford.

By about the age of eight or 10, children should have a simple, logical and non-cynical narrative of their country to carry around for the rest of their lives as a net to catch knowledge in. Non-cynical, because children cannot build such a net if teachers are running down the credibility of what they impart. That is the problem with teaching young people: there is a line on one side of which a teacher’s duty is to promote credulity and on the other side of which it is to promote scepticism. Errors are inevitable. But they will be self-correcting, to some extent. By age 16, students will have as much cynicism and “distance” as any educator could wish.

The writer is a senior editor at The Weekly Standard

Tuesday 8 March 2011

Spinoza, part 1: Philosophy as a way of life

For this 17th-century outsider, philosophy is like a spiritual practice, whose goal is happiness and liberation

Clare Carlisle
guardian.co.uk, Monday 7 February 2011 09.30 GMT

Spinoza memorial at the New Church in The Hague. Photograph: Dan Chung for the Guardian

Although Baruch Spinoza is one of the great thinkers of the European philosophical tradition, he was not a professional scholar – he earned his modest living as a lens grinder. So, unlike many thinkers of his time, he was unconstrained by allegiance to a church, university or royal court. He was free to be faithful to the pursuit of truth. This gives his philosophy a remarkable originality and intellectual purity – and it also led to controversy and charges of heresy. In the 19th century, and perhaps even more recently, "Spinozist" was still a term of abuse among intellectuals.

In a sense, Spinoza was always an outsider – and this independence is precisely what enabled him to see through the confusions, prejudices and superstitions that prevailed in the 17th century, and to gain a fresh and radical perspective on various philosophical and religious issues. He was born, in 1632, to Jewish Portuguese parents who had fled to Amsterdam to escape persecution, so from the very beginning he was never quite a native, never completely at home. Although Spinoza was an excellent student in the Jewish schools he attended, he came to be regarded by the leaders of his community as a dangerous influence. At the age of 24 he was excluded from the Amsterdam synagogue for his "intolerable" views and practices.

Spinoza's most famous and provocative idea is that God is not the creator of the world, but that the world is part of God. This is often identified as pantheism, the doctrine that God and the world are the same thing – which conflicts with both Jewish and Christian teachings. Pantheism can be traced back to ancient Greek thought: it was probably advocated by some pre-Socratic philosophers, as well as by the Stoics. But although Spinoza – who admired many aspects of Stoicism – is regarded as the chief source of modern pantheism, he does, in fact, want to maintain the distinction between God and the world.

His originality lies in the nature of this distinction. God and the world are not two different entities, he argues, but two different aspects of a single reality. Over the next few weeks we will examine this view in more detail and consider its implications for human life. Since Spinoza presents a radical alternative to the Cartesian philosophy that has shaped our intellectual and cultural heritage, exploring his ideas may lead us to question some of our deepest assumptions.

One of the most important and distinctive features of Spinoza's philosophy is that it is practical through and through. His ideas are never merely intellectual constructions, but lead directly to a certain way of life. This is evidenced by the fact that his greatest work, which combines metaphysics, theology, epistemology, and human psychology, is called Ethics. In this book, Spinoza argues that the way to "blessedness" or "salvation" for each person involves an expansion of the mind towards an intuitive understanding of God, of the whole of nature and its laws. In other words, philosophy for Spinoza is like a spiritual practice, whose goal is happiness and liberation.

The ethical orientation of Spinoza's thought is also reflected in his own nature and conduct. Unlike most of the great philosophers, Spinoza has a reputation for living an exemplary, almost saintly life, characterised by modesty, gentleness, integrity, intellectual courage, disregard for wealth and a lack of worldly ambition. According to Bertrand Russell, Spinoza was "the noblest and most lovable of the great philosophers". Although his ideas were despised by many of his contemporaries, he attracted a number of devoted followers who gathered regularly at his home in Amsterdam to discuss his philosophy. These friends made sure that Spinoza's Ethics was published soon after his death in 1677.

Spinoza, part 2: Miracles and God's will

Spinoza's belief that miracles were an unexplained act of nature, not proof of God, proved dangerous and controversial

Clare Carlisle
guardian.co.uk, Monday 14 February 2011 09.00 GMT

At the heart of Baruch Spinoza's philosophy is a challenge to the traditional Judeo-Christian view of the relationship between God and the world. While the Hebrew Bible and the Christian scriptures share a conception of God as the creator of the natural world and the director of human history, Spinoza argues that everything that exists is an aspect of God that expresses something of the divine nature. This idea that God is not separate from the world is expounded systematically in the Ethics, Spinoza's magnum opus. However, a more accessible introduction to Spinoza's view of the relationship between God and nature can be found in his discussion of miracles in an earlier text, the Theologico-Political Treatise. This book presents an innovative interpretation of the Bible that undermines its authority as a source of truth, and questions the traditional understanding of prophecy, miracles and the divine law.

In chapter six of the Theologico-Political Treatise, Spinoza addresses the "confused ideas of the vulgar" on the subject of miracles. Ordinary people tend to regard apparently miraculous events – phenomena that seem to interrupt and conflict with the usual order of nature – as evidence of God's presence and activity. In fact, it is not just "the vulgar" who hold this view: throughout history, theologians have appealed to miracles to justify religious belief, and some continue to do so today.

For Spinoza, however, talk of miracles is evidence not of divine power, but of human ignorance. An event that appears to contravene the laws of nature is, he argues, simply a natural event whose cause is not yet understood. Underlying this view is the idea that God is not a transcendent being who can suspend nature's laws and intervene in its normal operations. On the contrary, "divine providence is identical with the course of nature". Spinoza argues that nature has a fixed and eternal order that cannot be contravened. What is usually, with a misguided anthropomorphism, called the will of God is in fact nothing other than this unchanging natural order.

From this it follows that God's presence and character is revealed not through apparently miraculous, supernatural events, but through nature itself. As Spinoza puts it: "God's nature and existence, and consequently His providence, cannot be known from miracles, but can all be much better perceived from the fixed and immutable order of nature."

Of course, this view has serious consequences for the interpretation of scripture, since both the Old and New Testaments include many descriptions of miraculous events. Spinoza does not simply dismiss these biblical narratives, but he argues that educated modern readers must distinguish between the opinions and customs of those who witnessed and recorded miracles, and what actually happened. Challenging the literal interpretation of scripture that prevailed in his times, Spinoza insists that "many things are narrated in Scripture as real, and were believed to be real, which were in fact only symbolic and imaginary".

This may seem reasonable enough to many contemporary religious believers, but Spinoza's attitude to the Bible was far ahead of its time. Today we take for granted a certain degree of cultural relativism, and most of us are ready to accept that ancient peoples understood the world differently from us, and therefore had different ideas about natural and divine causation. When it was first published in 1670, however, the Theologico-Political Treatise provoked widespread protest and condemnation. In fact, it was this reaction that made Spinoza decide to delay publication of the Ethics until after his death, to avoid more trouble.

But what are we to make of Spinoza's claim that God's will and natural law are one and the same thing? There are different ways to interpret this idea, some more conducive to religious belief than others. On the one hand, if God and nature are identical then perhaps the concept of God becomes dispensable. Why not simply abandon the idea of God altogether, and focus on improving our understanding of nature through scientific enquiry? On the other hand, Spinoza seems to be suggesting that God's role in our everyday lives is more constant, immediate and direct than for those who rely on miraculous, out-of-the-ordinary events as signs of divine activity.

And of course, the idea that the order of nature reveals the existence and essence of God leads straight to the view that nature is divine, and should be valued and even revered as such. In this way, Spinoza was an important influence on the 19th-century Romantic poets. Indeed, Spinoza's philosophy seems to bring together the Romantic and scientific worldviews, since it gives us reason both to love the natural world, and to improve our understanding of its laws.

Spinoza, part 3: What God is not

In his Ethics, Spinoza wanted to liberate readers from the dangers of ascribing human traits to God

Clare Carlisle
guardian.co.uk, Monday 21 February 2011 08.30 GMT

Spinoza's Ethics is divided into five books, and the first of these presents an idiosyncratic philosophical argument about the existence and nature of God. We'll examine this in detail next week, but first we need to look more closely at how the Ethics challenges traditional Judeo-Christian belief in God.

The view that Spinoza wants to reject can be summed up in one word: anthropomorphism. This means attributing human characteristics to something non-human – typically, to plants or animals, or to God. There are several important implications of Spinoza's denial of anthropomorphism. First, he argues that it is wrong to think of God as possessing an intellect and a will. In fact, Spinoza's God is an entirely impersonal power, and this means that he cannot respond to human beings' requests, needs and demands. Such a God neither rewards nor punishes – and this insight rids religious belief of fear and moralism.

Second, God does not act according to reasons or purposes. In refusing this teleological conception of God, Spinoza challenged a fundamental tenet of western thought. The idea that a given phenomenon can be explained and understood with reference to a goal or purpose is a cornerstone of Aristotle's philosophy, and medieval theologians found this fitted very neatly with the biblical narrative of God's creation of the world. Aristotle's teleological account of nature was, then, adapted to the Christian doctrine of a God who made the world according to a certain plan, analogous to a human craftsman who makes artefacts to fulfil certain purposes. Typically, human values and aspirations played a prominent role in these interpretations of divine activity.

Spinoza concludes book one of the Ethics by dismissing this world view as mere "prejudice" and "superstition". Human beings, he suggests, "consider all natural things as means to their own advantage", and because of this they believe in "a ruler of nature, endowed with human freedom, who had taken care of all things for them, and made all things for their use". Moreover, people ascribe to this divine ruler their own characters and mental states, conceiving God as angry or loving, merciful or vengeful. "So it has happened that each person has thought up from his own temperament different ways of worshiping God, so that God might love him above all others, and direct the whole of nature according to the needs of his blind desire and insatiable greed," writes Spinoza.

It is interesting to compare this critique of religious "superstition" with the views of the 18th-century Scottish philosopher David Hume. In his Dialogues Concerning Natural Religion, Hume challenges the popular belief in a creator God – and he also, elsewhere, undermines appeals to miracles as evidence of divine activity. Although Hume seems to echo Spinoza on these points, there is a crucial difference between the two philosophers. Hume thinks that many aspects of Christian belief are silly and incoherent, but his alternative to such "superstition" is a healthy scepticism, which recognises that religious doctrines cannot be justified by reason or by experience. His own position is rather ambiguous, but it involves a modest and pragmatic attitude to truth and seems to lead to agnosticism.

Spinoza, on the other hand, thinks that there is a true conception of God which is accessible to human intelligence. He argues that misguided religious beliefs are dangerous precisely because they obscure this truth, and thus prevent human beings from attaining genuine happiness, or "blessedness". There is, therefore, more at stake in Spinoza's critique of popular superstition than in Hume's. For Hume, religious believers are probably wrong, but the existential consequences of their foolishness might not be particularly serious. Spinoza, by contrast, wants to liberate his readers from their ignorance in order to bring them closer to salvation.

So Spinoza is not simply an atheist and a critic of religion, nor a sceptical agnostic. On the contrary, he places a certain conception of God at the heart of his philosophy, and he describes the ideal human life as one devoted to love of this God. Moreover, while Spinoza is critical of superstition, he is sympathetic to some aspects of Jewish and Christian teaching. In particular, he argues that Jesus had a singularly direct and immediate understanding of God, and that it is therefore right to see him as the embodiment of truth, and a role model for all human beings.

Spinoza, part 4: All there is, is God

Being infinite and eternal, God has no boundaries, argues Spinoza, and everything in the world must exist within this God

Clare Carlisle
guardian.co.uk, Monday 28 February 2011 10.00 GMT

So far in this series I've focused on Spinoza's critique of the religious and philosophical world view of his time. But what does he propose in place of anthropomorphic, anthropocentric belief in a transcendent creator God?

Spinoza begins his Ethics by defining some basic philosophical terms: substance, attribute, and mode. In offering these definitions, he is actually attempting a radical revision of the philosophical vocabulary used by Descartes, the leading thinker of his time, to conceptualise reality. When we understand these terms properly, argues Spinoza, we have to conclude that there exists only one substance – and that this is God.

Substance is a logical category that signifies independent existence: as Spinoza puts it, "by substance I understand what is conceived through itself". By contrast, attributes and modes are properties of a substance, and are therefore logically dependent on this substance. For example, we might regard a particular body as a substance, and this body is not conceptually dependent on anything else. But the body's properties, such as its weight and its colour and its shape, are qualities that cannot be conceived to exist in isolation: they must be the weight, colour and shape of a certain body.

Descartes's world view draws on Aristotelian metaphysics and scholastic theology in conceiving individual entities as distinct substances. Human beings, for example, are finite substances, while God is a special substance which is infinite and eternal. In fact, Descartes thought that each human being was composed of two substances: a mind, which has the principal attribute of thought; and a body, which has the principal attribute of extension, or physicality. This view famously leads to the difficult question of how these different substances could interact, known as the "mind-body problem".

The philosophical terminology of substance, attribute and mode makes all this sound rather technical and abstract. But Cartesian metaphysics represents a way of thinking about the world, and also about ourselves, shared by most ordinary people. We see our world as populated by discrete objects, individual things – this person over here, that person over there; this computer on the table; that tree outside, and the squirrel climbing its trunk; and so on. These individual beings have their own characteristics, or properties: size, shape, colour, etc. They might be hot or cold, quiet or noisy, still or in motion, and such qualities can be more or less changeable. This way of conceptualising reality is reflected in the structure of language: nouns say what things are, adjectives describe how they are, and verbs indicate their actions, movements and changing states. The familiar distinction between nouns, adjectives and verbs provides an approximate guide to the philosophical concepts of substance, mode and attribute.

If, as Spinoza argues, there is only one substance – God – which is infinite, then there can be nothing outside or separate from this God. Precisely because God is a limitless, boundless totality, he must be an outsideless whole, and therefore everything else that exists must be within God. Of course, these finite beings can be distinguished from God, and also from one another – just as we can distinguish between a tree and its green colour, and between the colour green and the colour blue. But we are not dealing here with the distinction between separate substances that can be conceived to exist independently from one another.

Again, this is rather abstract. As Aristotle suggested, we cannot think without images, and I find it helpful to use the image of the sea to grasp Spinoza's metaphysics. The ocean stands for God, the sole substance, and individual beings are like waves – which are modes of the sea. Each wave has its own shape that it holds for a certain time, but the wave is not separate from the sea and cannot be conceived to exist independently of it. Of course, this is only a metaphor; unlike an infinite God, an ocean has boundaries, and moreover the image of the sea represents God only in the attribute of extension. But maybe we can also imagine the mind of God – that is to say, the infinite totality of thinking – as like the sea, and the thoughts of finite beings as like waves that arise and then pass away.

Spinoza's world view brings to the fore two features of life: dependence and connectedness. Each wave is dependent on the sea, and because it is part of the sea it is connected to every other wave. The movements of one wave will influence all the rest. Likewise, each being is dependent on God, and as a part of God it is connected to every other being. As we move about and act in the world, we affect others, and we are in turn affected by everything we come into contact with.

This basic insight gives Spinoza's philosophy its religious and ethical character. In traditional religion, dependence and connectedness are often expressed using the metaphor of the family: there is a holy father, and in some cases a holy mother; and members of the community describe themselves as brothers and sisters. This vocabulary is shared by traditions as culturally diverse as Christianity, Buddhism and Islam. For Spinoza, the familial metaphor communicates a truth that can also be conveyed philosophically – through reason rather than through an image.

Spinoza, part 5: On human nature

We are not autonomous individuals but part of a greater whole, says Spinoza, and there is no such thing as human free will

Clare Carlisle
guardian.co.uk, Monday 7 March 2011 09.00 GMT

Last week, we examined Spinoza's metaphysics, looking at how his radical reinterpretation of the philosophical terminology of substance, attribute and mode produces a new vision of reality. According to Spinoza, only God can be called a substance – that is to say, an independently existing being – and everything else is a mode of this single substance. But what does this mean for us?

One of the central questions of philosophy is: what is a human being? And this question can be posed in a more personal way: who am I? As we might by now expect, Spinoza's view of the human being challenges commonsense opinions as well as prevailing philosophical and religious ideas. We are probably inclined to think of ourselves as distinct individuals, separate from other beings. Of course, we know that we have relationships to people and objects in the world, but nevertheless we see ourselves as autonomous – a view that is reflected in the widely held belief that we have free will. This popular understanding of the human condition is reflected in Cartesian philosophy, which conceives human beings as substances. In fact, Descartes thought that human beings are composed of two distinct substances: a mind and a body.

For Spinoza, however, human beings are not substances, but finite modes. (Last week, I suggested that a mode is something like a wave on the sea, being a dependent, transient part of a far greater whole.) This mode has two aspects, or attributes: extension, or physical embodiment; and thought, or thinking. Crucially, Spinoza denies that there can be any causal or logical relationships across these attributes. Instead, he argues that each attribute constitutes a causal and logical order that fully expresses reality in a certain way. So a human body is a physical organism which expresses the essence of that particular being under the attribute of extension. And a human mind is an intellectual whole that expresses this same essence under the attribute of thinking.

But this is not to suggest that the mind and the body are separate entities – for this would be to fall back into the Cartesian view that they are substances. On the contrary, says Spinoza, mind and body are two aspects of a single reality, like two sides of a coin. "The mind and the body are one and the same individual, which is conceived now under the attribute of thought, now under the attribute of extension," he writes in book two of the Ethics. And for this reason, there is an exact correspondence between them: "The order and connection of ideas is the same as the order and connection of things." In fact, each human mind involves awareness of a human body.

This way of thinking has some important consequences. One of the most obvious is that it undermines dualistic and reductionist accounts of the human being. Descartes's mind-body dualism involves the claim that we are, in essence, thinking beings – that the intellectual should be privileged above the physical, reason above the body. Conversely, modern science often regards the human being as primarily a physical entity, and attempts to reduce mental activity to physical processes. In Spinoza's view, however, it is incoherent to attempt to explain the mental in terms of the physical, or vice versa, because thinking and extension are distinct explanatory orders. They offer two alternative ways of describing and understanding our world, and ourselves, which are equally complete and equally legitimate.

Another important consequence of Spinoza's account of the human being is his denial of free will. If we are modes rather than substances, then we cannot be self-determining. The human body is part of a network of physical causality, and the human mind is part of a network of logical relations. In other words, both our bodily movements and our thinking are constrained by certain laws. Just as we cannot defeat the law of gravity, so we cannot think that 2 + 2 = 5, or that a triangle has four sides.

Spinoza's criticism of the popular belief in free will is rather similar to his analysis of belief in miracles in the Theologico-Political Treatise, which we looked at a few weeks ago. There, we may recall, he argued that people regard events as miraculous and supernatural when they are ignorant of their natural causes. Likewise, human actions are attributed to free will when their causes are unknown: "That human freedom which all men boast of possessing … consists solely in this, that men are conscious of their desire and unaware of the causes by which they are determined." For Spinoza, belief in free will is just as much a sign of ignorance and superstition as belief in miracles worked by divine intervention.