
Tuesday, 7 June 2016

Voters believe that even if they did exercise their right to leave the EU, the politicians wouldn’t obey them.



Aditya Chakrabortty in The Guardian


 
Illustration by Matt Kenyon


Neil was speckled with paint from his trousers to his spectacle lenses, and had come straight from work to the vape shop. When I asked which side he’d be backing in the EU referendum, he projected as if addressing a rally. He wanted everyone to know he was damned if he was going to vote. “It’s an illusion that we’ve got a say in it. We don’t live in a democracy. The day of having a common working man standing for us here or in Europe – it’s over.”

We were in Pontypool, south Wales. As a valleys boy (“I smell of sheep”), Neil had been brought up Labour. But now, “It’s all lies, isn’t it?” Then came the sins: Blair “a big liar”; the political class in the pockets of the bankers.

It was the refrain I kept getting last week across south Wales – and have heard in many other regions too. That dissolution of old loyalties, that breakdown in trust, runs wide and deep – and it already marks the referendum on 23 June. Opinion polls show that voters believe that even if they did exercise their right to leave the EU, the politicians wouldn’t obey them. This is what a democratic crisis looks like.

Although journalists often remind us that this is the first vote the British have had on Europe in 40 years, they rarely dwell on what happened last time round. Yet the torchlight of history shows just how much has changed. While today’s polls show leave and remain neck and neck, the 1975 referendum on whether Britain should stay in the European Economic Community was as good as won before it was even announced. The then prime minister, Harold Wilson, led a coalition of the establishment – all three parties, the unions, the business lobbies, the press – and emerged with a 2:1 majority to stay in.

“Europhoria” is how the Guardian reported the results. Its leader began: “Full-hearted, wholehearted and cheerful hearted: there is no doubt about the ‘yes’.” Imagine anything even close to that being said in two weeks’ time – after months of sullen and sour campaigning, of close colleagues branding one another “liars”, “luxury-lifestyle” politicians and “Pinocchio”.

“Wilson would never have asked a question of which he couldn’t be confident of the answer,” says historian Adrian Williamson. Contrast that with David Cameron, who once claimed he wanted to be prime minister because he’d be “rather good at it”, but now resembles a short-tempered supply teacher struggling to control his own class.

Panicked by a fear of Nigel Farage and the ultras in his own party, the Tory leader has staged a referendum for which there was little public appetite and which he may now, incredibly, lose.

Months were spent trailing a deal that the prime minister was going to strike with Germany’s Angela Merkel and the rest – a rewriting of the rules that was going to form the basis of this referendum. You’ve barely heard about that deal since.

Posed a question few of them were actually asking, voters have wound up raising their own. Why haven’t my wages gone up? How will the kids get on the housing ladder? When will my mum get her knee replacement? All good questions, none of which are actually on the ballot paper. The likely result is that on 23 June, many of those who do vote will try to squeeze a multitude of other answers into one crude binary.

In 1975 Roy Jenkins, another son of Welsh coal and steel, explained the result as: “The people took the advice of people they were used to following.” Classic Jenkins, but also an expression of the classic role of mass political parties. When they had millions of members, both Labour and the Conservatives served as the brokers between the people they represented and the “experts”, the authoritative midpoint between ideology and empirics.

Neither party can claim to be mass any more, least of all the Tories – low on members, bankrolled by hedge funds and the City. This creates what Chris Bickerton, politics lecturer at Cambridge, calls “the crisis of political mediation”.

No longer claiming the same democratic legitimacy as their predecessors, Cameron and George Osborne have had to borrow their authority from other sources: Mark Carney and the Bank of England, the International Monetary Fund, the Treasury. These technocrats, much cited by broadcasters and jittery remainers, are one of the two main sources of authority in our democracy. The other is the post-truth brigade, as channelled by Boris Johnson and Michael Gove, who advise voters to ignore the nuance, trust their gut – and blame migrants or the Brussels fatcats.

British democracy in 2016 comes down to this: a prime minister can no longer come out and say something and expect to be believed. He or she must wheel out a common room-full of experts. He or she can expect to be called a liar in the press and by their colleagues. He or she can only hope that some of what they say resonates with an electorate that has tuned them out.

And mainstream politicians have only themselves to blame. Over the past three decades, Britons have been made a series of false promises. They have been told they must go to war with a country that can bomb them in 45 minutes – only to learn later that that was false. They have been assured the economy was booming, only to find out it was fuelled by house prices and tax credits.

New Labour pledged an end to Margaret Thatcher’s unfairness, except that – as the Centre for Research into Socio-Cultural Change has shown – the richest 20% of households scooped as much of the income growth under Brown and Blair as they had under the Iron Lady.

Britons were told austerity would last five years, tops – although we will now endure at least a decade.

And the people of south Wales were told new industries would replace the coal and steelworks. Looking out of the shop window, Neil remembered how Pontypool on market days like today would be “rammed”. Now it was half-empty. “It’s dead now, because they took what they wanted,” he said. “Thatcher smashed the unions. There used to be coalmines all around here. Boosh – we’re out of here. They’ve moved on.”

Cameron and the rest of the political class are learning a lesson the hard way. You can only break your promises to the public so many times before they refuse to put any more trust in you. After that, you have to rely on Threadneedle Street and the Treasury to corrode their own finite reputation for impartiality.

Whichever way the ballots go on 23 June, the public will continue returning a vote of no-confidence in Westminster for a long time to come.

Monday, 6 June 2016

When left alone in a room, people preferred to give themselves electric shocks rather than quietly sit and think. Why do smart people do stupid things?

Andre Spicer in The Guardian

Thinking is hard work and asking tough questions can make you unpopular. So it’s no wonder that even clever people don’t always use their brains

Scene from The Big Bang Theory: ‘Having a high IQ score does not mean that someone is intelligent.’ Photograph: CBS/Everett/Rex


We all know smart people who do stupid things. At work we see people with brilliant minds make the most simple mistakes. At home we might live with someone who is intellectually gifted but also has no idea. We all have friends who have impressive IQs but lack basic common sense.

For more than a decade, Mats Alvesson and I have been studying smart organisations employing smarter people. We were constantly surprised by the ways that these intelligent people ended up doing the most unintelligent things. We found mature adults enthusiastically participating in leadership development workshops that wouldn’t be out of place in a pre-school class; executives who paid more attention to overhead slides than to careful analysis; senior officers in the armed forces who preferred to run rebranding exercises than military exercises; headteachers who were more interested in creating strategies than educating students; engineers who focused more on telling good news stories than solving problems; and healthcare workers who spent more time ticking boxes than caring for patients. No wonder so many of these intelligent people described their jobs as being dumb.

While doing this research I realised that my own life was also blighted with stupidities. At work I would spend years writing a scientific paper that only a dozen people would read. I would set exams to test students on knowledge I knew they would forget as soon as they walked out of the examination room. I spent large chunks of my days sitting in meetings which everyone present knew were entirely pointless. My personal life was worse. I’m the kind of person who frequently ends up paying the “idiot taxes” levied on us by companies and governments for not thinking ahead.

Clearly I had a personal interest in trying to work out why I, and millions of others like me, could be so stupid so much of the time. After looking back at my own experiences and reading the rapidly growing body of work on why humans fail to think, my co-author and I started to come to some conclusions.

Having a high IQ score does not mean that someone is intelligent. IQ tests only capture analytical intelligence; this is the ability to notice patterns and solve analytical problems. Most standard IQ tests miss out two other aspects of human intelligence: creative and practical intelligence. Creative intelligence is our ability to deal with novel situations. Practical intelligence is our ability to get things done. For the first 20 years of life, people are rewarded for their analytical intelligence. Then we wonder why the “best and brightest” are uncreative and practically useless.

Most intelligent people make mental short cuts all the time. One of the most powerful is self-serving bias: we tend to think we are better than others. Most people think they are above-average drivers. If you ask a class of students whether they are above the class average in intelligence, the vast majority of hands shoot up. Even when you ask people who are objectively among the worst in a certain skill, they tend to say they are above average. Not everyone can be above average – but we can all have the illusion that we are. We desperately cling to this illusion even when there is devastating evidence to the contrary. We collect all the information we can find to prove ourselves right and ignore any information that proves us wrong. We feel good, but we overlook crucial facts. As a result, the smartest people discount the intelligence of others in order to make themselves feel smarter.

Being smart can come at a cost. Asking tricky questions, doing the research and carefully thinking things through takes time. It’s also unpleasant. Most of us would rather do anything than think. A recent study found that when left alone in a room, people preferred to give themselves electric shocks rather than quietly sit and think. Being smart can also upset people. Asking tough questions can quickly make you unpopular.

Intelligent people quickly learn these lessons. Instead of using their intelligence, they just stay quiet and follow the crowd – even if it is off the side of a cliff. In the short term this pays off. Things get done, everyone’s lives are easier and people are happy. But in the long term it can create poor decisions and lay the foundations for disaster.

Next time I find myself banging my own head and asking myself “Why are you so stupid?”, I will try to remind myself that I’m trapped in the same situation as many millions of others: my own idiocy probably came with a payoff.

Sunday, 5 June 2016

Why we need a left exit from Fortress Europe

Tariq Ali

Many names for a PhD

Chidanand Rajghatta in the Times of India


''Pathetic homeless dork.'' ''Patiently hoping for degree.'' ''Professor had doubts.'' These are some expansions of the much-vaunted acronym PhD, formally a Doctor of Philosophy no matter what one's subject of research and expertise. Regarded as the acme of scholarship, it stands at a rarefied academic height that takes immense effort and time to reach. ''Probably heavily in Debt,'' ''Please hire -- Desperate,'' and ''Patently headed Downhill'' are some of the other self-deprecating expansions doctoral candidates throw out to explain their striving.

Which of these gloomy expansions applied to IIT-Stanford alumnus Mainak Sarkar when he lost it one fine morning last week is hard to say. Perhaps all. He loaded up two guns he had purchased, broke into the Minnesota home of his estranged wife, Ashley, and shot her dead. He then drove 2,500km to snuff out the life of his PhD adviser, professor William Klug, at the University of California, Los Angeles (UCLA).

Mainak Sarkar met every gratuitous grad student putdown -- to a high degree.

The life of the Indian PhD scholar in the US centers around ''adviser and Budweiser,'' goes an old joke in desi circles. These scholars are generally regarded as a quiet, reticent, insular, and industrious lot, who tread the straight path between lab, library, home, and an occasional beer. American universities covet them for their undemanding and non-confrontational nature, and for the fealty and value they bring to the program. Often socially awkward and taciturn, many work doubly hard and wrap up their degrees in quick time.

From all accounts, Sarkar conformed to the mould. Hailing from a modest Bengali family in Durgapur, where his father was a clerk in a cement factory, he was said to be a bright student in school. Accounts from his college years suggest he was introverted. After a bachelor's degree from IIT Kharagpur in 2000, he worked briefly at Infosys in Bangalore before heading out to Stanford for his master's, a route taken by many Indians, notably Google's Sundar Pichai.

But while IITian titans such as Vinod Khosla and Pichai chose to do an MBA after their master's, setting the template for what many of today's US-bound Indian engineers do, Sarkar opted for the road less travelled these days because of the toil and hardship involved: a PhD program at UCLA's Henry Samueli School of Engineering.

PhD programs can be brutal. In fact, such is the struggle involved in earning a doctorate that PhD Comics, a dedicated satirical strip by former grad student Jorge Cham that follows the lives of several doctoral students, is a must-read for the PhD crowd. PhD in this instance stands for Piled Higher and Deeper, a degree that follows BS (Bull Shit) and MS (More of the Same). From the difficulties of research to the complex student-adviser equation, Cham explores the exhausting grind of the indigent PhD scholar, from slumming it in deadbeat digs to the perpetual search for free food.

Central to the strip is the tortured time-span of a PhD program, which appears interminable. One brilliant strip shows a fresh PhD candidate in his first year announcing to the world, ''Here I come!'', with visions of winning the Nobel Prize; in his second year, he merely hopes to revolutionize the field. By the third year, he is reduced to hoping he'll get a job at the university, and by the fourth, any job, anywhere. By the fifth year, he's just hoping to attend some conference in Podunk, Minnesota, and wishing they'll lay out pepperoni pizza.

Mainak Sarkar's struggle to earn his doctorate extended to at least eight years, in part because he was locked in a grad student's ultimate nightmare: an adversarial relationship with his guide/mentor/supervisor. ''You can't even begin to describe the sense of gloom and doom,'' explained one grad student who has been through the mill. ''And it gets worse as people who joined the program after you graduate before you, and you are still there, hanging on in quiet desperation.''

The situation has gotten worse in recent years, with US universities awarding doctoral degrees at an accelerating pace (nearly 60,000 annually) even as the economic downturn dampens the career prospects of those who graduate. According to one recent study, fewer than 17% of new PhDs in science, engineering and health-related fields find tenure-track positions within three years of graduation. Stress levels are high, and fear and frustration are endemic, at the prospect of a lifetime of study not paying off.

Already 38, Sarkar struggled to make a living after a tortured academic career culminating in a PhD that was grudgingly granted to him in 2013. (Earlier versions of this story said he was a grad student into his tenth year, but Klug's colleagues have said they graduated him in 2013, despite his subpar thesis, mainly to get rid of him.) He then took up a job in Ohio, working remotely as an engineering analyst. For reasons unknown, it did not last long. Nor did his marriage.

It is not clear what role his personal turmoil played in the deterioration on the academic front or whether it was the other way around. But early this year Sarkar started ranting online about his adviser Klug, accusing him of stealing his code and passing it to other students.

Such intellectual property spats are not uncommon in the doctoral research world, though the UCLA engineering school, named after its professor Henry Samueli, boasts a standout example of an ideal mentor-mentee relationship. Samueli and his PhD student Henry Nicholas founded Broadcom, a chip company that topped $150 billion in market cap at the height of the dotcom bubble and sold for $37 billion a couple of years ago. In financial terms, it is arguably the most successful teacher-student collaboration in history.

The Sarkar-Klug relationship didn't follow that script. It ended in death -- a murder-suicide that took them both to a different PhD: a Premature and horrible Death.