
Sunday 1 October 2017

The pendulum swings against privatisation

Evidence suggests that ending state ownership works in some markets but not others


Tim Harford in The Financial Times


Political fashions can change quickly, as a glance at almost any western democracy will tell you. The pendulum of the politically possible swings back and forth. Nowhere is this more obvious than in the debates over privatisation and nationalisation. 


In the late 1940s, experts advocated nationalisation on a scale hard to imagine today. Arthur Lewis thought the government should run the phone system, insurance and the car industry. James Meade wanted to socialise iron, steel and chemicals; both men later won Nobel memorial prizes in economics. 

They were in tune with the times: the British government ended up owning not only utilities and heavy industry but airlines, travel agents and even the removal company, Pickfords. The pendulum swung back in the 1980s and early 1990s, as Margaret Thatcher and John Major began an ever more ambitious series of privatisations, concluding with water, electricity and the railways. The world watched, and often followed suit. 

Was it all worth it? The question arises because the pendulum is swinging back again: Jeremy Corbyn, the bookies’ favourite to be the next UK prime minister, wants to renationalise the railways, electricity, water and gas. (He has not yet mentioned Pickfords.) Furthermore, he cites these ambitions as a reason to withdraw from the European single market. 

That is odd, since there is nothing in single market rules to prevent state ownership of railways and utilities — the excuse seems to be yet another Eurosceptic myth, the leftwing reflection of rightwing tabloids moaning about banana regulation. Since the entire British political class has lost its mind over Brexit, it would be unfair to single out Mr Corbyn on those grounds. 

Still, he has reopened a debate that long seemed settled, and piqued my interest. Did privatisation work? Proponents sometimes mention the galvanising effect of the profit motive, or the entrepreneurial spirit of private enterprise. Opponents talk of fat cats and selling off the family silver. Realists might prefer to look at the evidence, and the ambitious UK programme has delivered plenty of that over the years. 

There is no reason for a government to own Pickfords, but the calculus of privatisation is more subtle when it comes to natural monopolies — markets that are broadly immune to competition. If I am not satisfied with what Pickfords has to offer me when I move home, I am not short of options. But the same is not true of the Royal Mail: if I want to write to my MP, then the big red pillar box at the end of the street is really the only game in town.

Competition does sometimes emerge in unlikely-seeming circumstances. British Telecom seemed to have an iron grip on telephone services in the UK — as did AT&T in the US. The grip melted away in the face of regulation and, more importantly, technological change.

Railways seem like a natural monopoly, yet there are two separate railway lines from my home town of Oxford into London, and two separate railway companies will sell me tickets for the journey. They compete with two bus companies; competition can sometimes seem irrepressible. 

But the truth is that competition has often failed to bloom, even when one might have expected it. If I run a bus service at 20 and 50 minutes past the hour, then a competitor can grab my business without competing on price by running a service at 19 and 49 minutes past the hour. Customers will not be well served by that. 

Meanwhile electricity and phone companies offer bewildering tariffs, and it is hard to see how water companies will ever truly compete with each other; the logic of geography suggests otherwise. 

All this matters because the broad lesson of the great privatisation experiment is that it has worked well when competition has been unleashed, but less well when a government-run business has been replaced by a government-regulated monopoly. 

A few years ago, the economist David Parker assembled a survey of post-privatisation performance studies. The most striking thing is the diversity of results. Sometimes productivity soared. Sometimes investors and managers skimmed off all the cream. Revealingly, performance often leapt in the year or two before privatisation, suggesting that state-owned enterprises could be well-run when the political will existed — but that political will was often absent. 

My overall reading of the evidence is that privatisation tended to improve profitability, productivity and pricing — but the gains were neither vast nor guaranteed. Electricity privatisation was a success; water privatisation was a disappointment. Privatised railways now serve vastly more passengers than British Rail did. That is a success story but it looks like a failure every time your nose is crushed up against someone’s armpit on the 18:09 from London Victoria. 

The evidence suggests this conclusion: the picture is mixed, the details matter, and you can get results if you get the execution right. Our politicians offer a different conclusion: the picture is stark, the details are irrelevant, and we metaphorically execute not our policies but our opponents. The pendulum swings — but shows no sign of pausing in the centre.

Saturday 15 October 2016

Bob Dylan - Literature, unplugged

Vaibhav Sharma in The Hindu

In the space of two years, the Nobel has shed decades of conservatism, and twice redefined what it considers, and what we must consider, ‘literature’


In 1997, Eric Zorn, a columnist in the Chicago Tribune, advocated for Bob Dylan to be awarded the Nobel Prize. “And though it's likely that snobbery will forever doom the chances of a folk-rock musician to join the roster of past winners that includes such literary giants as William Faulkner, Ernest Hemingway, John Steinbeck, Saul Bellow and Toni Morrison,” Zorn wrote, “the truth is that, for multi-faceted talent with language and sustained international impact, few if any living writers are Dylan's equal.”

A century earlier, in 1896, when the literature prize’s founding charter was read out from Alfred Nobel’s will, it recommended the award be conferred on the “person who shall have produced in the field of literature the most outstanding work in an ideal direction.”

Charter from another age


Alfred Nobel’s directive was formed in a time when the nineteenth-century forms of the novel and the short story, along with the classical mediums of poetry and drama, constituted the zenith of literary expression. Nobel’s charter could not have anticipated how these forms would remain significant and robust, yet steadily become inadequate for representing the whole of lived experience in the twentieth century, the most violent in human history.

As the Nobel approached its centenary, around the time of Zorn’s plea to award Dylan the literature prize, it was clear that novels, poems, short stories and plays were not the sole expressions of literary prestige and value, but part of a wider constellation which included nonfiction reportage, narrative history and biography, academic treatises such as Edward Said’s Orientalism and — as acknowledged by Dylan’s award — the great tradition of songwriting, coming of age in the radical tumult of the 1960s.

But, as recently as two years ago, there lingered the sense that the Nobel remained, to its detriment, too faithful to its founding charter and strangely reluctant to recognise the varied art forms that so powerfully enhanced our understanding of the modern age. For every inspired choice, such as J.M. Coetzee or Mo Yan, there was a J.M.G. Le Clézio and a Patrick Modiano, evidence of a wearying retreat into a provincial, post-war European vision, one curiously at odds with the epoch being lived by the vast majority of the world’s citizens: an epoch of technological innovation, ever-imaginative forms of state terror and modern, industrial forms of violence and devastation.

It seemed the Nobel committee was reluctant to recognise that Europe was no longer the centre of economic and intellectual ferment, that countries such as India and China would shape the destiny of our still-nascent century far more than the Old Continent. Yet in an era of Europe’s rapidly declining significance to the world at large, 15 of the past 20 Laureates (before Dylan) were European. In 2008, in a statement that might have been true five decades previously, Horace Engdahl, then the permanent secretary of the Swedish Academy, said, “Europe is still the centre of the literary world.”

However, the award to Dylan, and last year’s to the Belarusian journalist Svetlana Alexievich, allows us to tentatively suggest that the Nobel’s horizons, at last, may be becoming more expansive and modern.

The prize to Alexievich, a worthy successor to the great Polish journalist Ryszard Kapuscinski, gave a clue to the Nobel committee’s changing priorities. In a piece in the New Yorker, ‘Nonfiction Wins a Nobel,’ the writer Philip Gourevitch quoted from one of Alexievich’s essays in which she declared that “art has failed to understand many things about people.” Alexievich argued that, in our present age, “when man and the world have become so multifaceted and diversified,” journalistic documentation remained the best way of representing reality, while “art as such often proves impotent.”

Capturing our age

In his piece, Gourevitch narrated another fascinating exchange at the PEN World Voices Festival in New York where Alexievich stated: “I’d like to remember the great Chekhov, and his play ‘The Three Sisters.’ The main character in that play says over and over, ‘Now life is terrible, we live in squalor, but in a hundred years, a hundred years, how beautiful, how fine everything will be.’ And what has happened a hundred years later? We have Chernobyl; we have the World Trade Towers collapsing. It’s a new age in history. What we have experienced now not only goes beyond our knowledge but also exceeds our ability to imagine.”

Alexievich’s prize was, in a sense, the Nobel committee’s acknowledgement of a long-overdue corrective. Dylan’s award furthers that process, as if the Nobel committee was hastily making amends for the decidedly narrow prism with which it viewed the artistic and cultural ferment of the past half-century. In the space of two years, the Nobel seems to have shed decades of conservatism, and twice redefined what it considers — and what we must consider — ‘literature’.

It is also a powerful reinforcement of the oral tradition, the primary method of literary dissemination through the centuries, before the onslaught of print capitalism in the West began relegating it to the margins from the eighteenth century onwards. Salman Rushdie, delighted by Dylan’s prize, told the Guardian: “The frontiers of literature keep widening and it’s exciting that the Nobel prize recognises that.” What a blow for diversity of literary forms that, to access the latest Laureate’s work, we had to go to iTunes instead of Amazon.

Dylan’s award may be something we never see repeated, for he is a truly singular figure: a prophetic bard whose songs contained the force of immediacy, but were simultaneously universal and timeless. Some of the best music critics of our time, such as Alex Ross and David Hajdu, have written of Dylan’s dexterity and towering influence across genres, including blues, folk and rock-and-roll. The Nobel committee said it was giving the prize to Dylan as “a great poet in the great English tradition, stretching from Milton and Blake onwards.” Some have even interpreted it as a lofty rebuke to the sleazy, dismaying political climate in the age of Trump.

Of equal relevance to the world of letters at large may be Dylan’s stubborn refusal to become a pamphleteer and an easy vehicle for the partisan political passions of his age. A seer born of the counterculture of the 1950s and ’60s, Dylan yet remained sceptical of the evangelist temper of anti-establishment politics and the constricting nature of political categories, animated by an Orwellian distrust of Utopias and wary of the artistic perils of political allegiance.

‘A song and dance man’

In his farsighted suspicion of all “isms” that ravaged the twentieth century, and in his refusal to be a spokesman for anything at all, Dylan has made a compelling case for an inalienable devotion to the integrity and autonomy of the artist. Perhaps there has been no greater, and simpler, expression of artistic independence than Dylan’s declaration that “I am just a song and dance man.”

No composer or songwriter is likely to win the prize again for a long while, but Dylan’s prize is significant for it heralds the Literature Nobel’s belated transition into the modern age. Zorn, the columnist in the Chicago Tribune, triumphantly noted that nineteen years too late, Dylan had finally got what he deserved. There are more correctives for the prize to make, such as overcoming the still dominant spell of Eurocentrism. But the literature prize, conceived in the nineteenth century, finally seems to be embracing the twenty-first.

Wednesday 12 October 2016

Nobel prize winners’ research worked out a theory on worker productivity – then Amazon and Deliveroo proved it wrong

Ben Chu in The Independent


Financial incentives are important. We all know that’s true. If you were offered a job that paid £10 an hour and then someone else came up offering to pay you £11 an hour for identical work, which one would you choose?

Most of us would also accept that well-designed employment contracts can get more out of us. If we could take home more money for working harder (or more effectively), most of us would.

Bengt Holmstrom won the Nobel economics prize this week for his theoretical research on the optimum design for a worker’s contract to encourage the individual to work as productively as possible.

The work of Holmstrom and his fellow Nobel laureate, Oliver Hart, is subtle, recognising that the complexity of the world can cause simplistic piece-rate contracts or bonus systems to yield undesirable results.

For instance, if you pay teachers more based on exam results, you will find they “teach to the test” and neglect other important aspects of children’s education. If you reward CEOs primarily based on the firm’s share price performance you will find that they focus on boosting the short-term share price, rather than investing for the long-term health of the company.

Holmstrom and Hart also grappled with the problem of imperfect information. It is hard to measure an individual worker’s productivity, particularly when they are engaged in complex tasks.

So how can you design a contract based on individual performance? Holmstrom’s answer was that where measurement is impossible, or very difficult, pay contracts should be biased towards a fixed salary rather than variable payment for performance.
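Holmstrom’s prescription can be made concrete with the textbook linear-contract model associated with him and Paul Milgrom. What follows is the standard classroom version, sketched here for illustration rather than quoted from the laureates’ papers:

```latex
% Pay is linear in measured output: w = \alpha + \beta x, with x = e + \varepsilon,
% where e is effort and \varepsilon \sim N(0, \sigma^2) is measurement noise.
% For a worker with CARA risk aversion r and effort cost c(e) = k e^2 / 2,
% the familiar optimal incentive intensity is
\[
  \beta^{*} = \frac{1}{1 + r k \sigma^{2}} .
\]
% As measurement noise \sigma^2 grows, \beta^{*} falls towards 0 and pay
% collapses to the fixed salary \alpha; as \sigma^2 \to 0 (near-perfect
% monitoring), \beta^{*} rises towards 1 and pay tracks measured output.
```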

Yet when information on an employee’s performance is close to perfect, there can also be problems.

The information problem seems to be on the way to resolution in parts of the low-skill economy. Digital technology allows much closer monitoring of workers’ performance than in the past. Pickers at Amazon’s Swansea warehouse are issued with personal satnav computers which direct them around the giant warehouse on the most efficient routes, telling them which goods to collect and place in their trolleys. The devices also monitor the workers’ productivity in real time – and those who don’t meet the required output targets are “released” by the management.

The so-called “gig economy” is at the forefront of what some are labelling “management by algorithm”. The London-founded cycle-courier food delivery app Deliveroo recently tried to implement a new pay scale for its riders. The company’s London boss said this new system, based on fees per delivery, would increase pay for the most efficient riders. UberEats – Uber’s own meal delivery service – attempted something similar.

Yet the digital productivity revolution is encountering some resistance. The proposed changes by UberEats and Deliveroo provoked strikes from their workers. And there is a backlash against Amazon’s treatment of warehouse workers.

It is possible that some of this friction is as much about employment status as contract design and pay rates. One of the complaints of the UberEats and Deliveroo couriers is that they are not treated like employees at all.

It may also reflect the current state of the labour market. If people don’t want to work in inhuman warehouses or for demanding technology companies, why don’t they take a job somewhere else? But if there are not enough jobs in a particular region, people may have no choice. The employment rate is at an all-time high, but there’s still statistical evidence that many workers would like more hours if they could get them.

Yet the new technology does pose tough questions about worker treatment. And there is no reason why these techniques of digital monitoring of employees should be confined to the gig economy or low-skill warehouse jobs.

One US tech firm called Percolata installs sensors in shops that measure the volume of customers, which is then compared with each employee’s sales. This allows managers to make a statistical adjustment for the fact that different shops have different customer footfall rates – it fills in the old information blanks. The result is a closer reading of an individual shop worker’s productivity.

Workers who do better can be rewarded with more hours. “It creates this competitive spirit – if I want more hours, I need to step it up a bit,” Percolata’s boss told the Financial Times.
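The FT report does not spell out Percolata’s actual algorithm, but a minimal sketch of the kind of footfall adjustment described above might look like this (the function, names and numbers are hypothetical, purely for illustration):

```python
def adjusted_score(sales, footfall, avg_spend_per_visitor):
    """Hypothetical sketch, not Percolata's published method: compare a
    worker's shift sales with the sales that the shop's customer traffic
    alone would predict. A score above 1.0 means the worker outsold the
    traffic baseline; below 1.0 means footfall, not the worker, explains
    the sales."""
    expected_sales = footfall * avg_spend_per_visitor  # traffic-only baseline
    if expected_sales == 0:
        return None  # an empty shop reveals nothing about the worker
    return sales / expected_sales


# Two workers with identical raw sales, one in a busy shop and one in a
# quiet shop, come out very differently once footfall is factored in:
print(adjusted_score(sales=500.0, footfall=200, avg_spend_per_visitor=5.0))  # 0.5
print(adjusted_score(sales=500.0, footfall=40, avg_spend_per_visitor=5.0))   # 2.5
```

On numbers like these, the quiet-shop worker, invisible in a raw sales-per-employee league table, is the stronger performer.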

It’s possible to envisage these kinds of digital monitoring techniques and calculations being rolled out in a host of jobs, with bosses making pay decisions on the basis of detailed productivity data. But one doesn’t have to be a neo-Luddite to feel uncomfortable with these trends. It’s not simply the potential for tracking mistakes by the computers and flawed statistical adjustments that is problematic, but the issue of how this could transform the nature of the workplace.

Financial incentives matter, yet there is rather more to the relationship between a worker and employer than a pay cheque. Factors such as trust, respect and a sense of common endeavour matter too – and can be important motivators of effort.

If technology meant we could design employment contracts whereby every single worker was paid exactly according to his or her individual productivity, it would not follow that we necessarily should.

Monday 12 October 2015

Don’t let the Nobel prize fool you. Economics is not a science

The award glorifies economists as tellers of timeless truths, fostering hubris and leading to disaster

Joris Luyendijk in The Guardian


 


Business as usual. That will be the implicit message when the Sveriges Riksbank announces this year’s winner of the “Prize in Economic Sciences in Memory of Alfred Nobel”, to give it its full title. Seven years ago this autumn, practically the entire mainstream economics profession was caught off guard by the global financial crash and the “worst panic since the 1930s” that followed. And yet on Monday the glorification of economics as a scientific field on a par with physics, chemistry and medicine will continue.

The problem is not so much that there is a Nobel prize in economics, but that there are no equivalent prizes in psychology, sociology or anthropology. Economics, this seems to say, is not a social science but an exact one, like physics or chemistry – a distinction that not only encourages hubris among economists but also changes the way we think about the economy.

A Nobel prize in economics implies that the human world operates much like the physical world: that it can be described and understood in neutral terms, and that it lends itself to modelling, like chemical reactions or the movement of the stars. It creates the impression that economists are not in the business of constructing inherently imperfect theories, but of discovering timeless truths.





To illustrate just how dangerous that kind of belief can be, one need only consider the fate of Long-Term Capital Management, a hedge fund set up by, among others, the economists Myron Scholes and Robert Merton in 1994. With their work on derivatives, Scholes and Merton seemed to have hit on a formula that yielded a safe but lucrative trading strategy. In 1997 they were awarded the Nobel prize. A year later, Long-Term Capital Management lost $4.6bn (£3bn) in less than four months; a bailout was required to avert the threat to the global financial system. Markets, it seemed, didn’t always behave like scientific models.

In the decade that followed, the same over-confidence in the power and wisdom of financial models bred a disastrous culture of complacency, ending in the 2008 crash. Why should bankers ask themselves if a lucrative new complex financial product is safe when the models tell them it is? Why give regulators real power when models can do their work for them?

Many economists seem to have come to think of their field in scientific terms: a body of incrementally growing objective knowledge. Over the past decades mainstream economics in universities has become increasingly mathematical, focusing on complex statistical analyses and modelling to the detriment of the observation of reality.

Consider this throwaway line from the former top regulator and London School of Economics director Howard Davies in his 2010 book The Financial Crisis: Who Is to Blame?: “There is a lack of real-life research on trading floors themselves.” To which one might say: well, yes, so how about doing something about that? After all, Davies was at the time heading what is probably the most prestigious institution for economics research in Europe, located a stone’s throw away from the banks that blew up.


All those banks have “structured products approval committees”, where a team of banking staff sits down to decide whether their bank should adopt a particular new complex financial product. If economics were a social science like sociology or anthropology, practitioners would set about interviewing those committee members, scrutinising the meetings’ minutes and trying to observe as many meetings as possible. That is how the kind of fieldwork-based, “qualitative” social sciences, which economists like to dismiss as “soft” and unscientific, operate. It is true that this approach, too, comes with serious methodological caveats, such as problems of verifiability, selection bias or observer bias. The difference is that other social sciences are open about these limitations, arguing that, while human knowledge about humans is fundamentally different from human knowledge about the natural world, those imperfect observations are extremely important to make.

Compare that humility to that of former central banker Alan Greenspan, one of the architects of the deregulation of finance, and a great believer in models. After the crash hit, Greenspan appeared before a congressional committee in the US to explain himself. “I made a mistake in presuming that the self-interests of organisations, specifically banks and others, were such that they were best capable of protecting their own shareholders and their equity in the firms,” said the man whom fellow economists used to celebrate as “the maestro”.







In other words, Greenspan had been unable to imagine that bankers would run their own bank into the ground. Had the maestro read the tiny pile of books by financial anthropologists, he might have found it easier to imagine such behaviour. Then he would have known that over the past decades banks had adopted a “zero job security” hire-and-fire culture, breeding a “zero-loyalty” mentality that can be summarised as: “If you can be out of the door in five minutes, your horizon becomes five minutes.”

While this was apparently new to Greenspan, it was not to the anthropologist Karen Ho, who did years of fieldwork at a Wall Street bank. Her book Liquidated emphasises the pivotal role of zero job security on Wall Street (the same system governs the City of London). The financial sociologist Vincent Lépinay’s Codes of Finance, a book about a French bank’s complex-financial-products division, describes in convincing detail how institutional memory suffers when people switch jobs frequently and at short notice.

Perhaps the most pernicious effect of the status of economics in public life has been the hegemony of technocratic thinking. Political questions about how to run society have come to be framed as technical issues, fatally diminishing politics as the arena where society debates means and ends. Take a crucial concept such as gross domestic product. As Ha-Joon Chang makes clear in 23 Things They Don’t Tell You About Capitalism, the choices about what not to include in GDP (household work, to name one) are highly ideological. The same applies to inflation, since there is nothing neutral about the decision not to give greater weight to the explosion in housing and stock market prices when calculating inflation.
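A toy calculation shows how far from neutral the weighting decision is. The figures below are invented purely for illustration:

```python
# Toy price index: identical price rises, two different weighting choices.
# All numbers are invented for illustration only.
price_rises = {"food": 0.02, "clothing": 0.01, "housing": 0.10}

token_housing = {"food": 0.45, "clothing": 0.45, "housing": 0.10}
fuller_housing = {"food": 0.30, "clothing": 0.30, "housing": 0.40}

def headline_inflation(weights, rises):
    """Weighted average of price rises: the 'inflation' a chosen basket reports."""
    return sum(weights[item] * rises[item] for item in rises)

print(f"{headline_inflation(token_housing, price_rises):.2%}")   # 2.35%
print(f"{headline_inflation(fuller_housing, price_rises):.2%}")  # 4.90%
```

Same economy, same price changes: which number gets reported as “inflation” is a choice about weights, exactly the kind of choice the article argues is political.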



GDP, inflation and even growth figures are not objective temperature measurements of the economy, no matter how many economists, commentators and politicians like to pretend they are. Much of economics is politics disguised as technocracy – acknowledging this might help open up the space for political debate and change that has been so lacking in the past seven years.

Would it not be extremely useful to take economics down a peg by overhauling the prize to include all the social sciences? The Nobel prize for economics is not even a “real” Nobel prize anyway, having only been set up by the Swedish central bank in 1969. In recent years it has been awarded to less conventional practitioners, such as the psychologist Daniel Kahneman. However, Kahneman was still rewarded for his contribution to the science of economics, still putting that field centre stage.






Think of how frequently the Nobel prize for literature elevates little-known writers or poets to the global stage, or how the peace prize stirs up a vital global conversation: Naguib Mahfouz’s Nobel introduced Arab literature to a mass audience, while last year’s prize for Kailash Satyarthi and Malala Yousafzai put the right of all children to an education on the agenda. Nobel prizes in economics, meanwhile, go to “contributions to methods of analysing economic time series with time-varying volatility” (2003) or the “analysis of trade patterns and location of economic activity” (2008).

A revamped social science Nobel prize could play a similar role, feeding the global conversation with new discoveries and insights from across the social sciences, while always emphasising the need for humility in treating knowledge by humans about humans. One good candidate would be the sociologist Zygmunt Bauman, whose writing on the “liquid modernity” of post-utopian capitalism deserves the largest audience possible. Richard Sennett and his work on the “corrosion of character” among workers in today’s economies would be another. Will economists volunteer to share their prestigious prize of their own accord? Their own mainstream economic assumptions about human selfishness suggest they will not.


Sunday 14 June 2015

Love, intuition and women. Science would wither without them

Boyd Tonkin in the Independent

As it sometimes does, last October the Nobel Committee for the prize in physiology or medicine split its award. Half the pot (of eight million Swedish kronor in all) went to the British-American neuroscientist John O’Keefe, the other to the Norwegian couple who have charted the grid cells in the brain that enable our pathfinding and positioning skills via a sort of “internal GPS”.

May-Britt Moser and Edvard I Moser first met at high school and have worked together for over 30 years. Professor Moser (May-Britt) said after the Nobel nod: “It’s easy for us because we can have breakfast meetings almost every day.” Professor Moser (Edvard) stated: “We have a common project and a common goal … And we depend on each other for succeeding.”

“There were a lot of things that made me decide to marry Edvard,” the other Professor Moser has recalled. Not all had to do with neurological breakthroughs. Once, Edvard gave her a huge umbrella. Open it, he said. “So I opened it above my head, and it rained down small beautiful pieces of paper with little poems on about me.”

This week, another Nobel laureate in the same discipline – Sir Tim Hunt, 2001 – found himself in need of a titanium umbrella in order to fend off the media flak. The 72-year-old biochemist told a conference in South Korea that “girls” caused mayhem in the lab. “You fall in love with them, they fall in love with you and when you criticise them, they cry.” Cue the avalanche of outrage that has now driven Sir Tim – married, by the way, to the distinguished immunologist Professor Mary Collins – out of his honorary post at University College London. In Britain, where only 13 per cent of scientific and engineering professionals are female, his off-the-cuff “banter” has gone down like a tungsten (denser than lead) balloon.

So it should. Yet the champions of equality in science who have justly hooted at Sir Tim’s antique ditty might spare a thought for the Mosers’ partnership. The Norwegian pair are not alone in fusing personal commitment with top-grade scientific collaboration. Last year, in a fascinating study for Nature, Kerri Smith reported that, according to the US National Science Foundation, “just over one-quarter of married people with doctorates had a spouse working in science or engineering”. A 2008 survey found that the proportion of research posts that went to couples had risen from 3 per cent in the 1970s to 13 per cent.

Smith consulted a range of high-flying scientific double acts. They included the Taiwanese cell biologists Lily and Yuh-Nung Jan, who have collaborated since 1967. Lily Jan praised the joint progress made possible by a “very consistent long-term camaraderie”. After years of long-distance romance and research, physicists Claudia Felser and Stuart Parkin now live together in Germany with plum posts at the Max Planck Institutes in (respectively) Dresden and Halle. “Lufthansa and United Airlines will be very unhappy,” said Parkin.

These partnerships in life and lab follow a different, far more equal, pattern to the liaison of master and muse, once common in the arts. Scientists tend not to bother much with history. But the rising number of collaborating duos will know that they can hail as their forerunners the most intellectually fertile pairing of all: between Marie Sklodowska-Curie and Pierre Curie.

Marie had plentiful Hunts of her own to vanquish. In 1903, only a late objection by a Swedish mathematician with feminist sympathies prevented her first Nobel Prize, in physics, from going to Pierre and Henri Becquerel alone. Not that the Nobel selectors learned their lesson. Lise Meitner, who first explained the significance of nuclear fission, never got the call. When Francis Crick, James Watson and Maurice Wilkins shared the Nobel for their work on DNA in 1962, no mention was made of Rosalind Franklin (who had died in 1958). Her research into the double‑helix structure had made their triumph possible.

As any woman scientist will tell you, such neglect and condescension die hard and slow. Yet the atavistic Hunt and his denouncers share a common position. Both would banish Eros from the bench. Cases such as the Mosers suggest that, in some conditions, intimate bonds may even seed creativity. Expel love from the lab, and who knows what angels of deliverance might flee as well?

Besides, in science or any other pursuit, the same seeker can benefit at different stages both from solitary striving and intimate collaboration. You will find moving proof of this in the “autobiographical notes” that Marie Curie appended to her 1923 memoir of her husband. As a lonely Polish student in 1890s Paris, she relished her independence, even at the cost of cold, hunger and isolation in a freezing garret. She wrote: “I shall always consider one of the best memories of my life that period of solitary years exclusively devoted to the studies, finally within my reach, for which I had waited so long.”

Later, as she and Pierre experimented to isolate radium and investigate its properties in a tumbledown hut on the Paris School of Physics site, another kind of bliss took hold: “It was in this miserable old shed that we passed the best and happiest years of our life, devoting our entire days to our work.” Marie and Pierre’s shared quest embraced rapture as well as reason: “One of our joys was to go into our workroom at night; we then perceived on all sides the feebly luminous silhouettes of the bottles or capsules containing our products. It was really a lovely sight and one always new to us. The glowing tubes looked like faint, fairy lights.”

Note the poetry. Sir Tim, in contrast, reveals himself as a strict dualist. Love and tears will ruin your results. On the one hand lies intellect, on the other emotion. As always, the female serves as proxy for the latter. Yet the binary mind in which Hunt believes no more exists in physics than in painting. Investigate the history of scientific discovery and you plunge into a wild labyrinth of Curie-style ecstasies, hunches, chances, blunders, windfalls, visions, guesses, serendipities and unsought “Eureka!” moments.

However, at the entrance to this theme park of happy accidents one statement should stand. Louis Pasteur said: “Chance favours only the prepared mind.” The intuitive breakthrough that rewrites all the rules happens to people who have toiled and failed, toiled again and failed better. Vision blesses the hardest workers. “I’m enough of an artist to draw freely on my imagination,” Einstein said in 1929. “Knowledge is limited. Imagination encircles the world.” But he could get away with such New Agey bromides only because he was Albert Einstein.

Still, the scientific evidence in favour of intuitions, dreams and visions is strikingly widespread. In 1865, August Kekulé slumps in front of the fire and, in a reverie, sees the atoms of the benzene molecule “twisting and moving around in a snake-like manner”. Then, “one of the snakes got hold of its own tail, and tauntingly the whole structure whirled before my eyes”.

In 1869, Dmitri Mendeleev grasps the structure of the periodic table in another dream. In a Budapest park in 1882, Nikola Tesla recites Goethe’s Faust and then imagines the electrical induction motor. “The idea came like a flash of lightning and in an instant the truth was revealed… The images I saw were wonderfully sharp and clear.”

More recently, the Nobel-winning biochemist Kary Mullis has written a Thomas Pynchon-like account of the day in 1983 when during a nocturnal drive in California he “saw” the pattern of the DNA polymerase chain reaction that kick-started genetic medicine. With his girlfriend (a chemist in the same lab), he had left for a weekend in the woods. “My little silver Honda’s front tyres pulled us through the mountains… My mind drifted back into the laboratory. DNA chains coiled and floated. Lurid blue and pink images of electric molecules injected themselves somewhere between the mountain road and my eyes…”

A self-mythologising tinge colours many such memoirs of inspiration. They uncannily tend to resemble one another. All the same, these “Eureka!” narratives have a consistent theme, of a break or rest after thwarted labour. The pioneer of quantum mechanics Paul Dirac wrote that “I found the best ideas usually came, not when one was actively striving for them, but when one was in a more relaxed state”; in his case, via “long solitary walks on Sundays”. In science, the unconscious can work hardest when the intellect has downed tools.

In which case, the flight from emotion – from Tim Hunt’s dreaded tears and love – may sterilise more than fertilise. Shun “girls”, by which he seems to mean all subjectivity, and the seeker risks falling into an antiseptic void.

But enough: it feels unscientific, to say the least, to pillory a bloke for a gaffe that shows up a culture and an epoch more than an individual man. Perhaps Sir Tim, and the Royal Society that clumsily rushed to distance itself from him despite its own distinctly patriarchal history, could lay the matter to rest with a suitable donation. It ought to go to the Marie Curie charity for terminal care, which since 1948 has enlisted science and research to strengthen love – and to dry tears.

Sunday 6 May 2012

Brain drain or not, the right to emigrate is fundamental

S A Aiyer

Socialists like health minister Ghulam Nabi Azad won't admit it, but they rather liked the Berlin Wall. They think it's morally right to keep citizens captive at home, unable to migrate for better prospects. Azad has proposed not a brick wall but a financial one: he wants all doctors going to the US for higher studies to sign a financial bond that will be forfeited if they do not return.




Sorry, but the right to emigrate is fundamental. States can curb immigration, but not emigration. The UN declaration of human rights says in Article 13, "Everyone has the right to leave any country, including his own." Article 12 of the International Covenant on Civil and Political Rights incorporates this right into treaty law. It says: "Everyone shall be free to leave any country, including his own. The above-mentioned rights shall not be subject to any restrictions except those provided by law necessary to protect national security, public order, public health or morals or the rights and freedoms of others." The public health exception relates to communicable diseases, not a shortage of doctors.



Hitler didn't give German Jews the right to migrate. Communist East Germany thought it had a right to shoot citizens attempting to escape over the Berlin Wall. The Soviet Union mostly had strict curbs on emigration, but allowed the mass exit of its Jews to Israel after the 1967 war in which Moscow backed the Arabs. Moscow imposed a "diploma tax" on emigrants with higher education, to claw back the cost of their education. Israel often picked up the bill, leading to sneers that the Soviet Union was selling Jews. International protests obliged Moscow to abolish the tax.



Like the Soviets, Azad wants to claw back sums spent on educating doctors. Like East Germany, he seeks to erect exit barriers by denying Indian doctors a 'no objection certificate' to practise in the US. The right to emigrate does not enter his calculations: Azad does not want this azaadi (freedom)!



Many Indians will back him, saying the brain drain imposes high costs on India. Well, all principles have some costs, but that's no reason to abandon them. Azad wants curbs just on doctors, but the principle applies to all Indians. Would India be better off if it had kept captive at home economists like Amartya Sen and Jagdish Bhagwati? Three Indian migrants to the US have won Nobel Prizes: Har Gobind Khorana (medicine), Subrahmanyan Chandrasekhar (physics) and V Ramakrishnan (chemistry). Had they been stopped from leaving India, would they have ever risen to such heights?



Cost estimates of the brain drain are exaggerated or downright false. Remittances from overseas Indians are now around $60 billion a year. NRI bank deposits bring up to $30 billion a year. Together, they greatly exceed India's entire spending on education (around $75 billion). Even more valuable are skills brought back by returnees.



Remittances skyrocketed only after India made it easier in the 1990s for students to go abroad. One lakh (100,000) per year go to the US alone. The number of US citizens of Indian origin has tripled since 1990 to three million, and the US has replaced the Gulf as the main source of remittances.



The brain drain has anyway given way to brain circulation. Youngsters going abroad actually have very limited skills. But they hugely improve their skills abroad, mainly through job experience, so returnees bring back much brainpower.



Indian returnees were relatively few during the licence-permit raj, because omnipresent controls stifled domestic opportunities. But economic liberalization has created a boom in opportunities of every sort, so more Indians are returning. Azad should note that the fast expansion of private hospitals has attracted back many doctors. Scientists, software engineers, managers and professionals of all sorts have flocked back. This carries a simple policy lesson: create opportunity, not barriers.



Millions of Indians will not come back. Yet they do not constitute a drain. They have become huge financial assets for India through remittances and investments.



They have also become a foreign policy asset. Three million Indian Americans now occupy high positions in academia, Wall Street, business and professions. They have become important political contributors, and two have entered politics and become state governors (Bobby Jindal and Nikki Haley). Indian Americans have become a formidable lobby, helping shift US policy in India's favour, to Pakistan's dismay.



However, these are secondary issues. The main issue is human freedom. The UN declaration of human rights recognizes the right to migrate. This fundamental freedom has more value by far than the financial or foreign policy value of the diaspora. Never forget this in the brain drain debate.