Showing posts with label scientific.

Thursday 30 June 2022

Scientific Facts have a Half-Life - Life is Poker not Chess 4

Abridged and adapted from Thinking in Bets by Annie Duke





The Half-Life of Facts, by Samuel Arbesman, is a great read about how practically every fact we’ve ever known has been subject to revision or reversal. The book talks about the coelacanth, a fish believed to have gone extinct in the Late Cretaceous period, the same period that saw the end of the dinosaurs and many other species. In the late 1930s, and again independently in the 1950s, coelacanths were found alive and well. Arbesman also cites a list of 187 species of mammals declared extinct, more than a third of which have subsequently been rediscovered.
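To make the title’s metaphor concrete: a ‘half-life’ implies exponential decay. As a rough illustrative sketch (my numbers, not Arbesman’s), if half of a field’s accepted claims were revised every 45 years, the fraction still standing after t years would be (1/2) raised to the power t/45:

```python
# Rough illustration of a "half-life of facts" (assumed 45-year half-life;
# illustrative numbers, not Arbesman's): exponential decay of surviving claims.

def surviving_fraction(years, half_life=45):
    """Fraction of today's accepted claims still standing after `years`."""
    return 0.5 ** (years / half_life)

print(f"{surviving_fraction(15):.0%} still standing after 15 years")   # ~79%
print(f"{surviving_fraction(45):.0%} after one half-life")             # 50%
print(f"{surviving_fraction(90):.0%} after two half-lives")            # 25%
```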


Given that even scientific facts have an expiration date, we would all be well advised to take a good hard look at our beliefs, which are formed and updated in a much more haphazard way than in science.


We would be better served as communicators and decision makers if we thought less about whether we are confident in our beliefs and more about how confident we are about each of our beliefs. What if, in addition to expressing what we believe, we also rated our level of confidence about the accuracy of our belief on a scale of zero to ten? Zero would mean we are certain a belief is not true. Ten would mean we are certain that our belief is true. Forcing ourselves to express how sure we are of our beliefs brings into plain sight the probabilistic nature of those beliefs: what we believe is almost never 100% or 0% accurate but, rather, somewhere in between.
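As a minimal sketch of that arithmetic (my own illustration, not from the book), a 0-to-10 rating maps naturally onto a probability, and new evidence then nudges that probability a little rather than flipping it outright:

```python
# Toy sketch (illustrative, not from the book): read a 0-10 confidence rating
# as a probability, then let disconfirming evidence make a *small* adjustment
# instead of a wholesale flip from "right" to "wrong".

def scale_to_probability(rating):
    """0 = certain the belief is false, 10 = certain it is true."""
    return rating / 10

def nudge(probability, evidence_strength, learning_rate=0.2):
    """Move the belief part of the way toward what the evidence suggests.
    evidence_strength: +1.0 strongly confirming ... -1.0 strongly disconfirming."""
    target = 1.0 if evidence_strength > 0 else 0.0
    return probability + abs(evidence_strength) * learning_rate * (target - probability)

belief = scale_to_probability(8)                # "I'm an 8 out of 10 on this"
belief = nudge(belief, evidence_strength=-0.5)  # moderately disconfirming evidence
print(f"Revised confidence: {belief:.2f}")      # 0.72 -- adjusted, not abandoned
```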


Incorporating uncertainty in the way we think about what we believe creates open-mindedness, moving us closer to a more objective stance towards information that disagrees with us. We are less likely to succumb to motivated reasoning since it feels better to make small adjustments in degrees of certainty instead of having to grossly downgrade from ‘right’ to ‘wrong’. This shifts us away from treating information that disagrees with us as a threat, as something we have to defend against, making us better able to truthseek.


There is no sin in finding out there is evidence that contradicts what we believe. The only sin is in not using that evidence as objectively as possible to refine that belief going forward. By saying, ‘I’m 80%’ and thereby communicating we aren’t sure, we open the door for others to tell us what they know. They realise they can contribute without having to confront us by saying or implying, ‘You’re wrong’. Admitting we are not sure is an invitation for help in refining our beliefs and that will make our beliefs more accurate over time as we are more likely to gather relevant information.


Acknowledging that decisions are bets based on our beliefs, getting comfortable with uncertainty and redefining right and wrong are integral to a good overall approach to decision making.


Saturday 6 June 2020

Scientific or Pseudo Knowledge? How Lancet's reputation was destroyed

The now-retracted paper halted hydroxychloroquine trials. Studies like this determine how people live or die tomorrow, writes James Heathers in The Guardian

 



The Lancet is one of the oldest and most respected medical journals in the world. It recently published an article on Covid patients receiving hydroxychloroquine, with a dire conclusion: the drug increases heartbeat irregularities and decreases hospital survival rates. This result was treated as authoritative, and major drug trials were immediately halted – because why treat anyone with an unsafe drug?

Now, that Lancet study has been retracted, withdrawn from the literature entirely, at the request of three of its authors who “can no longer vouch for the veracity of the primary data sources”. Given the seriousness of the topic and the consequences of the paper, this is one of the most consequential retractions in modern history.


It is natural to ask how this is possible. How did a paper of such consequence get discarded like a used tissue by some of its authors only days after publication? If the authors don’t trust it now, how did it get published in the first place?

The answer is quite simple. It happened because peer review, the formal process of reviewing scientific work before it is accepted for publication, is not designed to detect anomalous data. It makes no difference if the anomalies are due to inaccuracies, miscalculations, or outright fraud. This is not what peer review is for. While it is the internationally recognised badge of “settled science”, its value is far more complicated.

At its best, peer review is a slow and careful evaluation of new research by appropriate experts. It involves multiple rounds of revision that remove errors, strengthen analyses, and noticeably improve manuscripts.

At its worst, it is merely window dressing that gives the unwarranted appearance of authority, a cursory process which confers no real value, enforces orthodoxy, and overlooks both obvious analytical problems and outright fraud entirely.

Regardless of how any individual paper is reviewed – and the experience is usually somewhere between the above extremes – the sad truth is peer review in its entirety is struggling, and retractions like this drag its flaws into an incredibly bright spotlight.

The ballistics of this problem are well known. To start with, peer review is entirely unrewarded. The internal currency of science consists entirely of producing new papers, which form the cornerstone of your scientific reputation. There is no emphasis on reviewing the work of others. If you spend several days in a continuous back-and-forth technical exchange with authors, trying to improve their manuscript, adding new analyses, shoring up conclusions, no one will ever know your name. Neither are you paid. Peer review originally fitted under an amorphous idea of academic “service” – the tasks that scientists were supposed to perform as members of their community. This is a nice idea, but is almost invariably maintained by researchers with excellent job security. Some senior scientists are notorious for peer reviewing manuscripts rarely or even never – because it interferes with the task of producing more of their own research.

However, even if reliable volunteers for peer review can be found, it is increasingly clear that it is insufficient. The vast majority of peer-reviewed articles are never checked for any form of analytical consistency, nor can they be – journals do not require manuscripts to have accompanying data or analytical code and often will not help you obtain them from authors if you wish to see them. Authors usually have zero formal, moral, or legal requirements to share the data and analytical methods behind their experiments. Finally, if you locate a problem in a published paper and bring it to either of these parties, the median response is no response at all – silence.

This is usually not because authors or editors are negligent or uncaring. Usually, it is because they are trying to keep up with the component difficulties of keeping their scientific careers and journals respectively afloat. Unfortunately, those goals are directly in opposition – authors publishing as much as possible means back-breaking amounts of submissions for journals. Increasingly time-poor researchers, busy with their own publications, often decline invitations to review. Subsequently, peer review is then cursory or non-analytical.

And even still, we often muddle through. Until we encounter extraordinary circumstances.






Peer review during a pandemic faces a brutal dilemma – the moral importance of releasing important information with planetary consequences quickly, versus the scientific importance of evaluating the presented work fully – while trying to recruit scientists, already busier than usual due to their disrupted lives, to review work for free. And, after this process is complete, publications face immediate scrutiny by a much larger group of engaged scientific readers than usual, who treat publications which affect the health of every living human being with the scrutiny they deserve.

The consequences are extreme. What happens to any of us, on discovering a persistent cough and respiratory difficulties, is directly determined by this research. Papers like today’s retraction determine how people live or die tomorrow. They affect what drugs are recommended, what treatments are available, and how we get them sooner.

The immediate solution to this problem of extreme opacity, which allows flawed papers to hide in plain sight, has been advocated for years: require more transparency, mandate more scrutiny. Prioritise publishing papers which present data and analytical code alongside a manuscript. Re-analyse papers for their accuracy before publication, instead of just assessing their potential importance. Engage expert statistical reviewers where necessary, pay them if you must. Be immediately responsive to criticism, and enforce this same standard on authors. The alternative is more retractions, more missteps, more wasted time, more loss of public trust … and more death.

Monday 22 October 2018

On Sabarimala - Why are rational, scientific women upset?


By Girish Menon

“I do not wish to join any club that will accept me as a member,” Groucho Marx once quipped. I will pass on this wisdom to women of menstruating age whose efforts to enter Sabarimala have been stopped by their own sisters, male priests and political activists.

In essence, the Indian Supreme Court has decided in favour of allowing all women to worship at Sabarimala, as failing to do so could be interpreted as discriminatory and in violation of every Indian’s fundamental right to equality guaranteed by the Constitution.

The recent violence is testimony to the failure of the Indian state as it could not ensure protection to those women who wished to worship at Sabarimala. This is a replay of Ayodhya 1992 when a mob destroyed the Babri Masjid in violation of court strictures.

I have read reports that those opposed to the Supreme Court verdict in the Sabarimala case have now filed an appeal with the Supreme Court. This is a welcome move and should have been the first step in their protests instead of physically stopping women from entering the temple.

In a democracy the legislature is superior to the courts. So it should have been up to the political activists to pass legislation that suits their political views. However, such legislation will always be of secondary status to the fundamental rights of every Indian which can only be curtailed under special circumstances, if at all.

Tradition

Tradition has been quoted as the main argument to defend the practices of Sabarimala. Leaders of most denominations use this argument to protest against demands for reforms. This argument gains most momentum because of the historical failure to pass a uniform civil code bill that applies to every Indian uniformly.

In the case of Hindus, the Paliyam Satyagraha in Chendamangalam in 1947-48 enabled a break with tradition, as lower-caste Hindus were thereafter allowed to enter temples. Thus tradition is only a convenient term used to fight against reform and modernity.

Uniting all Hindus

The demographic changes in India along with news of rapidly growing Islam and Christianity in many parts of the world give momentum to the BJP argument that ‘Hinduism is in danger’. Under such circumstances can the BJP afford to alienate Hindu women who clamour for the right to worship at Sabarimala?

In the book The Global Minotaur, the author Yanis Varoufakis defines the term aporia:

        that state of intense puzzlement in which we find ourselves when our certainties fall to pieces; when suddenly we get caught in an impasse, at a loss to explain what our eyes can see, our fingers can touch, our ears can hear. At those rare moments, as our reason valiantly struggles to fathom what the senses are reporting, our aporia humbles us and readies the prepared mind for previously unbearable truths.

For Indians, the moment of aporia has arrived. Now may be the best time to introduce the uniform civil code and ban religious conversions. Else the charge of India turning into a ‘Hindu Pakistan’ may become a self-fulfilling prophecy.

On the other hand, why modern, rational women with a scientific bent of mind, who claim that ‘God does not exist’, would fight for the right to worship Ayyappa is something I have found hard to understand.

Saturday 5 May 2018

Into the brave new age of irrationality

The assault on rationality is part of a concerted political strategy writes Sanjay Rajoura in The Hindu


Much has been written and said about the assault on liberal arts under way in India since the new political era dawned. But the real assault is on science and rationality. And it has not been difficult to mount this attack.

For long, India has, as a nation, proudly claimed to be a society of belief. And Indians like to assert that faith is a ‘way of life’ here. Terms such as modernity, rational thinking and scientific analysis are often frowned upon, and misdiagnosed as disrespect to Indian culture.


Freshly minted spokesmodel

In recent years, we have entered a new era. I call it the Era of Irrationality. The new Chief Minister of Tripura, Biplab Kumar Deb, is the freshly minted spokesmodel of this bold, new era.

There appears to be a relay race among people in public positions, each one making an astonishingly ridiculous claim and then passing on the baton. Mr. Deb’s claim that the Internet existed in the times of the Mahabharata is the latest. But there have been several others before it: that Ganesh was the first example of plastic surgery, that Darwin’s theory of evolution is hokum because nobody has seen monkeys turning into humans, and that Stephen Hawking had said the Vedas have a theory superior to Einstein’s E = mc².

Such statements have made us the laughing stock of the global scientific community. But more importantly, they also undermine the significant scientific achievements we have made post-Independence.

We cannot even dismiss these as random remarks by the fringe, the babas and the sadhus. These claims are often made by public officials (it’s another matter that the babas and sadhus are now occupying several public offices). The assault on rationality is a consequence of a concerted strategy of political forces. As rational thinking thins, the same political forces fatten.

We Indians have never really adopted the scientific temper, irrespective of our education. It’s evident from our obsession with crackpot sciences such as astrology and palmistry in our daily lives. However, in the past four years, the belief in pseudo-sciences has gained a political fig leaf as have tall, unverifiable claims on science.

The cultivation of scientific temper involves asking questions and demanding empirical evidence. It has no place for blind faith. The ruling political dispensation is uncomfortable with questioning Indians. But at the same time, it also wants to come across as a dispensation that champions a 21st century modern India. Therein lies a catch-22 situation.

So, they have devised a devious strategy to invest in the culture of blind belief. They already have a willing constituency. Ludicrous statements like those mentioned above — made by leaders in positions of power with alarming frequency — go on to legitimise and boost the Era of Irrationality.

An unscientific society makes the job of an incompetent ruler a lot easier. No questions are asked; not even basic ones. The ruler has to just make a claim and the believers will worship him. Rather than conforming, a truly rational community often questions disparity, exploitation, persecution on the basis of caste, religion or gender. It demands answers and accountability for such violations, which are often based on irrational whims. Hence rationality must be on top of the casualty list followed quickly by the minorities, Dalits, women, liberals. For the ‘Irrationality project’ to succeed, the ruler needs a willing suspension of disbelief on a mass scale.


Science v. technology

The vigour with which the government is making an assault on the scientific temper only confirms that it is actually frightened of it. This is the reason why authoritarian regimes are often intolerant of those who champion the spirit of science, but encourage scientists who will launch satellites and develop nuclear weapons — even as they break coconuts, chant hymns and press “Enter” with their fingers laden with auspicious stones.

These ‘techno-scientists’ are what I call ‘the DJs of the scientific community’. And they are often the establishment’s yes-men and yes-women.

The founders of the Constitution were aware of this. Hence the words “scientific temper” and “the spirit of inquiry and reform” find a place in the Constitution, along with “secular” (belatedly), “equality” and “rights”. To dismantle secularism, dilute equality and push back rights, it is imperative to destroy a scientific temperament.

The indoctrination against the scientific temper begins very early in our lives. It starts in our families and communities where young minds are aggressively discouraged from questioning authority and asking questions. An upper caste child for example may be forced to follow customs, which among others include practising and subscribing to the age-old caste system. The same methodology is used to impose fixed gender, sexual and religious identities. As a result, we are hardwired to be casteist, majoritarian and misogynist.

The final step in the ‘Irrationality project’ is to inject, with regularity, preposterous and over-the-top claims about the nation’s past. This effectively blurs our vision of the present.

The world is busy studying string theory, the god particle in particle colliders, and quantum mechanics. But we are busy expanding our chest size with claims of a fantastic yore.

Tuesday 27 February 2018

Overcoming superstition - Persuasion lessons for rationalists

Rahul Siddharthan in The Hindu









The Indian Constitution is unique in listing, among fundamental duties, the duty of each citizen “to develop the scientific temper, humanism and the spirit of inquiry and reform” (Article 51A). Jawaharlal Nehru was the first to use the expression “scientific temper”, which he described with his usual lucidity in The Discovery of India (while also quoting Blaise Pascal on the limits of reason). And yet, decades later, superstitious practices abound in India, including among the highly educated.


Superstition exists

India may be unusual in the degree and variety of superstitious practices, even among the educated, but superstition exists everywhere. In his recent Editorial page article, “Science should have the last word” (The Hindu, February 17), Professor Jayant V. Narlikar, cosmologist and a life-long advocate for rationality, cites Czech astronomer Jiří Grygar’s observation that though the Soviets suppressed superstitious ideas in then-Czechoslovakia during the occupation, superstition arose again in the “free-thinking”, post-Soviet days. Superstition never went away: people just hesitated to discuss it in public.

Similarly, China suppressed superstition and occult practices during Mao Zedong’s rule. But after the economic reforms and relative openness that began in the late 1970s, superstition reportedly made a comeback, with even top party officials consulting soothsayers on their fortunes. In India, the rationalist movements of Periyar and others have barely made a dent. No country, no matter its scientific prowess, has conquered superstition.

On the positive side, internationally, increasing numbers of people live happily without need for superstition. The most appalling beliefs and rituals have largely been eradicated the world over — from blood-letting in medicine to human sacrifice — and, in India, practices such as sati. This is due to the efforts put in by social reform campaigners, education and empowerment (of women in particular). Yet, surviving superstitions can be dangerous too, for example when they contradict medical advice.


Explaining it

Why is it so hard to remove superstitions? Fundamentally, a belief may be difficult to shake off simply because of deep-seated habituation. In his memoir Surely You’re Joking, Mr. Feynman!, the physicist Richard P. Feynman wrote about being hypnotised voluntarily (hypnosis is always voluntary) on stage, doing what was asked, and thinking to himself that he was just agreeing to everything to not “disturb the situation”. Finally, the hypnotist announced that Feynman would not go straight back to his chair but would walk all around the room first. Feynman decided that this was ridiculous; he would walk straight back to his seat. “But then,” he said, “an annoying feeling came over me: I felt so uncomfortable that I couldn’t continue. I walked all the way around the hall.”

We have all had such “uncomfortable feelings” when trying to do something differently, even if it seems to be logically better: whether it’s a long-standing kitchen practice, or an entrenched approach to classroom teaching, or something else in daily life. Perhaps we are all hypnotised by our previous experiences, and superstition, in particular, is a form of deep-seated hypnosis that is very hard to undo. It is undone only when the harm is clear and evident, as in the medieval practices alluded to earlier. Such beliefs are strengthened by a confirmation bias (giving importance to facts that agree with our preconceptions and ignoring others) and other logical holes. Recent research even shows how seeing the same evidence can simultaneously strengthen oppositely-held beliefs (a phenomenon called Bayesian belief polarisation).
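To see how that can happen, here is a minimal sketch (illustrative numbers of my own, not from the research cited): two observers apply Bayes’ rule to the same evidence but trust different likelihood models, so identical observations push their beliefs apart rather than together.

```python
# Illustrative sketch (made-up numbers, not from the cited research): two
# observers see the same evidence E three times, but trust different likelihood
# models for how probable E is under hypothesis H versus not-H. They diverge.

def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H | E) by Bayes' rule."""
    numerator = p_e_given_h * prior
    return numerator / (numerator + p_e_given_not_h * (1 - prior))

belief_a = belief_b = 0.5          # both start genuinely undecided about H

for _ in range(3):                 # the same evidence arrives three times
    belief_a = bayes_update(belief_a, p_e_given_h=0.8, p_e_given_not_h=0.3)
    belief_b = bayes_update(belief_b, p_e_given_h=0.2, p_e_given_not_h=0.6)

print(f"Observer A: P(H) = {belief_a:.2f}")   # ~0.95 -- more convinced
print(f"Observer B: P(H) = {belief_b:.2f}")   # ~0.04 -- more sceptical
```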


Disagreement in science

Dogmatism about science can be unjustified too. All scientific theories have limitations. Newton’s theories of mechanics and gravitation were superseded by Einstein’s. Einstein’s theory of gravity has no known limitations at the cosmological scale, but is incompatible with quantum mechanics. The evolution of species is an empirical fact: the fossil record attests it, and we can also observe it in action in fast-breeding species. Darwinism is a theory to explain how it occurs. Today’s version is a combination of Darwin’s original ideas, Mendelian genetics and population biology, with much empirical validation and no known failures. But it does have gaps. For example, epigenetic inheritance is not well understood and remains an active area of research. Incidentally, Dr. Narlikar in his article has suggested that Darwinism’s inability to explain the origin of life is a gap. Few evolutionary biologists would agree. Darwin’s book was after all titled The Origin of Species, and the origin of life would seem beyond its scope. But this is an example of how scientists can disagree on details while agreeing on the big picture.

How then does one eradicate superstition? Not, as the evidence suggests, by preaching or legislating against it. Awareness campaigns against dangerous superstitions along with better education and scientific outreach may have some impact but will be a slow process.

Today, the topic of “persuasion” is popular in the psychology, social science and marketing communities. Perhaps scientists have something to learn here too. Pascal, whom Nehru cited on reason, wrote on persuasion too. He observed that the first step is to see the matter from the other person’s point of view and acknowledge the validity of their perception, and then bring in its limitations. “People are generally better persuaded by the reasons which they have themselves discovered than by those which have come into the mind of others.”

Such a strategy may be more successful than the aggressive campaigns of rationalists such as Richard Dawkins. Nevertheless, “harmless” superstitions are likely to remain with humanity forever.

Tuesday 27 June 2017

Is the staggeringly profitable business of scientific publishing bad for science?

Stephen Buranyi in The Guardian


In 2011, Claudio Aspesi, a senior investment analyst at Bernstein Research in London, made a bet that the dominant firm in one of the most lucrative industries in the world was headed for a crash. Reed-Elsevier, a multinational publishing giant with annual revenues exceeding £6bn, was an investor’s darling. It was one of the few publishers that had successfully managed the transition to the internet, and a recent company report was predicting yet another year of growth. Aspesi, though, had reason to believe that that prediction – along with those of every other major financial analyst – was wrong.

The core of Elsevier’s operation is in scientific journals, the weekly or monthly publications in which scientists share their results. Despite the narrow audience, scientific publishing is a remarkably big business. With total global revenues of more than £19bn, it weighs in somewhere between the recording and the film industries in size, but it is far more profitable. In 2010, Elsevier’s scientific publishing arm reported profits of £724m on just over £2bn in revenue. It was a 36% margin – higher than Apple, Google, or Amazon posted that year.

But Elsevier’s business model seemed a truly puzzling thing. In order to make money, a traditional publisher – say, a magazine – first has to cover a multitude of costs: it pays writers for the articles; it employs editors to commission, shape and check the articles; and it pays to distribute the finished product to subscribers and retailers. All of this is expensive, and successful magazines typically make profits of around 12-15%.

The way to make money from a scientific article looks very similar, except that scientific publishers manage to duck most of the actual costs. Scientists create work under their own direction – funded largely by governments – and give it to publishers for free; the publisher pays scientific editors who judge whether the work is worth publishing and check its grammar, but the bulk of the editorial burden – checking the scientific validity and evaluating the experiments, a process known as peer review – is done by working scientists on a volunteer basis. The publishers then sell the product back to government-funded institutional and university libraries, to be read by scientists – who, in a collective sense, created the product in the first place.

It is as if the New Yorker or the Economist demanded that journalists write and edit each other’s work for free, and asked the government to foot the bill. Outside observers tend to fall into a sort of stunned disbelief when describing this setup. A 2004 parliamentary science and technology committee report on the industry drily observed that “in a traditional market suppliers are paid for the goods they provide”. A 2005 Deutsche Bank report referred to it as a “bizarre” “triple-pay” system, in which “the state funds most research, pays the salaries of most of those checking the quality of research, and then buys most of the published product”.

Scientists are well aware that they seem to be getting a bad deal. The publishing business is “perverse and needless”, the Berkeley biologist Michael Eisen wrote in a 2003 article for the Guardian, declaring that it “should be a public scandal”. Adrian Sutton, a physicist at Imperial College, told me that scientists “are all slaves to publishers. What other industry receives its raw materials from its customers, gets those same customers to carry out the quality control of those materials, and then sells the same materials back to the customers at a vastly inflated price?” (A representative of RELX Group, the official name of Elsevier since 2015, told me that it and other publishers “serve the research community by doing things that they need that they either cannot, or do not do on their own, and charge a fair price for that service”.)

Many scientists also believe that the publishing industry exerts too much influence over what scientists choose to study, which is ultimately bad for science itself. Journals prize new and spectacular results – after all, they are in the business of selling subscriptions – and scientists, knowing exactly what kind of work gets published, align their submissions accordingly. This produces a steady stream of papers, the importance of which is immediately apparent. But it also means that scientists do not have an accurate map of their field of inquiry. Researchers may end up inadvertently exploring dead ends that their fellow scientists have already run up against, solely because the information about previous failures has never been given space in the pages of the relevant scientific publications. A 2013 study, for example, reported that half of all clinical trials in the US are never published in a journal.

According to critics, the journal system actually holds back scientific progress. In a 2008 essay, Dr Neal Young of the National Institutes of Health (NIH), which funds and conducts medical research for the US government, argued that, given the importance of scientific innovation to society, “there is a moral imperative to reconsider how scientific data are judged and disseminated”.

Aspesi, after talking to a network of more than 25 prominent scientists and activists, had come to believe the tide was about to turn against the industry that Elsevier led. More and more research libraries, which purchase journals for universities, were claiming that their budgets were exhausted by decades of price increases, and were threatening to cancel their multi-million-pound subscription packages unless Elsevier dropped its prices. State organisations such as the American NIH and the German Research Foundation (DFG) had recently committed to making their research available through free online journals, and Aspesi believed that governments might step in and ensure that all publicly funded research would be available for free, to anyone. Elsevier and its competitors would be caught in a perfect storm, with their customers revolting from below, and government regulation looming above.

In March 2011, Aspesi published a report recommending that his clients sell Elsevier stock. A few months later, in a conference call between Elsevier management and investment firms, he pressed the CEO of Elsevier, Erik Engstrom, about the deteriorating relationship with the libraries. He asked what was wrong with the business if “your customers are so desperate”. Engstrom dodged the question. Over the next two weeks, Elsevier stock tumbled by more than 20%, losing £1bn in value. The problems Aspesi saw were deep and structural, and he believed they would play out over the next half-decade – but things already seemed to be moving in the direction he had predicted.

Over the next year, however, most libraries backed down and committed to Elsevier’s contracts, and governments largely failed to push an alternative model for disseminating research. In 2012 and 2013, Elsevier posted profit margins of more than 40%. The following year, Aspesi reversed his recommendation to sell. “He listened to us too closely, and he got a bit burned,” David Prosser, the head of Research Libraries UK, and a prominent voice for reforming the publishing industry, told me recently. Elsevier was here to stay.


Aspesi was not the first person to incorrectly predict the end of the scientific publishing boom, and he is unlikely to be the last. It is hard to believe that what is essentially a for-profit oligopoly functioning within an otherwise heavily regulated, government-funded enterprise can avoid extinction in the long run. But publishing has been deeply enmeshed in the science profession for decades. Today, every scientist knows that their career depends on being published, and professional success is especially determined by getting work into the most prestigious journals. The long, slow, nearly directionless work pursued by some of the most influential scientists of the 20th century is no longer a viable career option. Under today’s system, the father of genetic sequencing, Fred Sanger, who published very little in the two decades between his 1958 and 1980 Nobel prizes, may well have found himself out of a job.

Even scientists who are fighting for reform are often not aware of the roots of the system: how, in the boom years after the second world war, entrepreneurs built fortunes by taking publishing out of the hands of scientists and expanding the business on a previously unimaginable scale. And no one was more transformative and ingenious than Robert Maxwell, who turned scientific journals into a spectacular money-making machine that bankrolled his rise in British society. Maxwell would go on to become an MP, a press baron who challenged Rupert Murdoch, and one of the most notorious figures in British life. But his true importance was far larger than most of us realise. Improbable as it might sound, few people in the last century have done more to shape the way science is conducted today than Maxwell.

In 1946, the 23-year-old Robert Maxwell was working in Berlin and already had a significant reputation. Although he had grown up in a poor Czech village, he had fought for the British army during the war as part of a contingent of European exiles, winning a Military Cross and British citizenship in the process. After the war, he served as an intelligence officer in Berlin, using his nine languages to interrogate prisoners. Maxwell was tall, brash, and not at all content with his already considerable success – an acquaintance at the time recalled him confessing his greatest desire: “to be a millionaire”.

At the same time, the British government was preparing an unlikely project that would allow him to do just that. Top British scientists – from Alexander Fleming, who discovered penicillin, to the physicist Charles Galton Darwin, grandson of Charles Darwin – were concerned that while British science was world-class, its publishing arm was dismal. Science publishers were mainly known for being inefficient and constantly broke. Journals, which often appeared on cheap, thin paper, were produced almost as an afterthought by scientific societies. The British Chemical Society had a months-long backlog of articles for publication, and relied on cash handouts from the Royal Society to run its printing operations.

The government’s solution was to pair the venerable British publishing house Butterworths (now owned by Elsevier) with the renowned German publisher Springer, to draw on the latter’s expertise. Butterworths would learn to turn a profit on journals, and British science would get its work out at a faster pace. Maxwell had already established his own business helping Springer ship scientific articles to Britain. The Butterworths directors, being ex-British intelligence themselves, hired the young Maxwell to help manage the company, and another ex-spook, Paul Rosbaud, a metallurgist who spent the war passing Nazi nuclear secrets to the British through the French and Dutch resistance, as scientific editor.

They couldn’t have begun at a better time. Science was about to enter a period of unprecedented growth, having gone from being a scattered, amateur pursuit of wealthy gentlemen to a respected profession. In the postwar years, it would become a byword for progress. “Science has been in the wings. It should be brought to the centre of the stage – for in it lies much of our hope for the future,” wrote the American engineer and Manhattan Project administrator Vannevar Bush, in a 1945 report to President Harry S Truman. After the war, government emerged for the first time as the major patron of scientific endeavour, not just in the military, but through newly created agencies such as the US National Science Foundation, and the rapidly expanding university system.

When Butterworths decided to abandon the fledgling project in 1951, Maxwell offered £13,000 (about £420,000 today) for both Butterworth’s and Springer’s shares, giving him control of the company. Rosbaud stayed on as scientific director, and named the new venture Pergamon Press, after a coin from the ancient Greek city of Pergamon, featuring Athena, goddess of wisdom, which they adapted for the company’s logo – a simple line drawing appropriately representing both knowledge and money.

In an environment newly flush with cash and optimism, it was Rosbaud who pioneered the method that would drive Pergamon’s success. As science expanded, he realised that it would need new journals to cover new areas of study. The scientific societies that had traditionally created journals were unwieldy institutions that tended to move slowly, hampered by internal debates between members about the boundaries of their field. Rosbaud had none of these constraints. All he needed to do was to convince a prominent academic that their particular field required a new journal to showcase it properly, and install that person at the helm of it. Pergamon would then begin selling subscriptions to university libraries, which suddenly had a lot of government money to spend.

Maxwell was a quick study. In 1955, he and Rosbaud attended the Geneva Conference on Peaceful Uses of Atomic Energy. Maxwell rented an office near the conference and wandered into seminars and official functions offering to publish any papers the scientists had come to present, and asking them to sign exclusive contracts to edit Pergamon journals. Other publishers were shocked by his brash style. Daan Frank, of North Holland Publishing (now owned by Elsevier) would later complain that Maxwell was “dishonest” for scooping up scientists without regard for specific content.

Rosbaud, too, was reportedly put off by Maxwell’s hunger for profit. Unlike the humble former scientist, Maxwell favoured expensive suits and slicked-back hair. Having rounded his Czech accent into a formidably posh, newsreader basso, he looked and sounded precisely like the tycoon he wished to be. In 1955, Rosbaud told the Nobel prize-winning physicist Nevill Mott that the journals were his beloved little “ewe lambs”, and Maxwell was the biblical King David, who would butcher and sell them for profit. In 1956, the pair had a falling out, and Rosbaud left the company.

By then, Maxwell had taken Rosbaud’s business model and turned it into something all his own. Scientific conferences tended to be drab, low-ceilinged affairs, but when Maxwell returned to the Geneva conference that year, he rented a house in nearby Collonge-Bellerive, a picturesque town on the lakeshore, where he entertained guests at parties with booze, cigars and sailboat trips. Scientists had never seen anything like him. “He always said we don’t compete on sales, we compete on authors,” Albert Henderson, a former deputy director at Pergamon, told me. “We would attend conferences specifically looking to recruit editors for new journals.” There are tales of parties on the roof of the Athens Hilton, of gifts of Concorde flights, of scientists being put on a chartered boat tour of the Greek islands to plan their new journal.

By 1959, Pergamon was publishing 40 journals; six years later it would publish 150. This put Maxwell well ahead of the competition. (In 1959, Pergamon’s rival, Elsevier, had just 10 English-language journals, and it would take the company another decade to reach 50.) By 1960, Maxwell had taken to being driven in a chauffeured Rolls-Royce, and moved his home and the Pergamon operation from London to the palatial Headington Hill Hall estate in Oxford, which was also home to the British book publishing house Blackwell’s.

Scientific societies, such as the British Society of Rheology, seeing the writing on the wall, even began letting Pergamon take over their journals for a small regular fee. Leslie Iversen, former editor at the Journal of Neurochemistry, recalls being wooed with lavish dinners at Maxwell’s estate. “He was very impressive, this big entrepreneur,” said Iversen. “We would get dinner and fine wine, and at the end he would present us a cheque – a few thousand pounds for the society. It was more money than us poor scientists had ever seen.”

Maxwell insisted on grand titles – “International Journal of” was a favourite prefix. Peter Ashby, a former vice president at Pergamon, described this to me as a “PR trick”, but it also reflected a deep understanding of how science, and society’s attitude to science, had changed. Collaborating and getting your work seen on the international stage was becoming a new form of prestige for researchers, and in many cases Maxwell had the market cornered before anyone else realised it existed. When the Soviet Union launched Sputnik, the first man-made satellite, in 1957, western scientists scrambled to catch up on Russian space research, and were surprised to learn that Maxwell had already negotiated an exclusive English-language deal to publish the Russian Academy of Sciences’ journals earlier in the decade.

“He had interests in all of these places. I went to Japan, he had an American man running an office there by himself. I went to India, there was someone there,” said Ashby. And the international markets could be extremely lucrative. Ronald Suleski, who ran Pergamon’s Japanese office in the 1970s, told me that the Japanese scientific societies, desperate to get their work published in English, gave Maxwell the rights to their members’ results for free.

In a letter celebrating Pergamon’s 40th anniversary, Eiichi Kobayashi, director of Maruzen, Pergamon’s longtime Japanese distributor, recalled of Maxwell that “each time I have the pleasure of meeting him, I am reminded of F Scott Fitzgerald’s words that a millionaire is no ordinary man”.

The scientific article has essentially become the only way science is systematically represented in the world. (As Robert Kiley, head of digital services at the library of the Wellcome Trust, the world’s second-biggest private funder of biomedical research, puts it: “We spend a billion pounds a year, and we get back articles.”) It is the primary resource of our most respected realm of expertise. “Publishing is the expression of our work. A good idea, a conversation or correspondence, even from the most brilliant person in the world … doesn’t count for anything unless you have it published,” says Neal Young of the NIH. If you control access to the scientific literature, it is, to all intents and purposes, like controlling science.

Maxwell’s success was built on an insight into the nature of scientific journals that would take others years to understand and replicate. While his competitors groused about him diluting the market, Maxwell knew that there was, in fact, no limit to the market. Creating The Journal of Nuclear Energy didn’t take business away from rival publisher North Holland’s journal Nuclear Physics. Scientific articles are about unique discoveries: one article cannot substitute for another. If a serious new journal appeared, scientists would simply request that their university library subscribe to that one as well. If Maxwell was creating three times as many journals as his competition, he would make three times more money.

The only potential limit was a slow-down in government funding, but there was little sign of that happening. In the 1960s, Kennedy bankrolled the space programme, and at the outset of the 1970s Nixon declared a “war on cancer”, while at the same time the British government developed its own nuclear programme with American aid. No matter the political climate, science was buoyed by great swells of government money.


  Robert Maxwell in 1985. Photograph: Terry O'Neill/Hulton/Getty

In its early days, Pergamon had been at the centre of fierce debates about the ethics of allowing commercial interests into the supposedly disinterested and profit-shunning world of science. In a 1988 letter commemorating the 40th anniversary of Pergamon, John Coales of Cambridge University noted that initially many of his friends “considered [Maxwell] the greatest villain yet unhung”.

But by the end of the 1960s, commercial publishing was considered the status quo, and publishers were seen as a necessary partner in the advancement of science. Pergamon helped turbocharge the field’s great expansion by speeding up the publication process and presenting it in a more stylish package. Scientists’ concerns about signing away their copyright were overwhelmed by the convenience of dealing with Pergamon, the shine it gave their work, and the force of Maxwell’s personality. Scientists, it seemed, were largely happy with the wolf they had let in the door.

“He was a bully, but I quite liked him,” says Denis Noble, a physiologist at Oxford University and the editor of the journal Progress in Biophysics & Molecular Biology. Occasionally, Maxwell would call Noble to his house for a meeting. “Often there would be a party going on, a nice musical ensemble, there was no barrier between his work and personal life,” Noble says. Maxwell would then proceed to alternately browbeat and charm him into splitting the biannual journal into a monthly or bimonthly publication, which would lead to an attendant increase in subscription payments.

In the end, though, Maxwell would nearly always defer to the scientists’ wishes, and scientists came to appreciate his patronly persona. “I have to confess that, quickly realising his predatory and entrepreneurial ambitions, I nevertheless took a great liking to him,” Arthur Barrett, then editor of the journal Vacuum, wrote in a 1988 piece about the publication’s early years. And the feeling was mutual. Maxwell doted on his relationships with famous scientists, who were treated with uncharacteristic deference. “He realised early on that the scientists were vitally important. He would do whatever they wanted. It drove the rest of the staff crazy,” Richard Coleman, who worked in journal production at Pergamon in the late 1960s, told me. When Pergamon was the target of a hostile takeover attempt, a 1973 Guardian article reported that journal editors threatened “to desert” rather than work for another chairman.

Maxwell had transformed the business of publishing, but the day-to-day work of science remained unchanged. Scientists still largely took their work to whichever journal was the best fit for their research area – and Maxwell was happy to publish any and all research that his editors deemed sufficiently rigorous. In the mid-1970s, though, publishers began to meddle with the practice of science itself, starting down a path that would lock scientists’ careers into the publishing system, and impose the business’s own standards on the direction of research. One journal became the symbol of this transformation.

“At the start of my career, nobody took much notice of where you published, and then everything changed in 1974 with Cell,” Randy Schekman, the Berkeley molecular biologist and Nobel prize winner, told me. Cell (now owned by Elsevier) was a journal started by Massachusetts Institute of Technology (MIT) to showcase the newly ascendant field of molecular biology. It was edited by a young biologist named Ben Lewin, who approached his work with an intense, almost literary bent. Lewin prized long, rigorous papers that answered big questions – often representing years of research that would have yielded multiple papers in other venues – and, breaking with the idea that journals were passive instruments to communicate science, he rejected far more papers than he published.

What he created was a venue for scientific blockbusters, and scientists began shaping their work on his terms. “Lewin was clever. He realised scientists are very vain, and wanted to be part of this selective members club; Cell was ‘it’, and you had to get your paper in there,” Schekman said. “I was subject to this kind of pressure, too.” He ended up publishing some of his Nobel-cited work in Cell.

Suddenly, where you published became immensely important. Other editors took a similarly activist approach in the hopes of replicating Cell’s success. Publishers also adopted a metric called “impact factor,” invented in the 1960s by Eugene Garfield, a librarian and linguist, as a rough calculation of how often papers in a given journal are cited in other papers. For publishers, it became a way to rank and advertise the scientific reach of their products. The new-look journals, with their emphasis on big results, shot to the top of these new rankings, and scientists who published in “high-impact” journals were rewarded with jobs and funding. Almost overnight, a new currency of prestige had been created in the scientific world. (Garfield later referred to his creation as “like nuclear energy … a mixed blessing”.)
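For concreteness, the standard two-year impact factor is a simple ratio: citations received this year to the papers a journal published in the previous two years, divided by the number of those papers. The sketch below uses a hypothetical journal, not figures from the article.

```python
# The standard two-year impact factor, with hypothetical figures
# (not data from the article).

def impact_factor(citations_this_year, papers_previous_two_years):
    """Citations received this year to papers the journal published in the
    previous two years, divided by the number of those papers."""
    return citations_this_year / papers_previous_two_years

# Hypothetical journal: 480 papers published across 2015-16,
# cited 1,920 times during 2017 -> impact factor of 4.0.
print(impact_factor(1920, 480))
```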

It is difficult to overstate how much power a journal editor now had to shape a scientist’s career and the direction of science itself. “Young people tell me all the time, ‘If I don’t publish in CNS [a common acronym for Cell/Nature/Science, the most prestigious journals in biology], I won’t get a job’,” says Schekman. He compared the pursuit of high-impact publications to an incentive system as rotten as banking bonuses. “They have a very big influence on where science goes,” he said.

And so science became a strange co-production between scientists and journal editors, with the former increasingly pursuing discoveries that would impress the latter. These days, given a choice of projects, a scientist will almost always reject both the prosaic work of confirming or disproving past studies, and the decades-long pursuit of a risky “moonshot”, in favour of a middle ground: a topic that is popular with editors and likely to yield regular publications. “Academics are incentivised to produce research that caters to these demands,” said the biologist and Nobel laureate Sydney Brenner in a 2014 interview, calling the system “corrupt.”

Maxwell understood the way journals were now the kingmakers of science. But his main concern was still expansion, and he still had a keen vision of where science was heading, and which new fields of study he could colonise. Richard Charkin, the former CEO of the British publisher Macmillan, who was an editor at Pergamon in 1974, recalls Maxwell waving Watson and Crick’s one-page report on the structure of DNA at an editorial meeting and declaring that the future was in life science and its multitude of tiny questions, each of which could have its own publication. “I think we launched a hundred journals that year,” Charkin said. “I mean, Jesus wept.”

Pergamon also branched into social sciences and psychology. A series of journals prefixed “Computers and” suggest that Maxwell spotted the growing importance of digital technology. “It was endless,” Peter Ashby told me. “Oxford Polytechnic [now Oxford Brookes University] started a department of hospitality with a chef. We had to go find out who the head of the department was, make him start a journal. And boom – International Journal of Hospitality Management.”

By the late 1970s, Maxwell was also dealing with a more crowded market. “I was at Oxford University Press at that time,” Charkin told me. “We sat up and said, ‘Hell, these journals make a lot of money!’” Meanwhile, in the Netherlands, Elsevier had begun expanding its English-language journals, absorbing the domestic competition in a series of acquisitions and growing at a rate of 35 titles a year.

As Maxwell had predicted, competition didn’t drive down prices. Between 1975 and 1985, the average price of a journal doubled. The New York Times reported that in 1984 it cost $2,500 to subscribe to the journal Brain Research; in 1988, it cost more than $5,000. That same year, Harvard Library overran its research journal budget by half a million dollars.
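A quick back-of-the-envelope check of what those figures imply (my own arithmetic, not the article’s): a doubling over a decade is roughly 7% a year, while Brain Research doubling in four years works out to nearly 19% a year.

```python
# Back-of-the-envelope check of the growth rates implied above
# (illustrative arithmetic, not figures from the article).

def annual_growth(start, end, years):
    """Compound annual growth rate."""
    return (end / start) ** (1 / years) - 1

print(f"{annual_growth(1, 2, 10):.1%} per year")       # prices doubling 1975-85 -> ~7.2%
print(f"{annual_growth(2500, 5000, 4):.1%} per year")  # Brain Research 1984-88  -> ~18.9%
```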

Scientists occasionally questioned the fairness of this hugely profitable business to which they supplied their work for free, but it was university librarians who first realised the trap in the market Maxwell had created. The librarians used university funds to buy journals on behalf of scientists. Maxwell was well aware of this. “Scientists are not as price-conscious as other professionals, mainly because they are not spending their own money,” he told his publication Global Business in a 1988 interview. And since there was no way to swap one journal for another, cheaper one, the result was, Maxwell continued, “a perpetual financing machine”. Librarians were locked into a series of thousands of tiny monopolies. There were now more than a million scientific articles being published a year, and they had to buy all of them at whatever price the publishers wanted.

From a business perspective, it was a total victory for Maxwell. Libraries were a captive market, and journals had improbably installed themselves as the gatekeepers of scientific prestige – meaning that scientists couldn’t simply abandon them if a new method of sharing results came along. “Were we not so naive, we would long ago have recognised our true position: that we are sitting on top of fat piles of money which clever people on all sides are trying to transfer on to their piles,” wrote the University of Michigan librarian Robert Houbeck in a trade journal in 1988. Three years earlier, despite scientific funding suffering its first multi-year dip in decades, Pergamon had reported a 47% profit margin.

Maxwell wouldn’t be around to tend his victorious empire. The acquisitive nature that drove Pergamon’s success also led him to make a surfeit of flashy but questionable investments, including the football teams Oxford United and Derby County FC, television stations around the world, and, in 1984, the UK’s Mirror newspaper group, where he began to spend more and more of his time. In 1991, to finance his impending purchase of the New York Daily News, Maxwell sold Pergamon to its quiet Dutch competitor Elsevier for £440m (£919m today).

Many former Pergamon employees separately told me that they knew it was all over for Maxwell when he made the Elsevier deal, because Pergamon was the company he truly loved. Later that year, he became mired in a series of scandals over his mounting debts, shady accounting practices, and an explosive accusation by the American journalist Seymour Hersh that he was an Israeli spy with links to arms traders. On 5 November 1991, Maxwell was found drowned off his yacht in the Canary Islands. The world was stunned, and by the next day the Mirror’s tabloid rival Sun was posing the question on everyone’s mind: “DID HE FALL … DID HE JUMP?”, its headline blared. (A third explanation, that he was pushed, would also come up.)

The story dominated the British press for months, with suspicion growing that Maxwell had committed suicide, after an investigation revealed that he had stolen more than £400m from the Mirror pension fund to service his debts. (In December 1991, a Spanish coroner’s report ruled the death accidental.) The speculation was endless: in 2003, the journalists Gordon Thomas and Martin Dillon published a book alleging that Maxwell was assassinated by Mossad to hide his spying activities. By that time, Maxwell was long gone, but the business he had started continued to thrive in new hands, reaching new levels of profit and global power over the coming decades.

If Maxwell’s genius was in expansion, Elsevier’s was in consolidation. With the purchase of Pergamon’s 400-strong catalogue, Elsevier now controlled more than 1,000 scientific journals, making it by far the largest scientific publisher in the world.

At the time of the merger, Charkin, the former Macmillan CEO, recalls advising Pierre Vinken, the CEO of Elsevier, that Pergamon was a mature business, and that Elsevier had overpaid for it. But Vinken had no doubts, Charkin recalled: “He said, ‘You have no idea how profitable these journals are once you stop doing anything. When you’re building a journal, you spend time getting good editorial boards, you treat them well, you give them dinners. Then you market the thing and your salespeople go out there to sell subscriptions, which is slow and tough, and you try to make the journal as good as possible. That’s what happened at Pergamon. And then we buy it and we stop doing all that stuff and then the cash just pours out and you wouldn’t believe how wonderful it is.’ He was right and I was wrong.”

By 1994, three years after acquiring Pergamon, Elsevier had raised its prices by 50%. Universities complained that their budgets were stretched to breaking point – the US-based Publishers Weekly reported librarians referring to a “doomsday machine” in their industry – and, for the first time, they began cancelling subscriptions to less popular journals.


At the time, Elsevier’s behaviour seemed suicidal. It was angering its customers just as the internet was arriving to offer them a free alternative. A 1995 Forbes article described scientists sharing results over early web servers, and asked if Elsevier was to be “The Internet’s First Victim”. But, as always, the publishers understood the market better than the academics.

In 1998, Elsevier rolled out its plan for the internet age, which would come to be called “The Big Deal”. It offered electronic access to bundles of hundreds of journals at a time: a university would pay a set fee each year – according to a report based on freedom of information requests, Cornell University’s 2009 tab was just short of $2m – and any student or professor could download any journal they wanted through Elsevier’s website. Universities signed up en masse.

Those predicting Elsevier’s downfall had assumed scientists experimenting with sharing their work for free online could slowly outcompete Elsevier’s titles by replacing them one at a time. In response, Elsevier’s Big Deal fused Maxwell’s thousands of tiny monopolies into one so large that, like a basic resource – say water, or power – it was impossible for universities to do without. Pay, and the scientific lights stayed on, but refuse, and up to a quarter of the scientific literature would go dark at any one institution. It concentrated immense power in the hands of the largest publishers, and Elsevier’s profits began another steep rise that would lead them into the billions by the 2010s. In 2015, a Financial Times article anointed Elsevier “the business the internet could not kill”.

Publishers are now wound so tightly around the various organs of the scientific body that no single effort has been able to dislodge them. In a 2015 report, an information scientist from the University of Montreal, Vincent Larivière, showed that Elsevier owned 24% of the scientific journal market, while Maxwell’s old partners Springer, and his crosstown rivals Wiley-Blackwell, controlled about another 12% each. These three companies accounted for half the market. (An Elsevier representative familiar with the report told me that by their own estimate they publish only 16% of the scientific literature.)

“Despite my giving sermons all over the world on this topic, it seems journals hold sway even more prominently than before,” Randy Schekman told me. It is that influence, more than the profits that drove the system’s expansion, that most frustrates scientists today.

Elsevier says its primary goal is to facilitate the work of scientists and other researchers. An Elsevier rep noted that the company publishes 1.5m papers a year; 14 million scientists entrust Elsevier to publish their results, and 800,000 scientists donate their time to help them with editing and peer-review. “We help researchers be more productive and efficient,” Alicia Wise, senior vice president of global strategic networks, told me. “And that’s a win for research institutions, and for research funders like governments.”

On the question of why so many scientists are so critical of journal publishers, Tom Reller, vice president of corporate relations at Elsevier, said: “It’s not for us to talk about other people’s motivations. We look at the numbers [of scientists who trust their results to Elsevier] and that suggests we are doing a good job.” Asked about criticisms of Elsevier’s business model, Reller said in an email that these criticisms overlooked “all the things that publishers do to add value – above and beyond the contributions that public-sector funding brings”. That, he said, is what they were charging for.

In a sense, it is not any one publisher’s fault that the scientific world seems to bend to the industry’s gravitational pull. When governments including those of China and Mexico offer financial bonuses for publishing in high-impact journals, they are not responding to a demand by any specific publisher, but following the rewards of an enormously complex system that has to reconcile the utopian ideals of science with the commercial goals of the publishers that dominate it. (“We scientists have not given a lot of thought to the water we’re swimming in,” Neal Young told me.)

Since the early 2000s, scientists have championed an alternative to subscription publishing called “open access”. This solves the difficulty of balancing scientific and commercial imperatives by simply removing the commercial element. In practice, this usually takes the form of online journals, to which scientists pay an upfront fee to cover editing costs, and which then make the work freely available to anyone in perpetuity. But despite the backing of some of the biggest funding agencies in the world, including the Gates Foundation and the Wellcome Trust, only about a quarter of scientific papers are made freely available at the time of their publication.

The idea that scientific research should be freely available for anyone to use is a sharp departure from, even a threat to, the current system – which relies on publishers’ ability to restrict access to the scientific literature in order to maintain its immense profitability. In recent years, the most radical opposition to the status quo has coalesced around a controversial website called Sci-Hub – a sort of Napster for science that allows anyone to download scientific papers for free. Its creator, Alexandra Elbakyan, a Kazakhstani, is in hiding, facing charges of hacking and copyright infringement in the US. Elsevier recently obtained a $15m injunction (the maximum allowable amount) against her.

Elbakyan is an unabashed utopian. “Science should belong to scientists and not the publishers,” she told me in an email. In a letter to the court, she cited Article 27 of the UN’s Universal Declaration of Human Rights, asserting the right “to share in scientific advancement and its benefits”.

Whatever the fate of Sci-Hub, it seems that frustration with the current system is growing. But history shows that betting against science publishers is a risky move. After all, back in 1988, Maxwell predicted that in the future there would only be a handful of immensely powerful publishing companies left, and that they would ply their trade in an electronic age with no printing costs, leading to almost “pure profit”. That sounds a lot like the world we live in now.

Friday 22 May 2015

Seven common myths about meditation


Julia Roberts learns how to meditate in the film Eat, Pray, Love. Photograph: Allstar/COLUMBIA PICTURES/Sportsphoto Ltd./Allstar

Catherine Wikholm in The Guardian

Meditation is becoming increasingly popular, and in recent years there have been calls for mindfulness (a meditative practice with Buddhist roots) to be more widely available on the NHS. Often promoted as a sure-fire way to reduce stress, it’s also being increasingly offered in schools, universities and businesses.

For the secularised mind, meditation fills a spiritual vacuum; it brings the hope of becoming a better, happier individual in a more peaceful world. However, the fact that meditation was primarily designed not to make us happier, but to destroy our sense of individual self – who we feel and think we are most of the time – is often overlooked in the science and media stories about it, which focus almost exclusively on the benefits practitioners can expect.

If you’re considering it, here are seven common beliefs about meditation that are not supported by scientific evidence.

Myth 1: Meditation never has adverse or negative effects. It will change you for the better (and only the better)

Fact 1: It’s easy to see why this myth might spring up. After all, sitting in silence and focusing on your breathing would seem like a fairly innocuous activity with little potential for harm. But when you consider how many of us, when worried or facing difficult circumstances, cope by keeping ourselves very busy and with little time to think, it isn’t that much of a surprise to find that sitting without distractions, with only ourselves, might lead to disturbing emotions rising to the surface.

However, many scientists have turned a blind eye to the potential unexpected or harmful consequences of meditation. With Transcendental Meditation, this is probably because many of those who have researched it have also been personally involved in the movement; with mindfulness, the reasons are less clear, because it is presented as a secular technique. Nevertheless, there is emerging scientific evidence from case studies, surveys of meditators’ experience and historical studies to show that meditation can be associated with stress, negative effects and mental health problems. For example, one study found that mindfulness meditation led to increased cortisol, a biological marker of stress, despite the fact that participants subjectively reported feeling less stressed.


Myth 2: Meditation can benefit everyone


Fact 2: The idea that meditation is a cure-all for everyone lacks a scientific basis. “One man’s meat is another man’s poison,” the psychologist Arnold Lazarus reminded us in his writings about meditation. Although there has been relatively little research into how individual circumstances – such as age, gender, or personality type – might play a role in the value of meditation, there is a growing awareness that meditation works differently for each individual.

For example, it may provide an effective stress-relief technique for individuals facing serious problems (such as being unemployed), but have little value for low-stressed individuals. Or it may benefit depressed individuals who suffered trauma and abuse in their childhood, but not other depressed people. There is also some evidence that – along with yoga – it can be of particular use to prisoners, for whom it improves psychological wellbeing and, perhaps more importantly, encourages better control over impulsivity. We shouldn’t be surprised about meditation having variable benefits from person to person. After all, the practice wasn’t intended to make us happier or less stressed, but to assist us in diving deep within and challenging who we believe we are.


Myth 3: If everyone meditated the world would be a much better place

Fact 3: All global religions share the belief that following their particular practices and ideals will make us better individuals. So far, there is no clear scientific evidence that meditation is more effective at making us, for example, more compassionate than other spiritual or psychological practices. Research on this topic has serious methodological and theoretical limitations and biases. Most of the studies have no adequate control groups and generally fail to assess the expectations of participants (ie, if we expect to benefit from something, we may be more likely to report benefits).


Myth 4: If you’re seeking personal change and growth, meditating is as effective as – or more effective than – having therapy

Fact 4: There is very little evidence that an eight-week mindfulness-based group programme has the same benefits as being in conventional psychological therapy – most studies compare mindfulness to “treatment as usual” (such as seeing your GP), rather than one-to-one therapy. Although mindfulness interventions are group-based and most psychological therapy is conducted on a one-to-one basis, both approaches involve developing an increased awareness of our thoughts, emotions and way of relating to others. But the levels of awareness probably differ. A therapist can encourage us to examine conscious or unconscious patterns within ourselves, whereas these might be difficult to access in a one-size-fits-all group course, or if we were meditating on our own.


Myth 5: Meditation produces a unique state of consciousness that we can measure scientifically

Teachers and pupils practise meditation techniques at Bethnal Green Academy. Photograph: Sean Smith for the Guardian

Fact 5: Meditation produces states of consciousness that we can indeed measure using various scientific instruments. However, the overall evidence is that these states are not physiologically unique. Furthermore, although different kinds of meditation may have diverse effects on consciousness (and on the brain), there is no scientific consensus about what these effects are.

Myth 6: We can practise meditation as a purely scientific technique with no religious or spiritual leanings

Fact 6: In principle, it’s perfectly possible to meditate and be uninterested in the spiritual background to the practice. However, research shows that meditation leads us to become more spiritual, and that this increase in spirituality is partly responsible for the practice’s positive effects. So, even if we set out to ignore meditation’s spiritual roots, those roots may nonetheless envelop us, to a greater or lesser degree. Overall, it is unclear whether secular models of mindfulness meditation are fully secular.

Myth 7: Science has unequivocally shown how meditation can change us and why

Fact 7: Meta-analyses show there is moderate evidence that meditation affects us in various ways, such as increasing positive emotions and reducing anxiety. However, it is less clear how powerful and long-lasting these changes are.

Some studies show that meditating can have a greater impact than physical relaxation, although other research using a placebo meditation contradicts this finding. We need better studies but, perhaps as important, we also need models that explain how meditation works. For example, with mindfulness-based cognitive therapy (MBCT), we still can’t be sure of the “active” ingredient. Is it the meditation itself that causes positive effects, or is it the fact that the participant learns to step back and become aware of his or her thoughts and feelings in a supportive group environment?

There simply is no cohesive, overarching attempt to describe the various psychobiological processes that meditation sets in motion. Unless we can clearly map the effects of meditation – both the positive and the negative – and identify the processes underpinning the practice, our scientific understanding of meditation is precarious and can easily lead to exaggeration and misinterpretation.

Tuesday 3 March 2015

What scares the new atheists: The vocal fervour of today’s missionary atheism conceals a panic that religion is not only refusing to decline – but in fact flourishing

John Gray in The Guardian

In 1929, the Thinker’s Library, a series established by the Rationalist Press Association to advance secular thinking and counter the influence of religion in Britain, published an English translation of the German biologist Ernst Haeckel’s 1899 book The Riddle of the Universe. Celebrated as “the German Darwin”, Haeckel was one of the most influential public intellectuals of the late nineteenth and early twentieth century; The Riddle of the Universe sold half a million copies in Germany alone, and was translated into dozens of other languages. Hostile to Jewish and Christian traditions, Haeckel devised his own “religion of science” called Monism, which incorporated an anthropology that divided the human species into a hierarchy of racial groups. Though he died in 1919, before the Nazi Party had been founded, his ideas, and widespread influence in Germany, unquestionably helped to create an intellectual climate in which policies of racial slavery and genocide were able to claim a basis in science.

The Thinker’s Library also featured works by Julian Huxley, grandson of TH Huxley, the Victorian biologist who was known as “Darwin’s bulldog” for his fierce defence of evolutionary theory. A proponent of “evolutionary humanism”, which he described as “religion without revelation”, Julian Huxley shared some of Haeckel’s views, including advocacy of eugenics. In 1931, Huxley wrote that there was “a certain amount of evidence that the negro is an earlier product of human evolution than the Mongolian or the European, and as such might be expected to have advanced less, both in body and mind”. Statements of this kind were then commonplace: there were many in the secular intelligentsia – including HG Wells, also a contributor to the Thinker’s Library – who looked forward to a time when “backward” peoples would be remade in a western mould or else vanish from the world.

But by the late 1930s, these views were becoming suspect: already in 1935, Huxley admitted that the concept of race was “hardly definable in scientific terms”. While he never renounced eugenics, little was heard from him on the subject after the second world war. The science that pronounced western people superior was bogus – but what shifted Huxley’s views wasn’t any scientific revelation: it was the rise of Nazism, which revealed what had been done under the aegis of Haeckel-style racism.

It has often been observed that Christianity follows changing moral fashions, all the while believing that it stands apart from the world. The same might be said, with more justice, of the prevalent version of atheism. If an earlier generation of unbelievers shared the racial prejudices of their time and elevated them to the status of scientific truths, evangelical atheists do the same with the liberal values to which western societies subscribe today – while looking with contempt upon “backward” cultures that have not abandoned religion. The racial theories promoted by atheists in the past have been consigned to the memory hole – and today’s most influential atheists would no more endorse racist biology than they would be seen following the guidance of an astrologer. But they have not renounced the conviction that human values must be based in science; now it is liberal values which receive that accolade. There are disputes, sometimes bitter, over how to define and interpret those values, but their supremacy is hardly ever questioned. For 21st century atheist missionaries, being liberal and scientific in outlook are one and the same.

It’s a reassuringly simple equation. In fact there are no reliable connections – whether in logic or history – between atheism, science and liberal values. When organised as a movement and backed by the power of the state, atheist ideologies have been an integral part of despotic regimes that also claimed to be based in science, such as the former Soviet Union. Many rival moralities and political systems – most of them, to date, illiberal – have attempted to assert a basis in science. All have been fraudulent and ephemeral. Yet the attempt continues in atheist movements today, which claim that liberal values can be scientifically validated and are therefore humanly universal.

Fortunately, this type of atheism isn’t the only one that has ever existed. There have been many modern atheisms, some of them more cogent and more intellectually liberating than the type that makes so much noise today. Campaigning atheism is a missionary enterprise, aiming to convert humankind to a particular version of unbelief; but not all atheists have been interested in propagating a new gospel, and some have been friendly to traditional faiths.

Evangelical atheists today view liberal values as part of an emerging global civilisation; but not all atheists, even when they have been committed liberals, have shared this comforting conviction. Atheism comes in many irreducibly different forms, among which the variety being promoted at the present time looks strikingly banal and parochial.

In itself, atheism is an entirely negative position. In pagan Rome, “atheist” (from the Greek atheos) meant anyone who refused to worship the established pantheon of deities. The term was applied to Christians, who not only refused to worship the gods of the pantheon but demanded exclusive worship of their own god. Many non-western religions contain no conception of a creator-god – Buddhism and Taoism, in some of their forms, are atheist religions of this kind – and many religions have had no interest in proselytising. In modern western contexts, however, atheism and rejection of monotheism are practically interchangeable. Roughly speaking, an atheist is anyone who has no use for the concept of God – the idea of a divine mind, which has created humankind and embodies in a perfect form the values that human beings cherish and strive to realise. Many who are atheists in this sense (including myself) regard the evangelical atheism that has emerged over the past few decades with bemusement. Why make a fuss over an idea that has no sense for you? There are untold multitudes who have no interest in waging war on beliefs that mean nothing to them. Throughout history, many have been happy to live their lives without bothering about ultimate questions. This sort of atheism is one of the perennial responses to the experience of being human.

As an organised movement, atheism is never non-committal in this way. It always goes with an alternative belief-system – typically, a set of ideas that serves to show the modern west is the high point of human development. In Europe from the late 19th century until the second world war, this was a version of evolutionary theory that marked out western peoples as being the most highly evolved. Around the time Haeckel was promoting his racial theories, a different theory of western superiority was developed by Marx. While condemning liberal societies and prophesying their doom, Marx viewed them as the high point of human development to date. (This is why he praised British colonialism in India as an essentially progressive development.) If Marx had serious reservations about Darwinism – and he did – it was because Darwin’s theory did not frame evolution as a progressive process.

The predominant varieties of atheist thinking, in the 19th and early 20th centuries, aimed to show that the secular west is the model for a universal civilisation. The missionary atheism of the present time is a replay of this theme; but the west is in retreat today, and beneath the fervour with which this atheism assaults religion there is an unmistakable mood of fear and anxiety. To a significant extent, the new atheism is the expression of a liberal moral panic.



Sam Harris, the American neuroscientist and author of The End of Faith: Religion, Terror and the Future of Reason (2004) and The Moral Landscape: How Science Can Determine Moral Values (2010), who was arguably the first of the “new atheists”, illustrates this point. Following many earlier atheist ideologues, he wants a “scientific morality”; but whereas earlier exponents of this sort of atheism used science to prop up values everyone would now agree were illiberal, Harris takes for granted that what he calls a “science of good and evil” cannot be other than liberal in content. (Not everyone will agree with Harris’s account of liberal values, which appears to sanction the practice of torture: “Given what many believe are the exigencies of our war on terrorism,” he wrote in 2004, “the practice of torture, in certain circumstances, would seem to be not only permissible but necessary.”)

Harris’s militancy in asserting these values seems to be largely a reaction to Islamist terrorism. For secular liberals of his generation, the shock of the 11 September attacks went beyond the atrocious loss of life they entailed. The effect of the attacks was to place a question mark over the belief that their values were spreading – slowly, and at times fitfully, but in the long run irresistibly – throughout the world. As society became ever more reliant on science, they had assumed, religion would inexorably decline. No doubt the process would be bumpy, and pockets of irrationality would linger on the margins of modern life; but religion would dwindle away as a factor in human conflict. The road would be long and winding. But the grand march of secular reason would continue, with more and more societies joining the modern west in marginalising religion. Someday, religious belief would be no more important than personal hobbies or ethnic cuisines.

Today, it’s clear that no grand march is under way. The rise of violent jihadism is only the most obvious example of a rejection of secular life. Jihadist thinking comes in numerous varieties, mixing strands from 20th century ideologies, such as Nazism and Leninism, with elements deriving from the 18th century Wahhabist Islamic fundamentalist movement. What all Islamist movements have in common is a categorical rejection of any secular realm. But the ongoing reversal in secularisation is not a peculiarly Islamic phenomenon.

The resurgence of religion is a worldwide development. Russian Orthodoxy is stronger than it has been for over a century, while China is the scene of a reawakening of its indigenous faiths and of underground movements that could make it the largest Christian country in the world by the end of this century. Despite tentative shifts in opinion that have been hailed as evidence it is becoming less pious, the US remains massively and pervasively religious – it’s inconceivable that a professed unbeliever could become president, for example.

For secular thinkers, the continuing vitality of religion calls into question the belief that history underpins their values. To be sure, there is disagreement as to the nature of these values. But pretty well all secular thinkers now take for granted that modern societies must in the end converge on some version of liberalism. Never well founded, this assumption is today clearly unreasonable. So, not for the first time, secular thinkers look to science for a foundation for their values.

It’s probably just as well that the current generation of atheists seems to know so little of the longer history of atheist movements. When they assert that science can bridge fact and value, they overlook the many incompatible value-systems that have been defended in this way. There is no more reason to think science can determine human values today than there was at the time of Haeckel or Huxley. None of the divergent values that atheists have from time to time promoted has any essential connection with atheism, or with science. How could any increase in scientific knowledge validate values such as human equality and personal autonomy? The source of these values is not science. In fact, as the most widely-read atheist thinker of all time argued, these quintessential liberal values have their origins in monotheism.

* * *

The new atheists rarely mention Friedrich Nietzsche, and when they do it is usually to dismiss him. This can’t be because Nietzsche’s ideas are said to have inspired the Nazi cult of racial inequality – an unlikely tale, given that the Nazis claimed their racism was based in science. The reason Nietzsche has been excluded from the mainstream of contemporary atheist thinking is that he exposed the problem atheism has with morality. It’s not that atheists can’t be moral – the subject of so many mawkish debates. The question is which morality an atheist should serve.

It’s a familiar question in continental Europe, where a number of thinkers have explored the prospects of a “difficult atheism” that doesn’t take liberal values for granted. It can’t be said that anything much has come from this effort. Georges Bataille’s postmodern project of “atheology” didn’t produce the godless religion he originally intended, or any coherent type of moral thinking. But at least Bataille, and other thinkers like him, understood that when monotheism has been left behind morality can’t go on as before. Among other things, the universal claims of liberal morality become highly questionable.



It’s impossible to read much contemporary polemic against religion without the impression that for the “new atheists” the world would be a better place if Jewish and Christian monotheism had never existed. If only the world wasn’t plagued by these troublesome God-botherers, they are always lamenting, liberal values would be so much more secure. Awkwardly for these atheists, Nietzsche understood that modern liberalism was a secular incarnation of these religious traditions. As a classical scholar, he recognised that a mystical Greek faith in reason had shaped the cultural matrix from which modern liberalism emerged. Some ancient Stoics defended the ideal of a cosmopolitan society; but this was based in the belief that humans share in the Logos, an immortal principle of rationality that was later absorbed into the conception of God with which we are familiar. Nietzsche was clear that the chief sources of liberalism were in Jewish and Christian theism: that is why he was so bitterly hostile to these religions. He was an atheist in large part because he rejected liberal values.

To be sure, evangelical unbelievers adamantly deny that liberalism needs any support from theism. If they are philosophers, they will wheel out their rusty intellectual equipment and assert that those who think liberalism relies on ideas and beliefs inherited from religion are guilty of a genetic fallacy. Canonical liberal thinkers such as John Locke and Immanuel Kant may have been steeped in theism; but ideas are not falsified because they originate in errors. The far-reaching claims these thinkers have made for liberal values can be detached from their theistic beginnings; a liberal morality that applies to all human beings can be formulated without any mention of religion. Or so we are continually being told. The trouble is that it’s hard to make any sense of the idea of a universal morality without invoking an understanding of what it is to be human that has been borrowed from theism. The belief that the human species is a moral agent struggling to realise its inherent possibilities – the narrative of redemption that sustains secular humanists everywhere – is a hollowed-out version of a theistic myth. The idea that the human species is striving to achieve any purpose or goal – a universal state of freedom or justice, say – presupposes a pre-Darwinian, teleological way of thinking that has no place in science. Empirically speaking, there is no such collective human agent, only different human beings with conflicting goals and values. If you think of morality in scientific terms, as part of the behaviour of the human animal, you find that humans don’t live according to iterations of a single universal code. Instead, they have fashioned many ways of life. A plurality of moralities is as natural for the human animal as the variety of languages.

At this point, the dread spectre of relativism tends to be raised. Doesn’t talk of plural moralities mean there can be no truth in ethics? Well, anyone who wants their values secured by something beyond the capricious human world had better join an old-fashioned religion. If you set aside any view of humankind that is borrowed from monotheism, you have to deal with human beings as you find them, with their perpetually warring values.

This isn’t the relativism celebrated by postmodernists, which holds that human values are merely cultural constructions. Humans are like other animals in having a definite nature, which shapes their experiences whether they like it or not. No one benefits from being tortured or persecuted on account of their religion or sexuality. Being chronically poor is rarely, if ever, a positive experience. Being at risk of violent death is bad for human beings whatever their culture. Such truisms could be multiplied. Universal human values can be understood as something like moral facts, marking out goods and evils that are generically human. Using these universal values, it may be possible to define a minimum standard of civilised life that every society should meet; but this minimum won’t be the liberal values of the present time turned into universal principles.

Universal values don’t add up to a universal morality. Such values are very often conflicting, and different societies resolve these conflicts in divergent ways. The Ottoman empire, during some of its history, was a haven of toleration for religious communities who were persecuted in Europe; but this pluralism did not extend to enabling individuals to move from one community to another, or to form new communities of choice, as would be required by a liberal ideal of personal autonomy. The Hapsburg empire was based on rejecting the liberal principle of national self-determination; but – possibly for that very reason – it was more protective of minorities than most of the states that succeeded it. Protecting universal values without honouring what are now seen as core liberal ideals, these archaic imperial regimes were more civilised than a great many states that exist today.

For many, regimes of this kind are imperfect examples of what all human beings secretly want – a world in which no one is unfree. The conviction that tyranny and persecution are aberrations in human affairs is at the heart of the liberal philosophy that prevails today. But this conviction is supported by faith more than evidence. Throughout history there have been large numbers who have been happy to relinquish their freedom as long as those they hate – gay people, Jews, immigrants and other minorities, for example – are deprived of freedom as well. Many have been ready to support tyranny and oppression. Billions of human beings have been hostile to liberal values, and there is no reason for thinking matters will be any different in future.

An older generation of liberal thinkers accepted this fact. As the late Stuart Hampshire put it:
“It is not only possible, but, on present evidence, probable that most conceptions of the good, and most ways of life, which are typical of commercial, liberal, industrialised societies will often seem altogether hateful to substantial minorities within these societies and even more hateful to most of the populations within traditional societies … As a liberal by philosophical conviction, I think I ought to expect to be hated, and to be found superficial and contemptible, by a large part of mankind.”

Today this is a forbidden thought. How could all of humankind not want to be as we imagine ourselves to be? To suggest that large numbers hate and despise values such as toleration and personal autonomy is, for many people nowadays, an intolerable slur on the species. This is, in fact, the quintessential illusion of the ruling liberalism: the belief that all human beings are born freedom-loving and peaceful and become anything else only as a result of oppressive conditioning. But there is no hidden liberal struggling to escape from within the killers of the Islamic State and Boko Haram, any more than there was in the torturers who served the Pol Pot regime. To be sure, these are extreme cases. But in the larger sweep of history, faith-based violence and persecution, secular and religious, are hardly uncommon – and they have been widely supported. It is peaceful coexistence and the practice of toleration that are exceptional.
* * *

Considering the alternatives that are on offer, liberal societies are well worth defending. But there is no reason for thinking these societies are the beginning of a species-wide secular civilisation of the kind of which evangelical atheists dream.

In ancient Greece and Rome, religion was not separate from the rest of human activity. Christianity was less tolerant than these pagan societies, but without it the secular societies of modern times would hardly have been possible. By adopting the distinction between what is owed to Caesar and what to God, Paul and Augustine – who turned the teaching of Jesus into a universal creed – opened the way for societies in which religion was no longer coextensive with life. Secular regimes come in many shapes, some liberal, others tyrannical. Some aim for a separation of church and state as in the US and France, while others – such as the Ataturkist regime that until recently ruled in Turkey – assert state control over religion. Whatever its form, a secular state is no guarantee of a secular culture. Britain has an established church, but despite that fact – or more likely because of it – religion has a smaller role in politics than in America and is less publicly divisive than it is in France.

There is no sign anywhere of religion fading away, but by no means all atheists have thought the disappearance of religion possible or desirable. Some of the most prominent – including the early 19th-century poet and philosopher Giacomo Leopardi, the philosopher Arthur Schopenhauer, the Austro-Hungarian philosopher and novelist Fritz Mauthner (who published a four-volume history of atheism in the early 1920s) and Sigmund Freud, to name a few – were all atheists who accepted the human value of religion. One thing these atheists had in common was a refreshing indifference to questions of belief. Mauthner – who is remembered today chiefly because of a dismissive one-line mention in Wittgenstein’s Tractatus – suggested that belief and unbelief were both expressions of a superstitious faith in language. For him, “humanity” was an apparition which melts away along with the departing Deity. Atheism was an experiment in living without taking human concepts as realities. Intriguingly, Mauthner saw parallels between this radical atheism and the tradition of negative theology in which nothing can be affirmed of God, and described the heretical medieval Christian mystic Meister Eckhart as being an atheist in this sense.

Above all, these unevangelical atheists accepted that religion is definitively human. Though not all human beings may attach great importance to them, every society contains practices that are recognisably religious. Why should religion be universal in this way? For atheist missionaries this is a decidedly awkward question. Invariably they claim to be followers of Darwin. Yet they never ask what evolutionary function this species-wide phenomenon serves. There is an irresolvable contradiction between viewing religion naturalistically – as a human adaptation to living in the world – and condemning it as a tissue of error and illusion. What if the upshot of scientific inquiry is that a need for illusion is built into the human mind? If religions are natural for humans and give value to their lives, why spend your life trying to persuade others to give them up?

The answer that will be given is that religion is implicated in many human evils. Of course this is true. Among other things, Christianity brought with it a type of sexual repression unknown in pagan times. Other religions have their own distinctive flaws. But the fault is not with religion, any more than science is to blame for the proliferation of weapons of mass destruction or medicine and psychology for the refinement of techniques of torture. The fault is in the intractable human animal. Like religion at its worst, contemporary atheism feeds the fantasy that human life can be remade by a conversion experience – in this case, conversion to unbelief.

Evangelical atheists at the present time are missionaries for their own values. If an earlier generation promoted the racial prejudices of their time as scientific truths, ours aims to give the illusions of contemporary liberalism a similar basis in science. It’s possible to envision different varieties of atheism developing – atheisms more like those of Freud, which didn’t replace God with a flattering image of humanity. But atheisms of this kind are unlikely to be popular. More than anything else, our unbelievers seek relief from the panic that grips them when they realise their values are rejected by much of humankind. What today’s freethinkers want is freedom from doubt, and the prevailing version of atheism is well suited to give it to them.