
Saturday 18 May 2013

How the Case for Austerity Has Crumbled


 
 
The Alchemists: Three Central Bankers and a World on Fire
by Neil Irwin
Penguin, 430 pp., $29.95                                                  
Austerity: The History of a Dangerous Idea
by Mark Blyth
Oxford University Press, 288 pp., $24.95                                                  
The Great Deformation: The Corruption of Capitalism in America
by David A. Stockman
PublicAffairs, 742 pp., $35.00                                                  
President Barack Obama and Representative Paul Ryan at a bipartisan meeting on health insurance reform, Washington, D.C., February 2010
In normal times, an arithmetic mistake in an economics paper would be a complete nonevent as far as the wider world was concerned. But in April 2013, the discovery of such a mistake—actually, a coding error in a spreadsheet, coupled with several other flaws in the analysis—not only became the talk of the economics profession, but made headlines. Looking back, we might even conclude that it changed the course of policy.
Why? Because the paper in question, “Growth in a Time of Debt,” by the Harvard economists Carmen Reinhart and Kenneth Rogoff, had acquired touchstone status in the debate over economic policy. Ever since the paper was first circulated, austerians—advocates of fiscal austerity, of immediate sharp cuts in government spending—had cited its alleged findings to defend their position and attack their critics. Again and again, suggestions that, as John Maynard Keynes once argued, “the boom, not the slump, is the right time for austerity”—that cuts should wait until economies were stronger—were met with declarations that Reinhart and Rogoff had shown that waiting would be disastrous, that economies fall off a cliff once government debt exceeds 90 percent of GDP.
Indeed, Reinhart-Rogoff may have had more immediate influence on public debate than any previous paper in the history of economics. The 90 percent claim was cited as the decisive argument for austerity by figures ranging from Paul Ryan, the former vice-presidential candidate who chairs the House budget committee, to Olli Rehn, the top economic official at the European Commission, to the editorial board of The Washington Post. So the revelation that the supposed 90 percent threshold was an artifact of programming mistakes, data omissions, and peculiar statistical techniques suddenly made a remarkable number of prominent people look foolish.
The real mystery, however, was why Reinhart-Rogoff was ever taken seriously, let alone canonized, in the first place. Right from the beginning, critics raised strong concerns about the paper’s methodology and conclusions, concerns that should have been enough to give everyone pause. Moreover, Reinhart-Rogoff was actually the second example of a paper seized on as decisive evidence in favor of austerity economics, only to fall apart on careful scrutiny. Much the same thing happened, albeit less spectacularly, after austerians became infatuated with a paper by Alberto Alesina and Silvia Ardagna purporting to show that slashing government spending would have little adverse impact on economic growth and might even be expansionary. Surely that experience should have inspired some caution.
So why wasn’t there more caution? The answer, as documented by some of the books reviewed here and unintentionally illustrated by others, lies in both politics and psychology: the case for austerity was and is one that many powerful people want to believe, leading them to seize on anything that looks like a justification. I’ll talk about that will to believe later in this article. First, however, it’s useful to trace the recent history of austerity both as a doctrine and as a policy experiment.

1.

In the beginning was the bubble. There have been many, many books about the excesses of the boom years—in fact, too many books. For as we’ll see, the urge to dwell on the lurid details of the boom, rather than trying to understand the dynamics of the slump, is a recurrent problem for economics and economic policy. For now, suffice it to say that by the beginning of 2008 both America and Europe were poised for a fall. They had become excessively dependent on an overheated housing market, their households were too deep in debt, their financial sectors were undercapitalized and overextended.
All that was needed to collapse these houses of cards was some kind of adverse shock, and in the end the implosion of US subprime-based securities did the deed. By the fall of 2008 the housing bubbles on both sides of the Atlantic had burst, and the whole North Atlantic economy was caught up in “deleveraging,” a process in which many debtors try—or are forced—to pay down their debts at the same time.
Why is this a problem? Because of interdependence: your spending is my income, and my spending is your income. If both of us try to reduce our debt by slashing spending, both of our incomes plunge—and plunging incomes can actually make our indebtedness worse even as they also produce mass unemployment.
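The mechanics of this "paradox of thrift" can be made concrete with a toy calculation. The sketch below is not from the review; it is a minimal, hypothetical illustration (all numbers invented) of two households whose spending is each other's income, both trying to deleverage at once.

```python
# Toy sketch of the deleveraging spiral described above (hypothetical numbers):
# two households each cut spending to pay down debt, but each one's spending
# is the other's income, so both incomes fall together.

def simulate_deleveraging(income_a=100.0, income_b=100.0,
                          cut_fraction=0.2, rounds=5):
    """Each round, both households spend only a fraction of current income;
    next round, each household's income equals the other's spending."""
    for r in range(1, rounds + 1):
        spending_a = (1 - cut_fraction) * income_a
        spending_b = (1 - cut_fraction) * income_b
        income_a, income_b = spending_b, spending_a  # my spending is your income
        print(f"round {r}: incomes = {income_a:.1f}, {income_b:.1f}")

simulate_deleveraging()
# Incomes shrink every round, so debt measured against income can look worse
# even though both households are "saving" more.
```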
Students of economic history watched the process unfolding in 2008 and 2009 with a cold shiver of recognition, because it was very obviously the same kind of process that brought on the Great Depression. Indeed, early in 2009 the economic historians Barry Eichengreen and Kevin O’Rourke produced shocking charts showing that the first year of the 2008–2009 slump in trade and industrial production was fully comparable to the first year of the great global slump from 1929 to 1933.
So was a second Great Depression about to unfold? The good news was that we had, or thought we had, several big advantages over our grandfathers, helping to limit the damage. Some of these advantages were, you might say, structural, built into the way modern economies operate, and requiring no special action on the part of policymakers. Others were intellectual: surely we had learned something since the 1930s, and would not repeat our grandfathers’ policy mistakes.
On the structural side, probably the biggest advantage over the 1930s was the way taxes and social insurance programs—both much bigger than they were in 1929—acted as “automatic stabilizers.” Wages might fall, but overall income didn’t fall in proportion, both because tax collections plunged and because government checks continued to flow for Social Security, Medicare, unemployment benefits, and more. In effect, the existence of the modern welfare state put a floor on total spending, and therefore prevented the economy’s downward spiral from going too far.
On the intellectual side, modern policymakers knew the history of the Great Depression as a cautionary tale; some, including Ben Bernanke, had actually been major Depression scholars in their previous lives. They had learned from Milton Friedman the folly of letting bank runs collapse the financial system and the desirability of flooding the economy with money in times of panic. They had learned from John Maynard Keynes that under depression conditions government spending can be an effective way to create jobs. They had learned from FDR’s disastrous turn toward austerity in 1937 that abandoning monetary and fiscal stimulus too soon can be a very big mistake.
As a result, where the onset of the Great Depression was accompanied by policies that intensified the slump—interest rate hikes in an attempt to hold on to gold reserves, spending cuts in an attempt to balance budgets—2008 and 2009 were characterized by expansionary monetary and fiscal policies, especially in the United States, where the Federal Reserve not only slashed interest rates, but stepped into the markets to buy everything from commercial paper to long-term government debt, while the Obama administration pushed through an $800 billion program of tax cuts and spending increases. European actions were less dramatic—but on the other hand, Europe’s stronger welfare states arguably reduced the need for deliberate stimulus.
Now, some economists (myself included) warned from the beginning that these monetary and fiscal actions, although welcome, were too small given the severity of the economic shock. Indeed, by the end of 2009 it was clear that although the situation had stabilized, the economic crisis was deeper than policymakers had acknowledged, and likely to prove more persistent than they had imagined. So one might have expected a second round of stimulus to deal with the economic shortfall.
What actually happened, however, was a sudden reversal.

2.

Neil Irwin’s The Alchemists gives us a time and a place at which the major advanced countries abruptly pivoted from stimulus to austerity. The time was early February 2010; the place, somewhat bizarrely, was the remote Canadian Arctic settlement of Iqaluit, where the Group of Seven finance ministers held one of their regularly scheduled summits. Sometimes (often) such summits are little more than ceremonial occasions, and there was plenty of ceremony at this one too, including raw seal meat served at the last dinner (the foreign visitors all declined). But this time something substantive happened. “In the isolation of the Canadian wilderness,” Irwin writes, “the leaders of the world economy collectively agreed that their great challenge had shifted. The economy seemed to be healing; it was time for them to turn their attention away from boosting growth. No more stimulus.”
[Figure 1: real government spending relative to the pre-recession base year, in this slump and in previous recessions, from the IMF World Economic Outlook]
How decisive was the turn in policy? Figure 1, which is taken from the IMF’s most recent World Economic Outlook, shows how real government spending behaved in this crisis compared with previous recessions; in the figure, year zero is the year before global recession (2007 in the current slump), and spending is compared with its level in that base year. What you see is that the widespread belief that we are experiencing runaway government spending is false—on the contrary, after a brief surge in 2009, government spending began falling in both Europe and the United States, and is now well below its normal trend. The turn to austerity was very real, and quite large.
On the face of it, this was a very strange turn for policy to take. Standard textbook economics says that slashing government spending reduces overall demand, which leads in turn to reduced output and employment. This may be a desirable thing if the economy is overheating and inflation is rising; alternatively, the adverse effects of reduced government spending can be offset. Central banks (the Fed, the European Central Bank, or their counterparts elsewhere) can cut interest rates, inducing more private spending. However, neither of these conditions applied in early 2010, or for that matter apply now. The major advanced economies were and are deeply depressed, with no hint of inflationary pressure. Meanwhile, short-term interest rates, which are more or less under the central bank’s control, are near zero, leaving little room for monetary policy to offset reduced government spending. So Economics 101 would seem to say that all the austerity we’ve seen is very premature, that it should wait until the economy is stronger.
The question, then, is why economic leaders were so ready to throw the textbook out the window.
One answer is that many of them never believed in that textbook stuff in the first place. The German political and intellectual establishment has never had much use for Keynesian economics; neither has much of the Republican Party in the United States. In the heat of an acute economic crisis—as in the autumn of 2008 and the winter of 2009—these dissenting voices could to some extent be shouted down; but once things had calmed they began pushing back hard.
A larger answer is the one we’ll get to later: the underlying political and psychological reasons why many influential figures hate the notions of deficit spending and easy money. Again, once the crisis became less acute, there was more room to indulge in these sentiments.
In addition to these underlying factors, however, were two more contingent aspects of the situation in early 2010: the new crisis in Greece, and the appearance of seemingly rigorous, high-quality economic research that supported the austerian position.
The Greek crisis came as a shock to almost everyone, not least the new Greek government that took office in October 2009. The incoming leadership knew it faced a budget deficit—but it was only after arriving that it learned that the previous government had been cooking the books, and that both the deficit and the accumulated stock of debt were far higher than anyone imagined. As the news sank in with investors, first Greece, then much of Europe, found itself in a new kind of crisis—one not of failing banks but of failing governments, unable to borrow on world markets.
It’s an ill wind that blows nobody good, and the Greek crisis was a godsend for anti-Keynesians. They had been warning about the dangers of deficit spending; the Greek debacle seemed to show just how dangerous fiscal profligacy can be. To this day, anyone arguing against fiscal austerity, let alone suggesting that we need another round of stimulus, can expect to be attacked as someone who will turn America (or Britain, as the case may be) into another Greece.
If Greece provided the obvious real-world cautionary tale, Reinhart and Rogoff seemed to provide the math. Their paper seemed to show not just that debt hurts growth, but that there is a “threshold,” a sort of trigger point, when debt crosses 90 percent of GDP. Go beyond that point, their numbers suggested, and economic growth stalls. Greece, of course, already had debt greater than the magic number. More to the point, major advanced countries, the United States included, were running large budget deficits and closing in on the threshold. Put Greece and Reinhart-Rogoff together, and there seemed to be a compelling case for a sharp, immediate turn toward austerity.
But wouldn’t such a turn toward austerity in an economy still depressed by private deleveraging have an immediate negative impact? Not to worry, said another remarkably influential academic paper, “Large Changes in Fiscal Policy: Taxes Versus Spending,” by Alberto Alesina and Silvia Ardagna.
One of the especially good things in Mark Blyth’s Austerity: The History of a Dangerous Idea is the way he traces the rise and fall of the idea of “expansionary austerity,” the proposition that cutting spending would actually lead to higher output. As he shows, this is very much a proposition associated with a group of Italian economists (whom he dubs “the Bocconi boys”) who made their case with a series of papers that grew more strident and less qualified over time, culminating in the 2009 analysis by Alesina and Ardagna.
In essence, Alesina and Ardagna made a full frontal assault on the Keynesian proposition that cutting spending in a weak economy produces further weakness. Like Reinhart and Rogoff, they marshaled historical evidence to make their case. According to Alesina and Ardagna, large spending cuts in advanced countries were, on average, followed by expansion rather than contraction. The reason, they suggested, was that decisive fiscal austerity created confidence in the private sector, and this increased confidence more than offset any direct drag from smaller government outlays.
As Mark Blyth documents, this idea spread like wildfire. Alesina and Ardagna made a special presentation in April 2010 to the Economic and Financial Affairs Council of the European Council of Ministers; the analysis quickly made its way into official pronouncements from the European Commission and the European Central Bank. Thus in June 2010 Jean-Claude Trichet, the then president of the ECB, dismissed concerns that austerity might hurt growth:
As regards the economy, the idea that austerity measures could trigger stagnation is incorrect…. In fact, in these circumstances, everything that helps to increase the confidence of households, firms and investors in the sustainability of public finances is good for the consolidation of growth and job creation. I firmly believe that in the current circumstances confidence-inspiring policies will foster and not hamper economic recovery, because confidence is the key factor today.
This was straight Alesina-Ardagna.
By the summer of 2010, then, a full-fledged austerity orthodoxy had taken shape, becoming dominant in European policy circles and influential on this side of the Atlantic. So how have things gone in the almost three years that have passed since?

3.

Clear evidence on the effects of economic policy is usually hard to come by. Governments generally change policies reluctantly, and it’s hard to distinguish the effects of the half-measures they undertake from all the other things going on in the world. The Obama stimulus, for example, was both temporary and fairly small compared with the size of the US economy, never amounting to much more than 2 percent of GDP, and it took effect in an economy whipsawed by the biggest financial crisis in three generations. How much of what took place in 2009–2011, good or bad, can be attributed to the stimulus? Nobody really knows.
The turn to austerity after 2010, however, was so drastic, particularly in European debtor nations, that the usual cautions lose most of their force. Greece imposed spending cuts and tax increases amounting to 15 percent of GDP; Ireland and Portugal rang in with around 6 percent; and unlike the half-hearted efforts at stimulus, these cuts were sustained and indeed intensified year after year. So how did austerity actually work?
[Figure 2: austerity measures as a share of GDP versus percentage change in real GDP, for selected European countries]
The answer is that the results were disastrous—just about as one would have predicted from textbook macroeconomics. Figure 2, for example, shows what happened to a selection of European nations (each represented by a diamond-shaped symbol). The horizontal axis shows austerity measures—spending cuts and tax increases—as a share of GDP, as estimated by the International Monetary Fund. The vertical axis shows the actual percentage change in real GDP. As you can see, the countries forced into severe austerity experienced very severe downturns, and the downturns were more or less proportional to the degree of austerity.
There have been some attempts to explain away these results, notably at the European Commission. But the IMF, looking hard at the data, has not only concluded that austerity has had major adverse economic effects, it has issued what amounts to a mea culpa for having underestimated these adverse effects.*
But is there any alternative to austerity? What about the risks of excessive debt?
In early 2010, with the Greek disaster fresh in everyone’s mind, the risks of excessive debt seemed obvious; those risks seemed even greater by 2011, as Ireland, Spain, Portugal, and Italy joined the ranks of nations having to pay large interest rate premiums. But a funny thing happened to other countries with high debt levels, including Japan, the United States, and Britain: despite large deficits and rapidly rising debt, their borrowing costs remained very low. The crucial difference, as the Belgian economist Paul De Grauwe pointed out, seemed to be whether countries had their own currencies, and borrowed in those currencies. Such countries can’t run out of money because they can print it if needed, and absent the risk of a cash squeeze, advanced nations are evidently able to carry quite high levels of debt without crisis.
Three years after the turn to austerity, then, both the hopes and the fears of the austerians appear to have been misplaced. Austerity did not lead to a surge in confidence; deficits did not lead to crisis. But wasn’t the austerity movement grounded in serious economic research? Actually, it turned out that it wasn’t—the research the austerians cited was deeply flawed.
First to go down was the notion of expansionary austerity. Even before the results of Europe’s austerity experiment were in, the Alesina-Ardagna paper was falling apart under scrutiny. Researchers at the Roosevelt Institute pointed out that none of the alleged examples of austerity leading to expansion of the economy actually took place in the midst of an economic slump; researchers at the IMF found that the Alesina-Ardagna measure of fiscal policy bore little relationship to actual policy changes. “By the middle of 2011,” Blyth writes, “empirical and theoretical support for expansionary austerity was slipping away.” Slowly, with little fanfare, the whole notion that austerity might actually boost economies slunk off the public stage.
Reinhart-Rogoff lasted longer, even though serious questions about their work were raised early on. As early as July 2010 Josh Bivens and John Irons of the Economic Policy Institute had identified both a clear mistake—a misinterpretation of US data immediately after World War II—and a severe conceptual problem. Reinhart and Rogoff, as they pointed out, offered no evidence that the correlation ran from high debt to low growth rather than the other way around, and other evidence suggested that the latter was more likely. But such criticisms had little impact; for austerians, one might say, Reinhart-Rogoff was a story too good to check.
So the revelations in April 2013 of the errors of Reinhart and Rogoff came as a shock. Despite their paper’s influence, Reinhart and Rogoff had not made their data widely available—and researchers working with seemingly comparable data hadn’t been able to reproduce their results. Finally, they made their spreadsheet available to Thomas Herndon, a graduate student at the University of Massachusetts, Amherst—and he found it very odd indeed. There was one actual coding error, although that made only a small contribution to their conclusions. More important, their data set failed to include the experience of several Allied nations—Canada, New Zealand, and Australia—that emerged from World War II with high debt but nonetheless posted solid growth. And they had used an odd weighting scheme in which each “episode” of high debt counted the same, whether it occurred during one year of bad growth or seventeen years of good growth.
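To see why the weighting scheme matters, here is a minimal, hypothetical illustration (numbers invented for clarity, not Reinhart and Rogoff's actual data) of how averaging by "episode" differs from averaging by country-year.

```python
# Hypothetical illustration of the weighting issue described above: giving each
# high-debt "episode" equal weight, versus weighting each episode by how many
# years it lasted. The numbers below are invented for clarity.
episodes = [
    {"country": "A", "years": 1,  "avg_growth": -2.0},  # one bad year
    {"country": "B", "years": 17, "avg_growth": 2.5},   # seventeen good years
]

# Equal weight per episode (the scheme criticized in the text):
by_episode = sum(e["avg_growth"] for e in episodes) / len(episodes)

# Weight each episode by the number of years it covers:
total_years = sum(e["years"] for e in episodes)
by_year = sum(e["avg_growth"] * e["years"] for e in episodes) / total_years

print(f"episode-weighted average growth: {by_episode:.2f}%")  # 0.25%
print(f"year-weighted average growth:    {by_year:.2f}%")     # 2.25%
```

On these made-up figures, the single bad year drags the episode-weighted average far below the year-weighted one, which is the kind of distortion critics pointed to.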
Without these errors and oddities, there was still a negative correlation between debt and growth—but this could be, and probably was, mostly a matter of low growth leading to high debt, not the other way around. And the “threshold” at 90 percent vanished, undermining the scare stories being used to sell austerity.
Not surprisingly, Reinhart and Rogoff have tried to defend their work; but their responses have been weak at best, evasive at worst. Notably, they continue to write in a way that suggests, without stating outright, that debt at 90 percent of GDP is some kind of threshold at which bad things happen. In reality, even if one ignores the issue of causality—whether low growth causes high debt or the other way around—the apparent effects on growth of debt rising from, say, 85 to 95 percent of GDP are fairly small, and don’t justify the debt panic that has been such a powerful influence on policy.
At this point, then, austerity economics is in a very bad way. Its predictions have proved utterly wrong; its founding academic documents haven’t just lost their canonized status, they’ve become the objects of much ridicule. But as I’ve pointed out, none of this (except that Excel error) should have come as a surprise: basic macroeconomics should have told everyone to expect what did, in fact, happen, and the papers that have now fallen into disrepute were obviously flawed from the start.
This raises the obvious question: Why did austerity economics get such a powerful grip on elite opinion in the first place?

4.

Everyone loves a morality play. “For the wages of sin is death” is a much more satisfying message than “Shit happens.” We all want events to have meaning.
When applied to macroeconomics, this urge to find moral meaning creates in all of us a predisposition toward believing stories that attribute the pain of a slump to the excesses of the boom that precedes it—and, perhaps, also makes it natural to see the pain as necessary, part of an inevitable cleansing process. When Andrew Mellon told Herbert Hoover to let the Depression run its course, so as to “purge the rottenness” from the system, he was offering advice that, however bad it was as economics, resonated psychologically with many people (and still does).
By contrast, Keynesian economics rests fundamentally on the proposition that macroeconomics isn’t a morality play—that depressions are essentially a technical malfunction. As the Great Depression deepened, Keynes famously declared that “we have magneto trouble”—i.e., the economy’s troubles were like those of a car with a small but critical problem in its electrical system, and the job of the economist is to figure out how to repair that technical problem. Keynes’s masterwork, The General Theory of Employment, Interest and Money, is noteworthy—and revolutionary—for saying almost nothing about what happens in economic booms. Pre-Keynesian business cycle theorists loved to dwell on the lurid excesses that take place in good times, while having relatively little to say about exactly why these give rise to bad times or what you should do when they do. Keynes reversed this priority; almost all his focus was on how economies stay depressed, and what can be done to make them less depressed.
I’d argue that Keynes was overwhelmingly right in his approach, but there’s no question that it’s an approach many people find deeply unsatisfying as an emotional matter. And so we shouldn’t find it surprising that many popular interpretations of our current troubles return, whether the authors know it or not, to the instinctive, pre-Keynesian style of dwelling on the excesses of the boom rather than on the failures of the slump.
David Stockman’s The Great Deformation should be seen in this light. It’s an immensely long rant against excesses of various kinds, all of which, in Stockman’s vision, have culminated in our present crisis. History, to Stockman’s eyes, is a series of “sprees”: a “spree of unsustainable borrowing,” a “spree of interest rate repression,” a “spree of destructive financial engineering,” and, again and again, a “money-printing spree.” For in Stockman’s world, all economic evil stems from the original sin of leaving the gold standard. Any prosperity we may have thought we had since 1971, when Nixon abandoned the last link to gold, or maybe even since 1933, when FDR took us off gold for the first time, was an illusion doomed to end in tears. And of course, any policies aimed at alleviating the current slump will just make things worse.
In itself, Stockman’s book isn’t important. Aside from a few swipes at Republicans, it consists basically of standard goldbug bombast. But the attention the book has garnered, the ways it has struck a chord with many people, including even some liberals, suggest just how strong remains the urge to see economics as a morality play, three generations after Keynes tried to show us that it is nothing of the kind.
And powerful officials are by no means immune to that urge. In The Alchemists, Neil Irwin analyzes the motives of Jean-Claude Trichet, the president of the European Central Bank, in advocating harsh austerity policies:
Trichet embraced a view, especially common in Germany, that was rooted in a sort of moralism. Greece had spent too much and taken on too much debt. It must cut spending and reduce deficits. If it showed adequate courage and political resolve, markets would reward it with lower borrowing costs. He put a great deal of faith in the power of confidence….
Given this sort of predisposition, is it any wonder that Keynesian economics got thrown out the window, while Alesina-Ardagna and Reinhart-Rogoff were instantly canonized?
So is the austerian impulse all a matter of psychology? No, there’s also a fair bit of self-interest involved. As many observers have noted, the turn away from fiscal and monetary stimulus can be interpreted, if you like, as giving creditors priority over workers. Inflation and low interest rates are bad for creditors even if they promote job creation; slashing government deficits in the face of mass unemployment may deepen a depression, but it increases the certainty of bondholders that they’ll be repaid in full. I don’t think someone like Trichet was consciously, cynically serving class interests at the expense of overall welfare; but it certainly didn’t hurt that his sense of economic morality dovetailed so perfectly with the priorities of creditors.
It’s also worth noting that while economic policy since the financial crisis looks like a dismal failure by most measures, it hasn’t been so bad for the wealthy. Profits have recovered strongly even as unprecedented long-term unemployment persists; stock indices on both sides of the Atlantic have rebounded to pre-crisis highs even as median income languishes. It might be too much to say that those in the top 1 percent actually benefit from a continuing depression, but they certainly aren’t feeling much pain, and that probably has something to do with policymakers’ willingness to stay the austerity course.

5.

How could this happen? That’s the question many people were asking four years ago; it’s still the question many are asking today. But the “this” has changed.
Four years ago, the mystery was how such a terrible financial crisis could have taken place, with so little forewarning. The harsh lessons we had to learn involved the fragility of modern finance, the folly of trusting banks to regulate themselves, and the dangers of assuming that fancy financial arrangements have eliminated or even reduced the age-old problems of risk.
I would argue, however—self-serving as it may sound (I warned about the housing bubble, but had no inkling of how widespread a collapse would follow when it burst)—that the failure to anticipate the crisis was a relatively minor sin. Economies are complicated, ever-changing entities; it was understandable that few economists realized the extent to which short-term lending and securitization of assets such as subprime mortgages had recreated the old risks that deposit insurance and bank regulation were created to control.
I’d argue that what happened next—the way policymakers turned their back on practically everything economists had learned about how to deal with depressions, the way elite opinion seized on anything that could be used to justify austerity—was a much greater sin. The financial crisis of 2008 was a surprise, and happened very fast; but we’ve been stuck in a regime of slow growth and desperately high unemployment for years now. And during all that time policymakers have been ignoring the lessons of theory and history.
It’s a terrible story, mainly because of the immense suffering that has resulted from these policy errors. It’s also deeply worrying for those who like to believe that knowledge can make a positive difference in the world. To the extent that policymakers and elite opinion in general have made use of economic analysis at all, they have, as the saying goes, done so the way a drunkard uses a lamppost: for support, not illumination. Papers and economists who told the elite what it wanted to hear were celebrated, despite plenty of evidence that they were wrong; critics were ignored, no matter how often they got it right.
The Reinhart-Rogoff debacle has raised some hopes among the critics that logic and evidence are finally beginning to matter. But the truth is that it’s too soon to tell whether the grip of austerity economics on policy will relax significantly in the face of these revelations. For now, the broader message of the past few years remains just how little good comes from understanding.

Medical intervention is not always the answer to mental health issues

by Frank Furedi

 The fifth edition of the Diagnostic and Statistical Manual of Mental Disorders has just been published and the contents of this book should really be of interest to you. The DSM is not simply a medical handbook that provides a list of conditions worthy of the diagnosis of mental illness. It is also a secular bible that instructs people how to make sense of their predicament through the language of medicine.

With every edition of the DSM, the number of conditions diagnosed as a problem suitable for psychiatric intervention expands. You, dear reader, may be suffering from a mental illness that you never knew existed. So if like me you really get angry now and then, the DSM suggests that you may be suffering from “disruptive mood dysregulation disorder”. Or if you have the occasional senior moment, you may well be afflicted with the new diagnosis of “mild neurocognitive disorder”. And if you really feel anxious and scared about experiencing pain and discomfort, you may have “somatic symptom disorder”.
The eccentric loner, the shy stranger lacking in social skills, the naughty child, anyone who eats too much or the sexually confused teenager have all become candidates for the psychiatrist’s couch. What’s important about the DSM is that it provides a language and narrative through which the problems of existence become medicalised. And in a world where a medical diagnosis represents a claim for resources, what the DSM says really matters. The verdict of the DSM not only affects insurance and drug companies interested in their bottom line but also anxious parents who rely on a diagnosis to gain special help for their child.
Not surprisingly, the latest edition of the DSM has become a subject of controversy. Different groups of medics and psychiatrists have questioned the scientific reliability of some of the new diagnostic categories. Some have queried the dropping of the category of Asperger’s syndrome and the decision to include it under a general autism diagnosis. Others argue that the psychiatric lobby has become a captive of the pharmaceutical industry. But what is not at issue is the ethos of medicalisation promoted through this influential manual.
The term medicalisation refers to the cultural process through which a range of human experience is reinterpreted through the language of medicine. In recent decades, many everyday experiences have become redefined as issues of health that require medical intervention. Through reinterpreting existential problems such as loneliness, shyness, fear, anxiety, loss of control or grief as medical ones, the meaning people attach to them fundamentally alters.
Medical problems require treatment and rely on professional intervention to cure the patient’s illness. But why should grief or shyness or even anger be treated as a disease? And why should professionals possess a monopoly on how to interpret the pain and disappointment that people experience at different stages of their lives? The real threat posed by the expansion of mental health diagnosis is that it takes away from people the confidence that they need to make sense and give meaning to their personal experience.
The problem is not that professional advice is always misguided, but that it short-circuits the process through which people can learn how to deal with problems through their own experience. Intuition and insight gained from experience are continually compromised by professional knowledge. This has the unintentional consequence of estranging people from their own feelings and instincts since such reactions require the affirmation of the expert. In such circumstances, people’s capacity to handle relationships and to have confidence in their relationships diminishes further. In turn, this creates new opportunities for professional intervention in everyday life.
The manner in which emotional problems have become diagnosed as a form of disorder raises questions about the ability of the individual to deal with disappointment, misfortune, adversity or even the challenge of everyday life. And, sadly, when people are continually invited to make sense of their troubles through the medium of therapeutics, it severely undermines their resilience.
Once the diagnosis of illness is systematically offered as an interpretative guide for making sense of distress, people are far more likely to perceive themselves as ill. That is one reason why in Western society the number of people diagnosed as suffering from mental illness has risen exponentially. The explanation for this trend lies not in the fields of epidemiology, but in the realm of culture that invites people to classify themselves as infirm.
Recently, the British Psychological Society’s division of clinical psychology has attacked the psychiatric profession for offering a biomedical model for understanding mental distress. But its criticism was not directed at the ethos of medicalisation as such, but only at the tendency to associate mental illness with biological causes. What it offered was an alternative model of medicalisation – one where mental illness was represented as the outcome of social and psychological causes. It seems that medicalisation has become so deeply entrenched that even critics of the DSM accept its premise.
The problems of life can be painful. But this experience of existential agony must not be rebranded as an illness. Medicalisation empties experience of its creative content and assigns human beings the status of permanent patients. The promiscuous expansion of diagnosis also trivialises mental illness. Learning to distinguish between normal suffering and illness is a mark of a mature and confident culture.
Frank Furedi is a sociologist whose books include ‘Therapy Culture’

Sunday 12 May 2013

Psychiatrists under fire in mental health battle


British Psychological Society to launch attack on rival profession, casting doubt on biomedical model of mental illness
British psychologists are to say that current psychiatric diagnoses such as bipolar disorder are useless. Photograph: Justin Paget/Fuse/Getty
 
There is no scientific evidence that psychiatric diagnoses such as schizophrenia and bipolar disorder are valid or useful, according to the leading body representing Britain's clinical psychologists.
In a groundbreaking move that has already prompted a fierce backlash from psychiatrists, the British Psychological Society's division of clinical psychology (DCP) will on Monday issue a statement declaring that, given the lack of evidence, it is time for a "paradigm shift" in how the issues of mental health are understood. The statement effectively casts doubt on psychiatry's predominantly biomedical model of mental distress – the idea that people are suffering from illnesses that are treatable by doctors using drugs. The DCP said its decision to speak out "reflects fundamental concerns about the development, personal impact and core assumptions of the (diagnosis) systems", used by psychiatry.

Dr Lucy Johnstone, a consultant clinical psychologist who helped draw up the DCP's statement, said it was unhelpful to see mental health issues as illnesses with biological causes.
"On the contrary, there is now overwhelming evidence that people break down as a result of a complex mix of social and psychological circumstances – bereavement and loss, poverty and discrimination, trauma and abuse," Johnstone said. The provocative statement by the DCP has been timed to come out shortly before the release of DSM-5, the fifth edition of the American Psychiatry Association's Diagnostic and Statistical Manual of Mental Disorders.

The manual has been attacked for expanding the range of mental health issues that are classified as disorders. For example, the fifth edition of the book, the first for two decades, will classify manifestations of grief, temper tantrums and worrying about physical ill-health as the mental illnesses of major depressive disorder, disruptive mood dysregulation disorder and somatic symptom disorder, respectively.

Some of the manual's omissions are just as controversial as the manual's inclusions. The term "Asperger's disorder" will not appear in the new manual, and instead its symptoms will come under the newly added "autism spectrum disorder".

The DSM is used in a number of countries to varying degrees. Britain uses an alternative manual, the International Classification of Diseases (ICD) published by the World Health Organisation, but the DSM is still hugely influential – and controversial.

The writer Oliver James, who trained as a clinical psychologist, welcomed the DCP's decision to speak out against psychiatric diagnosis and stressed the need to move away from a biomedical model of mental distress to one that examined societal and personal factors.

Writing in today's Observer, James declares: "We need fundamental changes in how our society is organised to give parents the best chance of meeting the needs of children and to prevent the amount of adult adversity."

But Professor Sir Simon Wessely, a member of the Royal College of Psychiatrists and chair of psychological medicine at King's College London, said it was wrong to suggest psychiatry was focused only on the biological causes of mental distress. And in an accompanying Observer article he defends the need to create classification systems for mental disorder.

"A classification system is like a map," Wessely explains. "And just as any map is only provisional, ready to be changed as the landscape changes, so does classification."

Saturday 11 February 2012

Liberal Constipation

 George Monbiot

Self-deprecating, too liberal for their own good, today’s progressives stand back and watch, hands over their mouths, as the social vivisectionists of the right slice up a living society to see if its component parts can survive in isolation. Tied up in knots of reticence and self-doubt, they will not shout stop. Doing so requires an act of interruption, of presumption, for which they no longer possess a vocabulary.

Perhaps it is in the same spirit of liberal constipation that, with the exception of Charlie Brooker(1), we have been too polite to mention the study published last month in the journal Psychological Science, which revealed that people with conservative beliefs are likely to be of low intelligence(2). Paradoxically it was the Daily Mail which brought it to the attention of British readers last week(3). It feels crude, illiberal to point out that the other side is, on average, more stupid than our own. But this, the study suggests, is not unfounded generalisation but empirical fact.

It is by no means the first such paper. There is plenty of research showing that low general intelligence in childhood predicts greater prejudice towards people of different ethnicity or sexuality in adulthood(4). Open-mindedness, flexibility, trust in other people: all these require certain cognitive abilities. Understanding and accepting others—particularly “different” others—requires an enhanced capacity for abstract thinking.

But, drawing on a sample size of several thousand, correcting for both education and socioeconomic status, the new study looks embarrassingly robust. Importantly, it shows that prejudice tends not to arise directly from low intelligence, but from the conservative ideologies to which people of low intelligence are drawn. Conservative ideology is the “critical pathway” from low intelligence to racism. Those with low cognitive abilities are attracted to “right-wing ideologies that promote coherence and order” and “emphasize the maintenance of the status quo”(5). Even for someone not yet renowned for liberal reticence, this feels hard to write.

This is not to suggest that all conservatives are stupid. There are some very clever people in government, advising politicians, running thinktanks, writing for newspapers, who have acquired power and influence by promoting rightwing ideologies.

But what we now see among their parties—however intelligent their guiding spirits may be—is the abandonment of any pretence of high-minded conservatism. On both sides of the Atlantic, conservative strategists have discovered that there is no pool so shallow that several million people won’t drown in it. Whether they are promoting the idea that Barack Obama was not born in the US, that manmade climate change is an eco-fascist-communist-anarchist conspiracy or that the deficit results from the greed of the poor, they now appeal to the basest, stupidest impulses, and find that it does them no harm in the polls.

Don’t take my word for it. Listen to what two former Republican ideologues, David Frum and Mike Lofgren, have been saying. Frum warns that “conservatives have built a whole alternative knowledge system, with its own facts, its own history, its own laws of economics.”(6) The result is a “shift to ever more extreme, ever more fantasy-based ideology” which has “ominous real-world consequences for American society.”

Lofgren complains that “the crackpot outliers of two decades ago have become the vital center today”(7). The Republican party, with its “prevailing anti-intellectualism and hostility to science” is appealing to what he calls the “low-information voter” or the “misinformation voter.” While most office holders probably don’t believe the “reactionary and paranoid claptrap” they peddle, “they cynically feed the worst instincts of their fearful and angry low-information political base”.

The madness hasn’t gone as far in the UK, but the effects of the Conservative appeal to stupidity are already making themselves felt. Yesterday the Guardian reported that recipients of disability benefits, scapegoated by the government as scroungers, blamed for the deficit, now find themselves subject to a new level of hostility and threats from other people(8).

These are the perfect conditions for a billionaires’ feeding frenzy. Any party elected by misinformed, suggestible voters becomes a vehicle for undisclosed interests. A tax break for the 1% is dressed up as freedom for the 99%. The regulation that prevents big banks and corporations from exploiting us becomes an assault on the working man and woman. Those of us who discuss manmade climate change are cast as elitists by people who happily embrace the claims of Lord Monckton, Lord Lawson or thinktanks funded by ExxonMobil or the Koch brothers: now the authentic voices of the working class.

But when I survey this wreckage I wonder who the real idiots are. Confronted with mass discontent, the once-progressive major parties, as Thomas Frank laments in his latest book Pity the Billionaire, triangulate and accommodate, hesitate and prevaricate, muzzled by what he calls “terminal niceness”(9). They fail to produce a coherent analysis of what has gone wrong and why, or to make an uncluttered case for social justice, redistribution and regulation. The conceptual stupidities of conservatism are matched by the strategic stupidities of liberalism.

Yes, conservatism thrives on low intelligence and poor information. But the liberals in politics on both sides of the Atlantic continue to back off, yielding to the supremacy of the stupid. It’s turkeys all the way down.

Monday 18 July 2011

Religion and the search for meaning

Carl Jung, part 8:

Jung thought psychology could offer a language for grappling with moral ambiguities in an age of spiritual crisis
Friedrich Nietzsche: 'We godless anti-metaphysicians still take our fire… from the flame lit by a faith that is thousands of years old.' Photograph: Jens Meyer/AP

In 1959, two years before his death, Jung was interviewed for the BBC television programme Face to Face. The presenter, John Freeman, asked the elderly sage if he now believed in God. "Now?" Jung replied, paused and smiled. "Difficult to answer. I know. I don't need to believe, I know."

What did he mean? Perhaps several things. He had spent much of the second half of his life exploring what it is to live during a period of spiritual crisis. It is manifest in the widespread search for meaning – a peculiar characteristic of the modern age: our medieval and ancient forebears showed few signs of it, if anything suffering from an excess of meaning.

The crisis stems from the cultural convulsion triggered by the decline of religion in Europe. "Are we not plunging continually," Nietzsche has the "madman" ask when he announces the death of God. "Is not the greatness of this deed too great for us?" Jung read Nietzsche and agreed that it was. The slaughter of two world wars and, as if that were not enough, the subsequent proliferation of nuclear weaponry were signs of a civilisation swept along by unconscious tides that religion, like a network of dykes, once helped contain. "A secret unrest gnaws at the roots of our being," he wrote, an unrest that yearns for the divine. Nietzsche agreed that God still existed as a psychic reality too: "We godless anti-metaphysicians still take our fire … from the flame lit by a faith that is thousands of years old." And now the flame is out of control.

The sense of threat – real and imagined – that Jung witnessed during his lifetime has not lessened. Ecologists such as James Lovelock now predict that the planet itself has turned against us. Or think of the war games that power an online gaming industry worth £45bn and counting. Why do so many spend so much indulging murderous fantasies? You could also point to the proliferation of new age spiritualities that take on increasingly fantastical forms. One that interested Jung was UFOs: the longing for aliens – we are without God but not without cosmic companions – coupled to tales of being "chosen" for abduction, are indicative of mass spiritual hunger.

Or you might ask why a key characteristic of western culture is widespread overwork. Like the economist John Maynard Keynes, Jung wondered whether modern individuals are trying to atone for an ill-defined sense of moral failure: we are no longer sure what makes something valuable, bar an arbitrary designation of financial worth, and this transforms the humdrum need for money into a kind of worship of money.

But if the world has rejected God, those who remain religious are, in part, to blame. They have suffered a loss of confidence too, Jung suggests. The powerful, fearful experience of the numinous that speaks of the mystery of life has been traded in for a variety of substitutes that no longer speak to the depths of our humanity or serve our spiritual yearning. Again, this shift is variously manifest. Theologians, for instance, will often feel more comfortable speaking of religious matters in the worldly language of the social sciences. Christians will tell you that when Jesus spoke of the kingdom of God he was really conveying a practical political vision. Or they might reduce the symbols of faith to historical events: it is as if someone with a camera outside Jerusalem, on that Sunday in 33AD, could have caught the resurrection on film.

It's a process that empties faith of significance because it turns symbols into signs: symbols transmit an immediate experience that addresses the soul, whereas signs just point to facts. "We simply do not understand any more what is meant by the paradoxes contained in dogma; and the more external our understanding of them becomes the more we are affronted by their irrationality."

It is perhaps this craving for immediate experience that drives the highly emotional forms of religion growing so fast in the contemporary world, though Jung would have discerned a sentimentality in them that again simplifies humankind's moral ambiguities and spiritual paradoxes. He did not believe that authentic religiosity was expressed in these peak experiences. Rather he advised people to turn towards their fears, much as the mystics welcomed the dark night of the soul. This shadow is experienced as a foe, but it is really a friend because it contains clues as to what the individual lacks, rejects and distrusts. "What our age thinks of as the 'shadow' and inferior part of the psyche contains more than something merely negative," he writes in The Undiscovered Self, an essay published in 1957. "They are potentialities of the greatest dynamism."

That dynamism works by way of compensation. It aims to rebalance what has become lopsided. Hence, if at a conscious level the scientific has eclipsed the theological, the material the valuable, the emotive the spiritual, then the forces that hide in the unconscious will ineluctably make themselves felt once more. It will seem chaotic and quite possibly be destructive. But the passion also contains a prophetic voice calling humanity back to life in all its fullness.

Jung is often criticised by religious thinkers for his poor theology and perennial philosophy. They are often correct, but they can also miss the main point. Jung was clear that his analytical psychology was not a new religion, neither was he a guru. "Psychology is concerned with the act of seeing and not with the construction of new religious truths," he wrote. So its role is to provide a language for grappling with what's at stake. "Since the stars have fallen from heaven and our highest symbols have paled, a secret life holds sway in the unconscious. That is why we have a psychology today, and why we speak of the unconscious. All this would be quite superfluous in an age or culture that possessed symbols."

Symbols do die. "Why have the antique gods lost their prestige and their effect upon human souls? It was because the Olympic gods had served their time and a new mystery began: God became man." Which raises the question of whether the Christian dispensation has now served its time too and we await a new mystery. Perhaps we do live on the verge of a new age, of another transformation of humanity.

Tuesday 8 March 2011

Spinoza, part 1: Philosophy as a way of life

For this 17th century outsider, philosophy is like a spiritual practice, whose goal is happiness and liberation

Clare Carlisle
guardian.co.uk, Monday 7 February 2011 09.30 GMT

Spinoza memorial at the New Church in The Hague. Photograph: Dan Chung for the Guardian

Although Baruch Spinoza is one of the great thinkers of the European philosophical tradition, he was not a professional scholar – he earned his modest living as a lens grinder. So, unlike many thinkers of his time, he was unconstrained by allegiance to a church, university or royal court. He was free to be faithful to the pursuit of truth. This gives his philosophy a remarkable originality and intellectual purity – and it also led to controversy and charges of heresy. In the 19th century, and perhaps even more recently, "Spinozist" was still a term of abuse among intellectuals.

In a sense, Spinoza was always an outsider – and this independence is precisely what enabled him to see through the confusions, prejudices and superstitions that prevailed in the 17th century, and to gain a fresh and radical perspective on various philosophical and religious issues. He was born, in 1632, to Jewish Portuguese parents who had fled to Amsterdam to escape persecution, so from the very beginning he was never quite a native, never completely at home. Although Spinoza was an excellent student in the Jewish schools he attended, he came to be regarded by the leaders of his community as a dangerous influence. At the age of 24 he was excluded from the Amsterdam synagogue for his "intolerable" views and practices.

Spinoza's most famous and provocative idea is that God is not the creator of the world, but that the world is part of God. This is often identified as pantheism, the doctrine that God and the world are the same thing – which conflicts with both Jewish and Christian teachings. Pantheism can be traced back to ancient Greek thought: it was probably advocated by some pre-Socratic philosophers, as well as by the Stoics. But although Spinoza – who admired many aspects of Stoicism – is regarded as the chief source of modern pantheism, he does, in fact, want to maintain the distinction between God and the world.

His originality lies in the nature of this distinction. God and the world are not two different entities, he argues, but two different aspects of a single reality. Over the next few weeks we will examine this view in more detail and consider its implications for human life. Since Spinoza presents a radical alternative to the Cartesian philosophy that has shaped our intellectual and cultural heritage, exploring his ideas may lead us to question some of our deepest assumptions.

One of the most important and distinctive features of Spinoza's philosophy is that it is practical through and through. His ideas are never merely intellectual constructions, but lead directly to a certain way of life. This is evidenced by the fact that his greatest work, which combines metaphysics, theology, epistemology, and human psychology, is called Ethics. In this book, Spinoza argues that the way to "blessedness" or "salvation" for each person involves an expansion of the mind towards an intuitive understanding of God, of the whole of nature and its laws. In other words, philosophy for Spinoza is like a spiritual practice, whose goal is happiness and liberation.

The ethical orientation of Spinoza's thought is also reflected in his own nature and conduct. Unlike most of the great philosophers, Spinoza has a reputation for living an exemplary, almost saintly life, characterised by modesty, gentleness, integrity, intellectual courage, disregard for wealth and a lack of worldly ambition. According to Bertrand Russell, Spinoza was "the noblest and most lovable of the great philosophers". Although his ideas were despised by many of his contemporaries, he attracted a number of devoted followers who gathered regularly at his home in Amsterdam to discuss his philosophy. These friends made sure that Spinoza's Ethics was published soon after his death in 1677.

Spinoza, part 2: Miracles and God's will

Spinoza's belief that miracles were unexplained acts of nature, not proof of God, proved dangerous and controversial

Clare Carlisle, guardian.co.uk, Monday 14 February 2011 09.00 GMT

At the heart of Baruch Spinoza's philosophy is a challenge to the traditional Judeo-Christian view of the relationship between God and the world. While the Hebrew Bible and the Christian scriptures share a conception of God as the creator of the natural world and the director of human history, Spinoza argues that everything that exists is an aspect of God that expresses something of the divine nature. This idea that God is not separate from the world is expounded systematically in the Ethics, Spinoza's magnum opus. However, a more accessible introduction to Spinoza's view of the relationship between God and nature can be found in his discussion of miracles in an earlier text, the Theologico-Political Treatise. This book presents an innovative interpretation of the Bible that undermines its authority as a source of truth, and questions the traditional understanding of prophecy, miracles and the divine law.

In chapter six of the Theologico-Political Treatise, Spinoza addresses the "confused ideas of the vulgar" on the subject of miracles. Ordinary people tend to regard apparently miraculous events – phenomena that seem to interrupt and conflict with the usual order of nature – as evidence of God's presence and activity. In fact, it is not just "the vulgar" who hold this view: throughout history, theologians have appealed to miracles to justify religious belief, and some continue to do so today.

For Spinoza, however, talk of miracles is evidence not of divine power, but of human ignorance. An event that appears to contravene the laws of nature is, he argues, simply a natural event whose cause is not yet understood. Underlying this view is the idea that God is not a transcendent being who can suspend nature's laws and intervene in its normal operations. On the contrary, "divine providence is identical with the course of nature". Spinoza argues that nature has a fixed and eternal order that cannot be contravened. What is usually, with a misguided anthropomorphism, called the will of God is in fact nothing other than this unchanging natural order.

From this it follows that God's presence and character is revealed not through apparently miraculous, supernatural events, but through nature itself. As Spinoza puts it: "God's nature and existence, and consequently His providence, cannot be known from miracles, but can all be much better perceived from the fixed and immutable order of nature."

Of course, this view has serious consequences for the interpretation of scripture, since both the Old and New Testaments include many descriptions of miraculous events. Spinoza does not simply dismiss these biblical narratives, but he argues that educated modern readers must distinguish between the opinions and customs of those who witnessed and recorded miracles, and what actually happened. Challenging the literal interpretation of scripture that prevailed in his time, Spinoza insists that "many things are narrated in Scripture as real, and were believed to be real, which were in fact only symbolic and imaginary".

This may seem reasonable enough to many contemporary religious believers, but Spinoza's attitude to the Bible was far ahead of its time. Today we take for granted a certain degree of cultural relativism, and most of us are ready to accept that ancient peoples understood the world differently from us, and therefore had different ideas about natural and divine causation. When it was first published in 1670, however, the Theologico-Political Treatise provoked widespread protest and condemnation. In fact, it was this reaction that made Spinoza decide to delay publication of the Ethics until after his death, to avoid more trouble.

But what are we to make of Spinoza's claim that God's will and natural law are one and the same thing? There are different ways to interpret this idea, some more conducive to religious belief than others. On the one hand, if God and nature are identical then perhaps the concept of God becomes dispensable. Why not simply abandon the idea of God altogether, and focus on improving our understanding of nature through scientific enquiry? On the other hand, Spinoza seems to be suggesting that God's role in our everyday lives is more constant, immediate and direct than it is for those who rely on miraculous, out-of-the-ordinary events as signs of divine activity.

And of course, the idea that the order of nature reveals the existence and essence of God leads straight to the view that nature is divine, and should be valued and even revered as such. In this way, Spinoza was an important influence on the 19th-century Romantic poets. Indeed, Spinoza's philosophy seems to bring together the Romantic and scientific worldviews, since it gives us reason both to love the natural world, and to improve our understanding of its laws.

Spinoza, part 3: What God is not

In his Ethics, Spinoza wanted to liberate readers from the dangers of ascribing human traits to God

Clare Carlisle, guardian.co.uk, Monday 21 February 2011 08.30 GMT

Spinoza's Ethics is divided into five books, and the first of these presents an idiosyncratic philosophical argument about the existence and nature of God. We'll examine this in detail next week, but first we need to look more closely at how the Ethics challenges traditional Judeo-Christian belief in God.

The view that Spinoza wants to reject can be summed up in one word: anthropomorphism. This means attributing human characteristics to something non-human – typically, to plants or animals, or to God. There are several important implications of Spinoza's denial of anthropomorphism. First, he argues that it is wrong to think of God as possessing an intellect and a will. In fact, Spinoza's God is an entirely impersonal power, and this means that he cannot respond to human beings' requests, needs and demands. Such a God neither rewards nor punishes – and this insight rids religious belief of fear and moralism.

Second, God does not act according to reasons or purposes. In refusing this teleological conception of God, Spinoza challenged a fundamental tenet of western thought. The idea that a given phenomenon can be explained and understood with reference to a goal or purpose is a cornerstone of Aristotle's philosophy, and medieval theologians found this fitted very neatly with the biblical narrative of God's creation of the world. Aristotle's teleological account of nature was, then, adapted to the Christian doctrine of a God who made the world according to a certain plan, analogous to a human craftsman who makes artefacts to fulfil certain purposes. Typically, human values and aspirations played a prominent role in these interpretations of divine activity.

Spinoza concludes book one of the Ethics by dismissing this world view as mere "prejudice" and "superstition". Human beings, he suggests, "consider all natural things as means to their own advantage", and because of this they believe in "a ruler of nature, endowed with human freedom, who had taken care of all things for them, and made all things for their use". Moreover, people ascribe to this divine ruler their own characters and mental states, conceiving God as angry or loving, merciful or vengeful. "So it has happened that each person has thought up from his own temperament different ways of worshiping God, so that God might love him above all others, and direct the whole of nature according to the needs of his blind desire and insatiable greed," writes Spinoza.

It is interesting to compare this critique of religious "superstition" with the views of the 18th-century Scottish philosopher David Hume. In his Dialogues Concerning Natural Religion, Hume challenges the popular belief in a creator God – and he also, elsewhere, undermines appeals to miracles as evidence of divine activity. Although Hume seems to echo Spinoza on these points, there is a crucial difference between the two philosophers. Hume thinks that many aspects of Christian belief are silly and incoherent, but his alternative to such "superstition" is a healthy scepticism, which recognises that religious doctrines cannot be justified by reason or by experience. His own position is rather ambiguous, but it involves a modest and pragmatic attitude to truth and seems to lead to agnosticism.

Spinoza, on the other hand, thinks that there is a true conception of God which is accessible to human intelligence. He argues that misguided religious beliefs are dangerous precisely because they obscure this truth, and thus prevent human beings from attaining genuine happiness, or "blessedness". There is, therefore, more at stake in Spinoza's critique of popular superstition than in Hume's. For Hume, religious believers are probably wrong, but the existential consequences of their foolishness might not be particularly serious. Spinoza, by contrast, wants to liberate his readers from their ignorance in order to bring them closer to salvation.

So Spinoza is not simply an atheist and a critic of religion, nor a sceptical agnostic. On the contrary, he places a certain conception of God at the heart of his philosophy, and he describes the ideal human life as one devoted to love of this God. Moreover, while Spinoza is critical of superstition, he is sympathetic to some aspects of Jewish and Christian teaching. In particular, he argues that Jesus had a singularly direct and immediate understanding of God, and that it is therefore right to see him as the embodiment of truth, and a role model for all human beings.

Spinoza, part 4: All there is, is God

Being infinite and eternal, God has no boundaries, argues Spinoza, and everything in the world must exist within this God

Clare Carlisle, guardian.co.uk, Monday 28 February 2011 10.00 GMT

So far in this series I've focused on Spinoza's critique of the religious and philosophical world view of his time. But what does he propose in place of anthropomorphic, anthropocentric belief in a transcendent creator God?

Spinoza begins his Ethics by defining some basic philosophical terms: substance, attribute, and mode. In offering these definitions, he is actually attempting a radical revision of the philosophical vocabulary used by Descartes, the leading thinker of his time, to conceptualise reality. When we understand these terms properly, argues Spinoza, we have to conclude that there exists only one substance – and that this is God.

Substance is a logical category that signifies independent existence: as Spinoza puts it, "by substance I understand what is conceived through itself". By contrast, attributes and modes are properties of a substance, and are therefore logically dependent on this substance. For example, we might regard a particular body as a substance, and this body is not conceptually dependent on anything else. But the body's properties, such as its weight and its colour and its shape, are qualities that cannot be conceived to exist in isolation: they must be the weight, colour and shape of a certain body.

Descartes's world view draws on Aristotelian metaphysics and scholastic theology in conceiving individual entities as distinct substances. Human beings, for example, are finite substances, while God is a special substance which is infinite and eternal. In fact, Descartes thought that each human being was composed of two substances: a mind, which has the principal attribute of thought; and a body, which has the principal attribute of extension, or physicality. This view famously leads to the difficult question of how these different substances could interact, known as the "mind-body problem".

The philosophical terminology of substance, attribute and mode makes all this sound rather technical and abstract. But Cartesian metaphysics represents a way of thinking about the world, and also about ourselves, shared by most ordinary people. We see our world as populated by discrete objects, individual things – this person over here, that person over there; this computer on the table; that tree outside, and the squirrel climbing its trunk; and so on. These individual beings have their own characteristics, or properties: size, shape, colour, etc. They might be hot or cold, quiet or noisy, still or in motion, and such qualities can be more or less changeable. This way of conceptualising reality is reflected in the structure of language: nouns say what things are, adjectives describe how they are, and verbs indicate their actions, movements and changing states. The familiar distinction between nouns, adjectives and verbs provides an approximate guide to the philosophical concepts of substance, mode and attribute.

If, as Spinoza argues, there is only one substance – God – which is infinite, then there can be nothing outside or separate from this God. Precisely because God is a limitless, boundless totality, he must be an outsideless whole, and therefore everything else that exists must be within God. Of course, these finite beings can be distinguished from God, and also from one another – just as we can distinguish between a tree and its green colour, and between the colour green and the colour blue. But we are not dealing here with the distinction between separate substances that can be conceived to exist independently from one another.

Again, this is rather abstract. As Aristotle suggested, we cannot think without images, and I find it helpful to use the image of the sea to grasp Spinoza's metaphysics. The ocean stands for God, the sole substance, and individual beings are like waves – which are modes of the sea. Each wave has its own shape that it holds for a certain time, but the wave is not separate from the sea and cannot be conceived to exist independently of it. Of course, this is only a metaphor; unlike an infinite God, an ocean has boundaries, and moreover the image of the sea represents God only in the attribute of extension. But maybe we can also imagine the mind of God – that is to say, the infinite totality of thinking – as like the sea, and the thoughts of finite beings as like waves that arise and then pass away.

Spinoza's world view brings to the fore two features of life: dependence and connectedness. Each wave is dependent on the sea, and because it is part of the sea it is connected to every other wave. The movements of one wave will influence all the rest. Likewise, each being is dependent on God, and as a part of God it is connected to every other being. As we move about and act in the world, we affect others, and we are in turn affected by everything we come into contact with.

This basic insight gives Spinoza's philosophy its religious and ethical character. In traditional religion, dependence and connectedness are often expressed using the metaphor of the family: there is a holy father, and in some cases a holy mother; and members of the community describe themselves as brothers and sisters. This vocabulary is shared by traditions as culturally diverse as Christianity, Buddhism and Islam. For Spinoza, the familial metaphor communicates a truth that can also be conveyed philosophically – through reason rather than through an image.

Spinoza, part 5: On human nature

We are not autonomous individuals but part of a greater whole, says Spinoza, and there is no such thing as human free will

Clare Carlisle, guardian.co.uk, Monday 7 March 2011 09.00 GMT

Last week, we examined Spinoza's metaphysics, looking at how his radical reinterpretation of the philosophical terminology of substance, attribute and mode produces a new vision of reality. According to Spinoza, only God can be called a substance – that is to say, an independently existing being – and everything else is a mode of this single substance. But what does this mean for us?

One of the central questions of philosophy is: what is a human being? And this question can be posed in a more personal way: who am I? As we might by now expect, Spinoza's view of the human being challenges commonsense opinions as well as prevailing philosophical and religious ideas. We are probably inclined to think of ourselves as distinct individuals, separate from other beings. Of course, we know that we have relationships to people and objects in the world, but nevertheless we see ourselves as autonomous – a view that is reflected in the widely held belief that we have free will. This popular understanding of the human condition is reflected in Cartesian philosophy, which conceives human beings as substances. In fact, Descartes thought that human beings are composed of two distinct substances: a mind and a body.

For Spinoza, however, human beings are not substances, but finite modes. (Last week, I suggested that a mode is something like a wave on the sea, being a dependent, transient part of a far greater whole.) This mode has two aspects, or attributes: extension, or physical embodiment; and thought, or thinking. Crucially, Spinoza denies that there can be any causal or logical relationships across these attributes. Instead, he argues that each attribute constitutes a causal and logical order that fully expresses reality in a certain way. So a human body is a physical organism which expresses the essence of that particular being under the attribute of extension. And a human mind is an intellectual whole that expresses this same essence under the attribute of thinking.

But this is not to suggest that the mind and the body are separate entities – for this would be to fall back into the Cartesian view that they are substances. On the contrary, says Spinoza, mind and body are two aspects of a single reality, like two sides of a coin. "The mind and the body are one and the same individual, which is conceived now under the attribute of thought, now under the attribute of extension," he writes in book two of the Ethics. And for this reason, there is an exact correspondence between them: "The order and connection of ideas is the same as the order and connection of things." In fact, each human mind involves awareness of a human body.

This way of thinking has some important consequences. One of the most obvious is that it undermines dualistic and reductionist accounts of the human being. Descartes's mind-body dualism involves the claim that we are, in essence, thinking beings – that the intellectual should be privileged above the physical, reason above the body. Conversely, modern science often regards the human being as primarily a physical entity, and attempts to reduce mental activity to physical processes. In Spinoza's view, however, it is incoherent to attempt to explain the mental in terms of the physical, or vice versa, because thinking and extension are distinct explanatory orders. They offer two alternative ways of describing and understanding our world, and ourselves, which are equally complete and equally legitimate.

Another important consequence of Spinoza's account of the human being is his denial of free will. If we are modes rather than substances, then we cannot be self-determining. The human body is part of a network of physical causality, and the human mind is part of a network of logical relations. In other words, both our bodily movements and our thinking are constrained by certain laws. Just as we cannot defeat the law of gravity, so we cannot think that 2 + 2 = 5, or that a triangle has four sides.

Spinoza's criticism of the popular belief in free will is rather similar to his analysis of belief in miracles in the Theologico-Political Treatise, which we looked at a few weeks ago. There, we may recall, he argued that people regard events as miraculous and supernatural when they are ignorant of their natural causes. Likewise, human actions are attributed to free will when their causes are unknown: "That human freedom which all men boast of possessing … consists solely in this, that men are conscious of their desire and unaware of the causes by which they are determined." For Spinoza, belief in free will is just as much a sign of ignorance and superstition as belief in miracles worked by divine intervention.