Sunday, 19 May 2013

Daniel Dennett's seven tools for thinking



Cognitive scientist and philosopher Daniel Dennett is one of America's foremost thinkers. In this extract from his new book, he reveals some of the lessons life has taught him
Daniel Dennett: 'Often the word "surely" is as good as a blinking light locating a weak point in the argument.' Photograph: Peter Yang/August

1 USE YOUR MISTAKES

We have all heard the forlorn refrain: "Well, it seemed like a good idea at the time!" This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say: "Well, it seemed like a good idea at the time!" is standing on the threshold of brilliance. We human beings pride ourselves on our intelligence, and one of its hallmarks is that we can remember our previous thinking and reflect on it – on how it seemed, on why it was tempting in the first place and then about what went wrong.
From Intuition Pumps and Other Tools for Thinking by Daniel C Dennett
I know of no evidence to suggest that any other species on the planet can actually think this thought. If they could, they would be almost as smart as we are. So when you make a mistake, you should learn to take a deep breath, grit your teeth and then examine your own recollections of the mistake as ruthlessly and as dispassionately as you can manage. It's not easy. The natural human reaction to making a mistake is embarrassment and anger (we are never angrier than when we are angry at ourselves) and you have to work hard to overcome these emotional reactions.
Try to acquire the weird practice of savouring your mistakes, delighting in uncovering the strange quirks that led you astray. Then, once you have sucked out all the goodness to be gained from having made them, you can cheerfully set them behind you and go on to the next big opportunity. But that is not enough: you should actively seek out opportunities to make grand mistakes, just so you can then recover from them.
In science, you make your mistakes in public. You show them off so that everybody can learn from them. This way, you get the benefit of everybody else's experience, and not just your own idiosyncratic path through the space of mistakes. (Physicist Wolfgang Pauli famously expressed his contempt for the work of a colleague as "not even wrong". A clear falsehood shared with critics is better than vague mush.)
This, by the way, is another reason why we humans are so much smarter than every other species. It is not so much that our brains are bigger or more powerful, or even that we have the knack of reflecting on our own past errors, but that we share the benefits our individual brains have won by their individual histories of trial and error.
I am amazed at how many really smart people don't understand that you can make big mistakes in public and emerge none the worse for it. I know distinguished researchers who will go to preposterous lengths to avoid having to acknowledge that they were wrong about something. Actually, people love it when somebody admits to making a mistake. All kinds of people love pointing out mistakes.
Generous-spirited people appreciate your giving them the opportunity to help, and acknowledging it when they succeed in helping you; mean-spirited people enjoy showing you up. Let them! Either way we all win.

2 RESPECT YOUR OPPONENT

Just how charitable are you supposed to be when criticising the views of an opponent? If there are obvious contradictions in the opponent's case, then you should point them out, forcefully. If there are somewhat hidden contradictions, you should carefully expose them to view – and then dump on them. But the search for hidden contradictions often crosses the line into nitpicking, sea-lawyering and outright parody. The thrill of the chase and the conviction that your opponent has to be harbouring a confusion somewhere encourages uncharitable interpretation, which gives you an easy target to attack.
But such easy targets are typically irrelevant to the real issues at stake and simply waste everybody's time and patience, even if they give amusement to your supporters. The best antidote I know for this tendency to caricature one's opponent is a list of rules promulgated many years ago by social psychologist and game theorist Anatol Rapoport.
How to compose a successful critical commentary:
1. Attempt to re-express your target's position so clearly, vividly and fairly that your target says: "Thanks, I wish I'd thought of putting it that way."
2. List any points of agreement (especially if they are not matters of general or widespread agreement).
3. Mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
One immediate effect of following these rules is that your targets will be a receptive audience for your criticism: you have already shown that you understand their positions as well as they do, and have demonstrated good judgment (you agree with them on some important matters and have even been persuaded by something they said). Following Rapoport's rules is always, for me, something of a struggle…

3 THE "SURELY" KLAXON

When you're reading or skimming argumentative essays, especially by philosophers, here is a quick trick that may save you much time and effort, especially in this age of simple searching by computer: look for "surely" in the document and check each occurrence. Not always, not even most of the time, but often the word "surely" is as good as a blinking light locating a weak point in the argument.
Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn't be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and – because life is short – has decided in favour of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined "truism" that isn't true!
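Dennett's search trick is easy to automate. Here is a minimal sketch in Python; the function name and interface are our own invention, purely for illustration:

```python
import re

def surely_klaxon(text, keyword="surely", context=40):
    """Flag each occurrence of a tell-tale word, with surrounding context.

    An illustration of the search trick described above: scan a document
    for 'surely' and show each hit so the reader can inspect it.
    """
    hits = []
    for m in re.finditer(re.escape(keyword), text, re.IGNORECASE):
        start = max(0, m.start() - context)
        end = min(len(text), m.end() + context)
        hits.append(text[start:end].replace("\n", " "))
    return hits

essay = "Surely no reader will dispute this premise. The rest follows."
for hit in surely_klaxon(essay):
    print("CHECK:", hit)
```

Each flagged passage is exactly the "edge of what the author is actually sure about" that the klaxon is meant to catch.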

4 ANSWER RHETORICAL QUESTIONS

Just as you should keep a sharp eye out for "surely", you should develop a sensitivity for rhetorical questions in any argument or polemic. Why? Because, like the use of "surely", they represent an author's eagerness to take a short cut. A rhetorical question has a question mark at the end, but it is not meant to be answered. That is, the author doesn't bother waiting for you to answer since the answer is so obvious that you'd be embarrassed to say it!
Here is a good habit to develop: whenever you see a rhetorical question, try – silently, to yourself – to give it an unobvious answer. If you find a good one, surprise your interlocutor by answering the question. I remember a Peanuts cartoon from years ago that nicely illustrates the tactic. Charlie Brown had just asked, rhetorically: "Who's to say what is right and wrong here?" and Lucy responded, in the next panel: "I will."

5 EMPLOY OCCAM'S RAZOR

Attributed to William of Ockham (or Occam), a 14th-century English logician and philosopher, this thinking tool is actually a much older rule of thumb. A Latin name for it is lex parsimoniae, the law of parsimony. It is usually put into English as the maxim "Do not multiply entities beyond necessity".
The idea is straightforward: don't concoct a complicated, extravagant theory if you've got a simpler one (containing fewer ingredients, fewer entities) that handles the phenomenon just as well. If exposure to extremely cold air can account for all the symptoms of frostbite, don't postulate unobserved "snow germs" or "Arctic microbes". Kepler's laws explain the orbits of the planets; we have no need to hypothesise pilots guiding the planets from control panels hidden under the surface. This much is uncontroversial, but extensions of the principle have not always met with agreement.
One of the least impressive attempts to apply Occam's razor to a gnarly problem is the claim (and provoked counterclaims) that postulating a God as creator of the universe is simpler, more parsimonious, than the alternatives. How could postulating something supernatural and incomprehensible be parsimonious? It strikes me as the height of extravagance, but perhaps there are clever ways of rebutting that suggestion.
I don't want to argue about it; Occam's razor is, after all, just a rule of thumb, a frequently useful suggestion. The prospect of turning it into a metaphysical principle or fundamental requirement of rationality that could bear the weight of proving or disproving the existence of God in one fell swoop is simply ludicrous. It would be like trying to disprove a theorem of quantum mechanics by showing that it contradicted the axiom "Don't put all your eggs in one basket".

6 DON'T WASTE YOUR TIME ON RUBBISH

Sturgeon's law is usually expressed thus: 90% of everything is crap. So 90% of experiments in molecular biology, 90% of poetry, 90% of philosophy books, 90% of peer-reviewed articles in mathematics – and so forth – is crap. Is that true? Well, maybe it's an exaggeration, but let's agree that there is a lot of mediocre work done in every field. (Some curmudgeons say it's more like 99%, but let's not get into that game.)
A good moral to draw from this observation is that when you want to criticise a field, a genre, a discipline, an art form … don't waste your time and ours hooting at the crap! Go after the good stuff or leave it alone. This advice is often ignored by ideologues intent on destroying the reputation of analytic philosophy, sociology, cultural anthropology, macroeconomics, plastic surgery, improvisational theatre, television sitcoms, philosophical theology, massage therapy, you name it.
Let's stipulate at the outset that there is a great deal of deplorable, second-rate stuff out there, of all sorts. Now, in order not to waste your time and try our patience, make sure you concentrate on the best stuff you can find, the flagship examples extolled by the leaders of the field, the prize-winning entries, not the dregs. Notice that this is closely related to Rapoport's rules: unless you are a comedian whose main purpose is to make people laugh at ludicrous buffoonery, spare us the caricature.

7 BEWARE OF DEEPITIES

A deepity (a term coined by the daughter of my late friend, computer scientist Joseph Weizenbaum) is a proposition that seems both important and true – and profound – but that achieves this effect by being ambiguous. On one reading, it is manifestly false, but it would be earth-shaking if it were true; on the other reading, it is true but trivial. The unwary listener picks up the glimmer of truth from the second reading, and the devastating importance from the first reading, and thinks, Wow! That's a deepity.
Here is an example (better sit down: this is heavy stuff): Love is just a word.
Oh wow! Cosmic. Mind-blowing, right? Wrong. On one reading, it is manifestly false. I'm not sure what love is – maybe an emotion or emotional attachment, maybe an interpersonal relationship, maybe the highest state a human mind can achieve – but we all know it isn't a word. You can't find love in the dictionary!
We can bring out the other reading by availing ourselves of a convention philosophers care mightily about: when we talk about a word, we put it in quotation marks, thus: "love" is just a word. "Cheeseburger" is just a word. "Word" is just a word. But this isn't fair, you say. Whoever said that love is just a word meant something else, surely. No doubt, but they didn't say it.
Not all deepities are quite so easily analysed. Richard Dawkins recently alerted me to a fine deepity by Rowan Williams, the then archbishop of Canterbury, who described his faith as "a silent waiting on the truth, pure sitting and breathing in the presence of the question mark".
I leave the analysis of this as an exercise for you.

Exit Europe from the left



Most Britons dislike the European Union. If trade unions don't articulate their concerns, the hard right will
A protest in Brussels. 'Millions of personal tragedies of lost homes, jobs, pensions and services are testament to the sick joke of 'social Europe'.' Photograph: Thierry Roge/Reuters
For years the electorate has overwhelmingly opposed Britain's membership of the European Union – particularly those who work for a living. Yet while movements in other countries that are critical of the EU are led by the left, in Britain they are dominated by the hard right, and working-class concerns are largely ignored.
This is particularly strange when you consider that the EU is largely a Tory neoliberal project. Not only did the Conservative prime minister Edward Heath take Britain into the common market in 1973, but Margaret Thatcher campaigned to stay in it in the 1975 referendum, and was one of the architects of the Single European Act – which gave us the single market, EU militarisation and eventually the struggling euro.
After the Tories dumped the born-again Eurosceptic Thatcher, John Major rammed through the Maastricht treaty and embarked on the disastrous privatisation of our railways using EU directives – a model now set to be rolled out across the continent.
Even now, the majority of David Cameron's Tories will campaign for staying in the EU if we do get the referendum the electorate so clearly wants. And most of the left seems to be lining up alongside them. My union stood in the last European elections under the No2EU-Yes to Democracy coalition, which set out to give working people a voice that had been denied them by the political establishment. We also set out to challenge the rancid politics of the racist British National party, yet the BNP received far more media coverage. Today it is Ukip that is enjoying the media spotlight. Its rightwing Thatcherite rhetoric and assorted cranky hobby horses are a gift to a political establishment that seeks to project a narrow agenda of continued EU membership.
But the reality is that Ukip supports the EU agenda of privatisation, cuts and austerity. Nigel Farage's only problem with this government's assault on our public services is that it doesn't go far enough. Ukip opposes the renationalisation of our rail network as much as any Eurocrat. Yet Ukip has filled the political vacuum created when the Labour party and parts of the trade union movement adopted the position of EU cheerleaders, believing in the myth of "social Europe".
Social EU legislation, which supposedly leads to better working conditions, has not saved one job and is riddled with opt-outs for employers to largely ignore any perceived benefits they may bring to workers. But it is making zero-hour contracts and agency-working the norm while undermining collective bargaining and full-time, secure employment. Meanwhile, 10,000 manufacturing jobs in the East Midlands still hang in the balance because EU law demanded that the crucial Thameslink contract go to Siemens in Germany rather than Bombardier in Derby.
Today, unemployment in the eurozone is at a record 12%. In the countries hit hardest by the "troika" of banks and bureaucrats, youth unemployment tops 60% and the millions of personal tragedies of lost homes, jobs, pensions and services are testament to the sick joke of "social Europe".
The raft of EU treaties are, as Tony Benn once said, nothing more than a cast-iron manifesto for capitalism that demands the chaos of the complete free movement of capital, goods, services and labour. It is clear that Greece, Spain, Cyprus and the rest need investment, not more austerity and savage cuts to essential public services, but, locked in the eurozone, the only option left is exactly that.
What's more, the EU sees the current crisis as an opportunity to speed up its privatisation drive. Mass unemployment and economic decline is a price worth paying in order to impose structural adjustment in favour of monopoly capitalism.
In Britain and across the EU, healthcare, education and every other public service face the same business model of privatisation and fragmentation. Indeed, the clause in the Health and Social Care Act demanding privatisation of every aspect of our NHS was defended by the Lib Dems on the basis of EU competition law.
But governments do not have to carry out such EU policies: they could carry out measures on behalf of those who elect them. That means having democratic control over capital flows, our borders and the future of our economy for the benefit of everyone.
The only rational course to take is to leave the EU so that elected governments regain the democratic power to decide matters on behalf of the people they serve.

It's time for global companies to pay a Global Profit Tax


Ben Chu

The cascade of revelations in recent months showing multinational companies doing a huge amount of business here and yet paying virtually no corporation tax has provoked widespread public demands for something to be done. But people tend to be rather hazier on what that "something" should be.

To define a solution we first need to grasp the nature of the problem: a global tax loophole. In our age of liberalised cross-border trade and free capital flows, multinational companies find themselves with a considerable level of freedom to choose where they pay tax on profits.

With some sophisticated planning from their accountants, many of these corporations (especially those whose commercial value is derived from a piece of intangible intellectual property such as a search engine algorithm or a drug patent) are able to register their profits in tax havens.

Here's how it works. A multinational typically registers its intellectual property in a subsidiary company based somewhere like Bermuda or the Cayman Islands. This subsidiary then charges another subsidiary operating in a big customer market, such as Britain, a massive fee for the right to use that intellectual property. So any trading surplus resulting from activities in the large market is offset by the cost of the fee. And then the profits accumulate in the tax haven.
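The arithmetic of the scheme is simple enough to sketch. Every figure below is invented for illustration (23% was the UK main corporation tax rate from April 2013; nothing here reflects any real company's accounts):

```python
# A stylised model of the profit-shifting scheme described above.
# Illustrative numbers only, not any real company's accounts.
uk_trading_surplus = 1_000_000_000   # surplus from sales in the big market
licence_fee = 990_000_000            # fee paid to the tax-haven subsidiary
uk_corporation_tax_rate = 0.23       # UK main rate from April 2013
haven_tax_rate = 0.0                 # Bermuda and the Caymans levy no corporation tax

# The licence fee offsets almost the whole trading surplus onshore...
uk_taxable_profit = uk_trading_surplus - licence_fee
uk_tax_paid = uk_taxable_profit * uk_corporation_tax_rate

# ...while the profits accumulate, untaxed, in the haven.
haven_profit = licence_fee

print(f"UK taxable profit: £{uk_taxable_profit:,.0f}")
print(f"UK tax paid:       £{uk_tax_paid:,.0f}")
print(f"Profit in haven:   £{haven_profit:,.0f}")
```

A £1bn trading surplus thus yields UK tax on only £10m, with £990m sitting offshore.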

National governments could and should try to put a stop to this egregious "profit shifting" on their own. But a unilateral approach is plainly second best.

The natural solution is to secure an agreement by all the world's governments to tax the profits of multinational firms collectively and to divide up the revenues fairly between them. This division could be based on the amount of business done by the multinational in their various territories as revealed by their turnover and number of employees.
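A toy version of that division might look as follows. The half-and-half weighting of turnover against headcount, and every figure in the example, are assumptions made up for illustration; real apportionment formulas vary:

```python
# A toy apportionment ("unitary taxation") calculation. The weighting
# and all figures are invented for illustration only.
def apportion(global_profit, tax_rates, turnover, employees,
              turnover_weight=0.5):
    """Split a multinational's global profit across territories in
    proportion to a weighted blend of its turnover and headcount there,
    then apply each territory's own tax rate to its slice."""
    total_turnover = sum(turnover.values())
    total_employees = sum(employees.values())
    taxes = {}
    for country in tax_rates:
        share = (turnover_weight * turnover[country] / total_turnover
                 + (1 - turnover_weight) * employees[country] / total_employees)
        taxes[country] = global_profit * share * tax_rates[country]
    return taxes

taxes = apportion(
    global_profit=1_000,                       # £m of global profit
    tax_rates={"UK": 0.23, "Ireland": 0.125},
    turnover={"UK": 800, "Ireland": 200},      # £m of sales per territory
    employees={"UK": 3_000, "Ireland": 1_000},
)
print(taxes)
```

Because the profit is divided by where business is actually done, routing it through a haven with no sales and no staff would attract a share of zero.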

It sounds complicated, but American states have long operated a system designed along these lines known as "apportionment". Another name used is "unitary taxation". Those names are a bit of a turn-off to the layperson. What's required is a reform banner that the general public can easily understand. I suggest: "Global Profit Tax". After all, doesn't it make sense that global companies should be compelled to pay global taxes?

Saturday, 18 May 2013

People are told EU migrants steal jobs. In truth, bosses want cheap labour


 

The Conservatives are determined to be seen as the anti-Europe party, but an EU referendum that took Britain out of the union would be a disaster for the party
Island nation … leaving the EU won’t isolate the UK. It will isolate England. Photograph: David Parry / PA Wire
Having a referendum on membership of the EU is a bit like having a referendum on membership of the moon's gravitational pull. You can vote to leave it all you like, but it will still be there, exerting the natural influence of its mass. Even China has EU regulations on its statute book, because it needs them to trade with Europe. The best that can be said of a possible withdrawal is that at last Westminster will have only itself to blame. Oh, and of course there will be an end to the regular convulsions of drama over the possibility of having a referendum on membership of the EU. Which admittedly does sound nice.
The poor old Tories – Europe drives them so bonkers. They're like cartoon characters whose eyes turn into pound-signs, except their pupils are shaped like crosses, for votes. The Conservatives are keen to be seen as the anti-Europe party. But Ukip has stolen their thunder. This is a disaster for the Tories for two reasons.
First, it destroys a carefully cultivated Tory image, whereby they can make tough-looking gestures to play to the grassroots.
Second, it destroys the second most important electoral advantage the Conservatives have left (the most important being the first-past-the-post voting system). The coalition has weakened the left's long-standing electoral problem, which was that the leftish vote was split while the rightish vote was a one-stop shop. Ukip has provided a protest vote for disenchanted Tories, just as – up until the moment when David Cameron promised Nick Clegg a rose garden – the Lib Dems provided an alternative to Labour. Now, they are more likely, if anything, to provide another alternative to the Conservatives. Oh, the irony.
Beyond party politics, however, there is not much logic in Conservative Europhobia. In fact, it runs contrary to many of the Conservatives' other long-cherished beliefs. How can people who were so against devolution for the UK's member states be so determinedly in favour of devolving away from Brussels? That's an easy one, isn't it? Devolution within the UK takes power away from Westminster, while leaving the EU will, the poor darlings imagine, give it more. But Scotland will want to stay in Europe, as Nigel Farage's short shrift in Edinburgh this week demonstrated. Wales will want to stay in Europe. Northern Ireland will want to stay in Europe. Withdrawal from the EU won't isolate the UK. It will isolate England, making lukewarm support for full independence, especially in Scotland, a great deal more attractive. The Conservatives, despite their interminable resentment of Europe, really haven't thought this through.
More intractable is the Conservatives' supposed commitment to globalisation and free trade, and supposed horror of protectionism and restrictive practices. Europe, for all its reputation as some kind of dastardly machine for the promotion of crypto-communism, is really just a hothouse environment in which the promised fruits of neoliberalism are forced into ripening more quickly. Whether or not it was right to huddle under the glass with so much of the rest of the continent (and at the risk of labouring a metaphor to death), the process of hardening off out in the global garden is likely to kill a few tubers.
Not Conservative tubers, though. The deepest hypocrisy of the right is seen in its attitude to immigration. The Conservatives are keen to promote themselves as the anti-immigration party, and shake their heads in disgust over the mass immigration that took place under Blair and Brown. However, Labour policy on immigration dates back to the "prawn cocktail offensive", under which New Labour persuaded the City of London that it would look after its interests. Look after them, Labour did, not only turning a blind eye to all kinds of tax dodges, but also obliging the Confederation of British Industry and the Institute of Directors, both of which are institutions stuffed with Tories whose political views took a poor second place to their passion for keeping wages down. Were the Tories to manage to get a referendum on Europe, win it, and put a curb on EU immigration, then, yes, there would be British jobs for British workers, probably alongside a nice non-EU regulation setting the minimum wage at the same level as universal benefit in order to make employing someone pay. People are told that immigrants stole their jobs. In truth, it was employers who wanted a ready supply of workers unused to the living conditions that it took the second world war for the ordinary people of Britain to achieve. The goal of neoliberal globalisation is supposedly a redistribution of wealth around the planet. It also, as the EU itself is discovering, redistributes poverty.
There can be no doubt that the EU is not an entirely successful experiment. It most definitely went too far, too fast. Certainly, there can be few people in Britain who are not now relieved to be outside the eurozone. But even within Britain one can see only too well the trouble with having disparate parts of the country, with disparate economic needs, all dancing to the same economic tune.
The truth is that what's needed is for devolved and local government to be strengthened, and given more fiscal powers. But although the Conservatives like to proclaim their hatred of centralised and distant government, they are not too keen on that. Again, of course, it's all about power. If local government were to become more powerful, then Westminster would find itself either the government of the home counties or simply a mini-EU, passing legislation that allowed the regions of Britain to trade fairly and equally; legislation that would no doubt look uncannily similar to EU legislation. Because it's not the EU that is an extra layer of government that no one really needs – it's Westminster. The European parliament is an institution with a democratic deficit precisely because it exists only to enact what the heads of member states have agreed. Local government in Britain is similarly hampered by the directives of Westminster. Across Europe, national governments are struggling against the advent of their own irrelevance, desperate to stop the leak of any more power either above or below, even as countries fall to government by technocrat. The nation state itself is in crisis, and the denizens of Westminster are the people least likely to see or accept that.
A Britain outside Europe would be governed by multinationals, who would be attracted by low taxes and a population compelled to work, however disabled or ill or elderly they may be. Of course, the Conservatives are keen on a referendum. But they fail to understand that if they got their way, it would be a pyrrhic victory. All those who believe that mass immigration was some sort of politically correct leftwing conspiracy would soon get wise to the fact that they'd been had. In the end, if the Conservatives got their wish, and took Britain out of Europe, they'd be finished.

How the Case for Austerity Has Crumbled


 
 
The Alchemists: Three Central Bankers and a World on Fire
by Neil Irwin
Penguin, 430 pp., $29.95

Austerity: The History of a Dangerous Idea
by Mark Blyth
Oxford University Press, 288 pp., $24.95

The Great Deformation: The Corruption of Capitalism in America
by David A. Stockman
PublicAffairs, 742 pp., $35.00
President Barack Obama and Representative Paul Ryan at a bipartisan meeting on health insurance reform, Washington, D.C., February 2010
In normal times, an arithmetic mistake in an economics paper would be a complete nonevent as far as the wider world was concerned. But in April 2013, the discovery of such a mistake—actually, a coding error in a spreadsheet, coupled with several other flaws in the analysis—not only became the talk of the economics profession, but made headlines. Looking back, we might even conclude that it changed the course of policy.
Why? Because the paper in question, “Growth in a Time of Debt,” by the Harvard economists Carmen Reinhart and Kenneth Rogoff, had acquired touchstone status in the debate over economic policy. Ever since the paper was first circulated, austerians—advocates of fiscal austerity, of immediate sharp cuts in government spending—had cited its alleged findings to defend their position and attack their critics. Again and again, suggestions that, as John Maynard Keynes once argued, “the boom, not the slump, is the right time for austerity”—that cuts should wait until economies were stronger—were met with declarations that Reinhart and Rogoff had shown that waiting would be disastrous, that economies fall off a cliff once government debt exceeds 90 percent of GDP.
Indeed, Reinhart-Rogoff may have had more immediate influence on public debate than any previous paper in the history of economics. The 90 percent claim was cited as the decisive argument for austerity by figures ranging from Paul Ryan, the former vice-presidential candidate who chairs the House budget committee, to Olli Rehn, the top economic official at the European Commission, to the editorial board of The Washington Post. So the revelation that the supposed 90 percent threshold was an artifact of programming mistakes, data omissions, and peculiar statistical techniques suddenly made a remarkable number of prominent people look foolish.
The real mystery, however, was why Reinhart-Rogoff was ever taken seriously, let alone canonized, in the first place. Right from the beginning, critics raised strong concerns about the paper’s methodology and conclusions, concerns that should have been enough to give everyone pause. Moreover, Reinhart-Rogoff was actually the second example of a paper seized on as decisive evidence in favor of austerity economics, only to fall apart on careful scrutiny. Much the same thing happened, albeit less spectacularly, after austerians became infatuated with a paper by Alberto Alesina and Silvia Ardagna purporting to show that slashing government spending would have little adverse impact on economic growth and might even be expansionary. Surely that experience should have inspired some caution.
So why wasn’t there more caution? The answer, as documented by some of the books reviewed here and unintentionally illustrated by others, lies in both politics and psychology: the case for austerity was and is one that many powerful people want to believe, leading them to seize on anything that looks like a justification. I’ll talk about that will to believe later in this article. First, however, it’s useful to trace the recent history of austerity both as a doctrine and as a policy experiment.

1.

In the beginning was the bubble. There have been many, many books about the excesses of the boom years—in fact, too many books. For as we’ll see, the urge to dwell on the lurid details of the boom, rather than trying to understand the dynamics of the slump, is a recurrent problem for economics and economic policy. For now, suffice it to say that by the beginning of 2008 both America and Europe were poised for a fall. They had become excessively dependent on an overheated housing market, their households were too deep in debt, their financial sectors were undercapitalized and overextended.
All that was needed to collapse these houses of cards was some kind of adverse shock, and in the end the implosion of US subprime-based securities did the deed. By the fall of 2008 the housing bubbles on both sides of the Atlantic had burst, and the whole North Atlantic economy was caught up in “deleveraging,” a process in which many debtors try—or are forced—to pay down their debts at the same time.
Why is this a problem? Because of interdependence: your spending is my income, and my spending is your income. If both of us try to reduce our debt by slashing spending, both of our incomes plunge—and plunging incomes can actually make our indebtedness worse even as they also produce mass unemployment.
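That feedback loop can be sketched as a toy simulation. The parameters are invented and the model is deliberately crude (one representative household standing in for everyone, a fixed paydown target), but it shows incomes falling period after period once everyone cuts spending at once:

```python
# A deliberately crude sketch of deleveraging: because your spending is
# my income, if every household spends 90% of last period's income and
# diverts a fixed 10 units to paying down debt, aggregate income falls
# period after period. All parameters are invented for illustration.
def deleverage(periods=5, income=100.0, spend_rate=0.9, debt_paydown=10.0):
    incomes = [income]
    for _ in range(periods):
        spending = spend_rate * incomes[-1] - debt_paydown
        incomes.append(max(spending, 0.0))  # income can't go below zero
    return incomes

print(deleverage())  # each period's income is lower than the last
```

Each household's attempt to save makes the next period's income smaller, so the ratio of debt to income can worsen even as spending is cut.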
Students of economic history watched the process unfolding in 2008 and 2009 with a cold shiver of recognition, because it was very obviously the same kind of process that brought on the Great Depression. Indeed, early in 2009 the economic historians Barry Eichengreen and Kevin O’Rourke produced shocking charts showing that the first year of the 2008–2009 slump in trade and industrial production was fully comparable to the first year of the great global slump from 1929 to 1933.
So was a second Great Depression about to unfold? The good news was that we had, or thought we had, several big advantages over our grandfathers, helping to limit the damage. Some of these advantages were, you might say, structural, built into the way modern economies operate, and requiring no special action on the part of policymakers. Others were intellectual: surely we had learned something since the 1930s, and would not repeat our grandfathers’ policy mistakes.
On the structural side, probably the biggest advantage over the 1930s was the way taxes and social insurance programs—both much bigger than they were in 1929—acted as “automatic stabilizers.” Wages might fall, but overall income didn’t fall in proportion, both because tax collections plunged and because government checks continued to flow for Social Security, Medicare, unemployment benefits, and more. In effect, the existence of the modern welfare state put a floor on total spending, and therefore prevented the economy’s downward spiral from going too far.
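A stylized calculation (all rates hypothetical) shows how that floor works: tax collections scale down with wages while transfer checks keep flowing, so disposable income falls by less than wages do.

```python
# Stylized automatic stabilizers (all rates and amounts hypothetical).
def disposable_income(wages, tax_rate=0.3, transfers=10.0):
    # Taxes fall in proportion to wages; transfers keep flowing regardless.
    return wages * (1 - tax_rate) + transfers

before = disposable_income(100.0)   # 80.0
after = disposable_income(90.0)     # wages fall by 10 percent
print((before - after) / before)    # disposable income falls only 8.75 percent
```

In an economy without the tax-and-transfer system, a 10 percent wage drop would mean a 10 percent income drop; here the fall is cushioned without any policymaker lifting a finger.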
On the intellectual side, modern policymakers knew the history of the Great Depression as a cautionary tale; some, including Ben Bernanke, had actually been major Depression scholars in their previous lives. They had learned from Milton Friedman the folly of letting bank runs collapse the financial system and the desirability of flooding the economy with money in times of panic. They had learned from John Maynard Keynes that under depression conditions government spending can be an effective way to create jobs. They had learned from FDR’s disastrous turn toward austerity in 1937 that abandoning monetary and fiscal stimulus too soon can be a very big mistake.
As a result, where the onset of the Great Depression was accompanied by policies that intensified the slump—interest rate hikes in an attempt to hold on to gold reserves, spending cuts in an attempt to balance budgets—2008 and 2009 were characterized by expansionary monetary and fiscal policies, especially in the United States, where the Federal Reserve not only slashed interest rates, but stepped into the markets to buy everything from commercial paper to long-term government debt, while the Obama administration pushed through an $800 billion program of tax cuts and spending increases. European actions were less dramatic—but on the other hand, Europe’s stronger welfare states arguably reduced the need for deliberate stimulus.
Now, some economists (myself included) warned from the beginning that these monetary and fiscal actions, although welcome, were too small given the severity of the economic shock. Indeed, by the end of 2009 it was clear that although the situation had stabilized, the economic crisis was deeper than policymakers had acknowledged, and likely to prove more persistent than they had imagined. So one might have expected a second round of stimulus to deal with the economic shortfall.
What actually happened, however, was a sudden reversal.

2.

Neil Irwin’s The Alchemists gives us a time and a place at which the major advanced countries abruptly pivoted from stimulus to austerity. The time was early February 2010; the place, somewhat bizarrely, was the remote Canadian Arctic settlement of Iqaluit, where the Group of Seven finance ministers held one of their regularly scheduled summits. Sometimes (often) such summits are little more than ceremonial occasions, and there was plenty of ceremony at this one too, including raw seal meat served at the last dinner (the foreign visitors all declined). But this time something substantive happened. “In the isolation of the Canadian wilderness,” Irwin writes, “the leaders of the world economy collectively agreed that their great challenge had shifted. The economy seemed to be healing; it was time for them to turn their attention away from boosting growth. No more stimulus.”
[Figure 1: real government spending in this crisis compared with previous recessions, from the IMF World Economic Outlook]

How decisive was the turn in policy? Figure 1, which is taken from the IMF’s most recent World Economic Outlook, shows how real government spending behaved in this crisis compared with previous recessions; in the figure, year zero is the year before global recession (2007 in the current slump), and spending is compared with its level in that base year. What you see is that the widespread belief that we are experiencing runaway government spending is false—on the contrary, after a brief surge in 2009, government spending began falling in both Europe and the United States, and is now well below its normal trend. The turn to austerity was very real, and quite large.
On the face of it, this was a very strange turn for policy to take. Standard textbook economics says that slashing government spending reduces overall demand, which leads in turn to reduced output and employment. This may be a desirable thing if the economy is overheating and inflation is rising; alternatively, the adverse effects of reduced government spending can be offset. Central banks (the Fed, the European Central Bank, or their counterparts elsewhere) can cut interest rates, inducing more private spending. However, neither of these conditions applied in early 2010, or for that matter apply now. The major advanced economies were and are deeply depressed, with no hint of inflationary pressure. Meanwhile, short-term interest rates, which are more or less under the central bank’s control, are near zero, leaving little room for monetary policy to offset reduced government spending. So Economics 101 would seem to say that all the austerity we’ve seen is very premature, that it should wait until the economy is stronger.
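The textbook claim can be reduced to the simplest Keynesian-cross arithmetic (the marginal propensity to consume here is a hypothetical illustration): with the central bank stuck near zero and unable to offset the cut, each dollar of reduced government spending subtracts more than a dollar from output.

```python
# Simplest Keynesian-cross multiplier (hypothetical parameter value).
# Each dollar of income induces `mpc` dollars of further spending,
# so the total effect of a spending change is a geometric series.
def output_change(delta_g, mpc=0.5):
    multiplier = 1 / (1 - mpc)   # 1 + mpc + mpc**2 + ...
    return multiplier * delta_g

print(output_change(-10.0))  # a 10-unit spending cut lowers output by 20
```

This is a sketch of the Economics 101 logic, not an estimate of actual multipliers; the point is only the sign and the amplification, both of which austerity policy chose to ignore.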
The question, then, is why economic leaders were so ready to throw the textbook out the window.
One answer is that many of them never believed in that textbook stuff in the first place. The German political and intellectual establishment has never had much use for Keynesian economics; neither has much of the Republican Party in the United States. In the heat of an acute economic crisis—as in the autumn of 2008 and the winter of 2009—these dissenting voices could to some extent be shouted down; but once things had calmed they began pushing back hard.
A larger answer is the one we’ll get to later: the underlying political and psychological reasons why many influential figures hate the notions of deficit spending and easy money. Again, once the crisis became less acute, there was more room to indulge in these sentiments.
In addition to these underlying factors, however, were two more contingent aspects of the situation in early 2010: the new crisis in Greece, and the appearance of seemingly rigorous, high-quality economic research that supported the austerian position.
The Greek crisis came as a shock to almost everyone, not least the new Greek government that took office in October 2009. The incoming leadership knew it faced a budget deficit—but it was only after arriving that it learned that the previous government had been cooking the books, and that both the deficit and the accumulated stock of debt were far higher than anyone imagined. As the news sank in with investors, first Greece, then much of Europe, found itself in a new kind of crisis—one not of failing banks but of failing governments, unable to borrow on world markets.
It’s an ill wind that blows nobody good, and the Greek crisis was a godsend for anti-Keynesians. They had been warning about the dangers of deficit spending; the Greek debacle seemed to show just how dangerous fiscal profligacy can be. To this day, anyone arguing against fiscal austerity, let alone suggesting that we need another round of stimulus, can expect to be attacked as someone who will turn America (or Britain, as the case may be) into another Greece.
If Greece provided the obvious real-world cautionary tale, Reinhart and Rogoff seemed to provide the math. Their paper seemed to show not just that debt hurts growth, but that there is a “threshold,” a sort of trigger point, when debt crosses 90 percent of GDP. Go beyond that point, their numbers suggested, and economic growth stalls. Greece, of course, already had debt greater than the magic number. More to the point, major advanced countries, the United States included, were running large budget deficits and closing in on the threshold. Put Greece and Reinhart-Rogoff together, and there seemed to be a compelling case for a sharp, immediate turn toward austerity.
But wouldn’t such a turn toward austerity in an economy still depressed by private deleveraging have an immediate negative impact? Not to worry, said another remarkably influential academic paper, “Large Changes in Fiscal Policy: Taxes Versus Spending,” by Alberto Alesina and Silvia Ardagna.
One of the especially good things in Mark Blyth’s Austerity: The History of a Dangerous Idea is the way he traces the rise and fall of the idea of “expansionary austerity,” the proposition that cutting spending would actually lead to higher output. As he shows, this is very much a proposition associated with a group of Italian economists (whom he dubs “the Bocconi boys”) who made their case with a series of papers that grew more strident and less qualified over time, culminating in the 2009 analysis by Alesina and Ardagna.
In essence, Alesina and Ardagna made a full frontal assault on the Keynesian proposition that cutting spending in a weak economy produces further weakness. Like Reinhart and Rogoff, they marshaled historical evidence to make their case. According to Alesina and Ardagna, large spending cuts in advanced countries were, on average, followed by expansion rather than contraction. The reason, they suggested, was that decisive fiscal austerity created confidence in the private sector, and this increased confidence more than offset any direct drag from smaller government outlays.
As Mark Blyth documents, this idea spread like wildfire. Alesina and Ardagna made a special presentation in April 2010 to the Economic and Financial Affairs Council of the European Council of Ministers; the analysis quickly made its way into official pronouncements from the European Commission and the European Central Bank. Thus in June 2010 Jean-Claude Trichet, the then president of the ECB, dismissed concerns that austerity might hurt growth:
As regards the economy, the idea that austerity measures could trigger stagnation is incorrect…. In fact, in these circumstances, everything that helps to increase the confidence of households, firms and investors in the sustainability of public finances is good for the consolidation of growth and job creation. I firmly believe that in the current circumstances confidence-inspiring policies will foster and not hamper economic recovery, because confidence is the key factor today.
This was straight Alesina-Ardagna.
By the summer of 2010, then, a full-fledged austerity orthodoxy had taken shape, becoming dominant in European policy circles and influential on this side of the Atlantic. So how have things gone in the almost three years that have passed since?

3.

Clear evidence on the effects of economic policy is usually hard to come by. Governments generally change policies reluctantly, and it’s hard to distinguish the effects of the half-measures they undertake from all the other things going on in the world. The Obama stimulus, for example, was both temporary and fairly small compared with the size of the US economy, never amounting to much more than 2 percent of GDP, and it took effect in an economy whipsawed by the biggest financial crisis in three generations. How much of what took place in 2009–2011, good or bad, can be attributed to the stimulus? Nobody really knows.
The turn to austerity after 2010, however, was so drastic, particularly in European debtor nations, that the usual cautions lose most of their force. Greece imposed spending cuts and tax increases amounting to 15 percent of GDP; Ireland and Portugal rang in with around 6 percent; and unlike the half-hearted efforts at stimulus, these cuts were sustained and indeed intensified year after year. So how did austerity actually work?
[Figure 2: austerity measures as a share of GDP versus percentage change in real GDP, for a selection of European nations]
The answer is that the results were disastrous—just about as one would have predicted from textbook macroeconomics. Figure 2, for example, shows what happened to a selection of European nations (each represented by a diamond-shaped symbol). The horizontal axis shows austerity measures—spending cuts and tax increases—as a share of GDP, as estimated by the International Monetary Fund. The vertical axis shows the actual percentage change in real GDP. As you can see, the countries forced into severe austerity experienced very severe downturns, and the downturns were more or less proportional to the degree of austerity.
There have been some attempts to explain away these results, notably at the European Commission. But the IMF, looking hard at the data, has not only concluded that austerity has had major adverse economic effects, it has issued what amounts to a mea culpa for having underestimated these adverse effects.*
But is there any alternative to austerity? What about the risks of excessive debt?
In early 2010, with the Greek disaster fresh in everyone’s mind, the risks of excessive debt seemed obvious; those risks seemed even greater by 2011, as Ireland, Spain, Portugal, and Italy joined the ranks of nations having to pay large interest rate premiums. But a funny thing happened to other countries with high debt levels, including Japan, the United States, and Britain: despite large deficits and rapidly rising debt, their borrowing costs remained very low. The crucial difference, as the Belgian economist Paul De Grauwe pointed out, seemed to be whether countries had their own currencies, and borrowed in those currencies. Such countries can’t run out of money because they can print it if needed, and absent the risk of a cash squeeze, advanced nations are evidently able to carry quite high levels of debt without crisis.
Three years after the turn to austerity, then, both the hopes and the fears of the austerians appear to have been misplaced. Austerity did not lead to a surge in confidence; deficits did not lead to crisis. But wasn’t the austerity movement grounded in serious economic research? Actually, it turned out that it wasn’t—the research the austerians cited was deeply flawed.
First to go down was the notion of expansionary austerity. Even before the results of Europe’s austerity experiment were in, the Alesina-Ardagna paper was falling apart under scrutiny. Researchers at the Roosevelt Institute pointed out that none of the alleged examples of austerity leading to expansion of the economy actually took place in the midst of an economic slump; researchers at the IMF found that the Alesina-Ardagna measure of fiscal policy bore little relationship to actual policy changes. “By the middle of 2011,” Blyth writes, “empirical and theoretical support for expansionary austerity was slipping away.” Slowly, with little fanfare, the whole notion that austerity might actually boost economies slunk off the public stage.
Reinhart-Rogoff lasted longer, even though serious questions about their work were raised early on. As early as July 2010 Josh Bivens and John Irons of the Economic Policy Institute had identified both a clear mistake—a misinterpretation of US data immediately after World War II—and a severe conceptual problem. Reinhart and Rogoff, as they pointed out, offered no evidence that the correlation ran from high debt to low growth rather than the other way around, and other evidence suggested that the latter was more likely. But such criticisms had little impact; for austerians, one might say, Reinhart-Rogoff was a story too good to check.
So the revelations in April 2013 of the errors of Reinhart and Rogoff came as a shock. Despite their paper’s influence, Reinhart and Rogoff had not made their data widely available—and researchers working with seemingly comparable data hadn’t been able to reproduce their results. Finally, they made their spreadsheet available to Thomas Herndon, a graduate student at the University of Massachusetts, Amherst—and he found it very odd indeed. There was one actual coding error, although that made only a small contribution to their conclusions. More important, their data set failed to include the experience of several Allied nations—Canada, New Zealand, and Australia—that emerged from World War II with high debt but nonetheless posted solid growth. And they had used an odd weighting scheme in which each “episode” of high debt counted the same, whether it occurred during one year of bad growth or seventeen years of good growth.
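The weighting issue is easy to see in miniature (these numbers are illustrative, not Reinhart and Rogoff’s actual data): averaging by episode rather than by year lets a single bad year outvote seventeen good ones.

```python
# Why the weighting scheme matters (illustrative numbers only).
# Episode A: one year of high debt with growth of -5 percent.
# Episode B: seventeen years of high debt with growth of +3 percent each.
episodes = [
    [-5.0],          # one bad year
    [3.0] * 17,      # seventeen good years
]

# Episode-weighted: average within each episode, then across episodes.
episode_means = [sum(e) / len(e) for e in episodes]
episode_weighted = sum(episode_means) / len(episode_means)   # (-5 + 3) / 2 = -1.0

# Year-weighted: pool all country-years equally.
all_years = [g for e in episodes for g in e]
year_weighted = sum(all_years) / len(all_years)              # 46 / 18, about +2.6

print(episode_weighted, year_weighted)
```

Under episode weighting the high-debt record looks like contraction; under year weighting it looks like solid growth. Which scheme is right is a judgment call, but the choice clearly drives the headline number.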
Without these errors and oddities, there was still a negative correlation between debt and growth—but this could be, and probably was, mostly a matter of low growth leading to high debt, not the other way around. And the “threshold” at 90 percent vanished, undermining the scare stories being used to sell austerity.
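The reverse-causality point also rests on straightforward arithmetic (illustrative numbers again): with identical deficits year after year, a slow-growing economy ends up with a much higher debt-to-GDP ratio than a fast-growing one, so a correlation between high debt and low growth can arise even if debt itself does no harm.

```python
# Low growth mechanically produces high debt ratios (illustrative numbers).
# Each year the ratio gains the deficit and is diluted by GDP growth.
def debt_ratio_path(growth, deficit=0.03, ratio=0.60, years=20):
    for _ in range(years):
        ratio = (ratio + deficit) / (1 + growth)
    return ratio

print(debt_ratio_path(growth=0.04))  # growing economy: ratio stays moderate
print(debt_ratio_path(growth=0.00))  # stagnant economy: ratio climbs to 1.20
```

Two countries running exactly the same fiscal policy diverge sharply in measured indebtedness; a researcher who then regresses growth on debt will find the stagnant country both slow-growing and heavily indebted, with causation running the wrong way.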
Not surprisingly, Reinhart and Rogoff have tried to defend their work; but their responses have been weak at best, evasive at worst. Notably, they continue to write in a way that suggests, without stating outright, that debt at 90 percent of GDP is some kind of threshold at which bad things happen. In reality, even if one ignores the issue of causality—whether low growth causes high debt or the other way around—the apparent effects on growth of debt rising from, say, 85 to 95 percent of GDP are fairly small, and don’t justify the debt panic that has been such a powerful influence on policy.
At this point, then, austerity economics is in a very bad way. Its predictions have proved utterly wrong; its founding academic documents haven’t just lost their canonized status, they’ve become the objects of much ridicule. But as I’ve pointed out, none of this (except that Excel error) should have come as a surprise: basic macroeconomics should have told everyone to expect what did, in fact, happen, and the papers that have now fallen into disrepute were obviously flawed from the start.
This raises the obvious question: Why did austerity economics get such a powerful grip on elite opinion in the first place?

4.

Everyone loves a morality play. “For the wages of sin is death” is a much more satisfying message than “Shit happens.” We all want events to have meaning.
When applied to macroeconomics, this urge to find moral meaning creates in all of us a predisposition toward believing stories that attribute the pain of a slump to the excesses of the boom that precedes it—and, perhaps, also makes it natural to see the pain as necessary, part of an inevitable cleansing process. When Andrew Mellon told Herbert Hoover to let the Depression run its course, so as to “purge the rottenness” from the system, he was offering advice that, however bad it was as economics, resonated psychologically with many people (and still does).
By contrast, Keynesian economics rests fundamentally on the proposition that macroeconomics isn’t a morality play—that depressions are essentially a technical malfunction. As the Great Depression deepened, Keynes famously declared that “we have magneto trouble”—i.e., the economy’s troubles were like those of a car with a small but critical problem in its electrical system, and the job of the economist is to figure out how to repair that technical problem. Keynes’s masterwork, The General Theory of Employment, Interest and Money, is noteworthy—and revolutionary—for saying almost nothing about what happens in economic booms. Pre-Keynesian business cycle theorists loved to dwell on the lurid excesses that take place in good times, while having relatively little to say about exactly why these give rise to bad times or what you should do when they do. Keynes reversed this priority; almost all his focus was on how economies stay depressed, and what can be done to make them less depressed.
I’d argue that Keynes was overwhelmingly right in his approach, but there’s no question that it’s an approach many people find deeply unsatisfying as an emotional matter. And so we shouldn’t find it surprising that many popular interpretations of our current troubles return, whether the authors know it or not, to the instinctive, pre-Keynesian style of dwelling on the excesses of the boom rather than on the failures of the slump.
David Stockman’s The Great Deformation should be seen in this light. It’s an immensely long rant against excesses of various kinds, all of which, in Stockman’s vision, have culminated in our present crisis. History, to Stockman’s eyes, is a series of “sprees”: a “spree of unsustainable borrowing,” a “spree of interest rate repression,” a “spree of destructive financial engineering,” and, again and again, a “money-printing spree.” For in Stockman’s world, all economic evil stems from the original sin of leaving the gold standard. Any prosperity we may have thought we had since 1971, when Nixon abandoned the last link to gold, or maybe even since 1933, when FDR took us off gold for the first time, was an illusion doomed to end in tears. And of course, any policies aimed at alleviating the current slump will just make things worse.
In itself, Stockman’s book isn’t important. Aside from a few swipes at Republicans, it consists basically of standard goldbug bombast. But the attention the book has garnered, the ways it has struck a chord with many people, including even some liberals, suggest just how strong remains the urge to see economics as a morality play, three generations after Keynes tried to show us that it is nothing of the kind.
And powerful officials are by no means immune to that urge. In The Alchemists, Neil Irwin analyzes the motives of Jean-Claude Trichet, then president of the European Central Bank, in advocating harsh austerity policies:
Trichet embraced a view, especially common in Germany, that was rooted in a sort of moralism. Greece had spent too much and taken on too much debt. It must cut spending and reduce deficits. If it showed adequate courage and political resolve, markets would reward it with lower borrowing costs. He put a great deal of faith in the power of confidence….
Given this sort of predisposition, is it any wonder that Keynesian economics got thrown out the window, while Alesina-Ardagna and Reinhart-Rogoff were instantly canonized?
So is the austerian impulse all a matter of psychology? No, there’s also a fair bit of self-interest involved. As many observers have noted, the turn away from fiscal and monetary stimulus can be interpreted, if you like, as giving creditors priority over workers. Inflation and low interest rates are bad for creditors even if they promote job creation; slashing government deficits in the face of mass unemployment may deepen a depression, but it increases the certainty of bondholders that they’ll be repaid in full. I don’t think someone like Trichet was consciously, cynically serving class interests at the expense of overall welfare; but it certainly didn’t hurt that his sense of economic morality dovetailed so perfectly with the priorities of creditors.
It’s also worth noting that while economic policy since the financial crisis looks like a dismal failure by most measures, it hasn’t been so bad for the wealthy. Profits have recovered strongly even as unprecedented long-term unemployment persists; stock indices on both sides of the Atlantic have rebounded to pre-crisis highs even as median income languishes. It might be too much to say that those in the top 1 percent actually benefit from a continuing depression, but they certainly aren’t feeling much pain, and that probably has something to do with policymakers’ willingness to stay the austerity course.

5.

How could this happen? That’s the question many people were asking four years ago; it’s still the question many are asking today. But the “this” has changed.
Four years ago, the mystery was how such a terrible financial crisis could have taken place, with so little forewarning. The harsh lessons we had to learn involved the fragility of modern finance, the folly of trusting banks to regulate themselves, and the dangers of assuming that fancy financial arrangements have eliminated or even reduced the age-old problems of risk.
I would argue, however—self-serving as it may sound (I warned about the housing bubble, but had no inkling of how widespread a collapse would follow when it burst)—that the failure to anticipate the crisis was a relatively minor sin. Economies are complicated, ever-changing entities; it was understandable that few economists realized the extent to which short-term lending and securitization of assets such as subprime mortgages had recreated the old risks that deposit insurance and bank regulation were created to control.
I’d argue that what happened next—the way policymakers turned their back on practically everything economists had learned about how to deal with depressions, the way elite opinion seized on anything that could be used to justify austerity—was a much greater sin. The financial crisis of 2008 was a surprise, and happened very fast; but we’ve been stuck in a regime of slow growth and desperately high unemployment for years now. And during all that time policymakers have been ignoring the lessons of theory and history.
It’s a terrible story, mainly because of the immense suffering that has resulted from these policy errors. It’s also deeply worrying for those who like to believe that knowledge can make a positive difference in the world. To the extent that policymakers and elite opinion in general have made use of economic analysis at all, they have, as the saying goes, done so the way a drunkard uses a lamppost: for support, not illumination. Papers and economists who told the elite what it wanted to hear were celebrated, despite plenty of evidence that they were wrong; critics were ignored, no matter how often they got it right.
The Reinhart-Rogoff debacle has raised some hopes among the critics that logic and evidence are finally beginning to matter. But the truth is that it’s too soon to tell whether the grip of austerity economics on policy will relax significantly in the face of these revelations. For now, the broader message of the past few years remains just how little good comes from understanding.