Showing posts with label time. Show all posts

Monday 22 July 2013

Students need to make time for love, as well as for sex


The University of Pennsylvania, scene of much 'hooking up'. (Photo: Alamy)
From Monday's Daily Telegraph:
We are nervously awaiting the 18-year-old’s A-level results. Not as nervously, however, as if he had chosen, as many of his friends have done, to try for an American university. The latest news from elite campuses across the Atlantic struck fear in our hearts: everyone is “hooking up” over there, and that’s bad.
Hook-up culture is about sexual encounters that are rushed, unemotional and brief. It sounds depressingly familiar – “wham, bam, thank you ma’am”, we called it when I was an undergraduate – but what makes hook-up culture different is its raison d’être: students today are too busy for relationships. And, unlike the no-strings sex of yesteryear, women as well as men are choosing a hook-up over proper dating.
The bleak new thinking was exposed in a New York Times investigation last week. A journalist interviewed 60 girls at the top-drawer University of Pennsylvania – and their revelations shocked middle-class moms and dads across the country. Their children feel immense pressure to get A grades and fill their CVs with extra-curricular activities, such as running the university magazine, starring in the debating society, spending the summer volunteering as an intern on Capitol Hill. There is a shortage of good jobs out there, so competition is huge on campus. No one’s got time for romance.
Instead, they text (probably after a drink or two) hook-up buddies with whom they can engage in a decompression session of sexual activity. I won’t say “sexual pleasure” as the couple spends very little time on anything but the most perfunctory of chats: think commuters on the Tube rather than Romeo and Juliet. They invest so little in one another, one interviewee confessed she always went to her hook-up’s rooms, so she wouldn’t have to bother changing the sheets.
What a difference a recession makes. In my salad days, during the boom years of the 1980s, we could afford to be far more casual about job-seeking. University, I was taught, was not a means to an end but an end in itself: a place where I could finally learn everything I wanted to know about Bismarck, the Risorgimento and the Dreyfus Affair. Grants, scholarships and no-fee tuition meant that undergraduates, even from modest backgrounds, felt that for three years, money really was immaterial. I remember being shocked that friends were going to London for job interviews in the run-up to finals: surely the BBC and the Rothschild bank could wait?
The time of plenty meant that splurging felt acceptable – emotionally as well as with government grants. University was about romance as well as books; among the more precious undergraduates, in fact, the latter served to fuel the former. We bought scented candles, agonised over which LP to set the mood (Dire Straits’ Sultans of Swing was reckoned to be the most aphrodisiac song in my first year), and even considered sprinkling rose petals on pillows… we may have been naive, and naff, but at least we thought coupling meant exchanging ideas, memories and compliments, not merely bodily fluids.

Studies show that the average number of hook-ups in the US last year worked out to two per student. That's cheered up a lot of my male friends, on both sides of the Atlantic: apart from their children having heartless sex, the biggest fear fathers have is that their children are having much more sex than they did.
My worry, though, is for the young men and women who graduate from hooking up only to discover that they lack the necessary skills for a proper relationship. Hook-ups teach that love is a distraction; for most of us, though, it’s the main event. Even in a recession, kids.

Sunday 19 May 2013

Daniel Dennett's seven tools for thinking



Cognitive scientist and philosopher Daniel Dennett is one of America's foremost thinkers. In this extract from his new book, he reveals some of the lessons life has taught him
Daniel Dennett: 'Often the word "surely" is as good as a blinking light locating a weak point in the argument.' Photograph: Peter Yang/August

1 USE YOUR MISTAKES

We have all heard the forlorn refrain: "Well, it seemed like a good idea at the time!" This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say: "Well, it seemed like a good idea at the time!" is standing on the threshold of brilliance. We human beings pride ourselves on our intelligence, and one of its hallmarks is that we can remember our previous thinking and reflect on it – on how it seemed, on why it was tempting in the first place and then about what went wrong.
(Extracted from Intuition Pumps and Other Tools for Thinking, by Daniel C Dennett.)
I know of no evidence to suggest that any other species on the planet can actually think this thought. If they could, they would be almost as smart as we are. So when you make a mistake, you should learn to take a deep breath, grit your teeth and then examine your own recollections of the mistake as ruthlessly and as dispassionately as you can manage. It's not easy. The natural human reaction to making a mistake is embarrassment and anger (we are never angrier than when we are angry at ourselves) and you have to work hard to overcome these emotional reactions.
Try to acquire the weird practice of savouring your mistakes, delighting in uncovering the strange quirks that led you astray. Then, once you have sucked out all the goodness to be gained from having made them, you can cheerfully set them behind you and go on to the next big opportunity. But that is not enough: you should actively seek out opportunities to make grand mistakes, just so you can then recover from them.
In science, you make your mistakes in public. You show them off so that everybody can learn from them. This way, you get the benefit of everybody else's experience, and not just your own idiosyncratic path through the space of mistakes. (Physicist Wolfgang Pauli famously expressed his contempt for the work of a colleague as "not even wrong". A clear falsehood shared with critics is better than vague mush.)
This, by the way, is another reason why we humans are so much smarter than every other species. It is not so much that our brains are bigger or more powerful, or even that we have the knack of reflecting on our own past errors, but that we share the benefits our individual brains have won by their individual histories of trial and error.
I am amazed at how many really smart people don't understand that you can make big mistakes in public and emerge none the worse for it. I know distinguished researchers who will go to preposterous lengths to avoid having to acknowledge that they were wrong about something. Actually, people love it when somebody admits to making a mistake. All kinds of people love pointing out mistakes.
Generous-spirited people appreciate your giving them the opportunity to help, and acknowledging it when they succeed in helping you; mean-spirited people enjoy showing you up. Let them! Either way we all win.

2 RESPECT YOUR OPPONENT

Just how charitable are you supposed to be when criticising the views of an opponent? If there are obvious contradictions in the opponent's case, then you should point them out, forcefully. If there are somewhat hidden contradictions, you should carefully expose them to view – and then dump on them. But the search for hidden contradictions often crosses the line into nitpicking, sea-lawyering and outright parody. The thrill of the chase and the conviction that your opponent has to be harbouring a confusion somewhere encourages uncharitable interpretation, which gives you an easy target to attack.
But such easy targets are typically irrelevant to the real issues at stake and simply waste everybody's time and patience, even if they give amusement to your supporters. The best antidote I know for this tendency to caricature one's opponent is a list of rules promulgated many years ago by social psychologist and game theorist Anatol Rapoport.
How to compose a successful critical commentary:
1. Attempt to re-express your target's position so clearly, vividly and fairly that your target says: "Thanks, I wish I'd thought of putting it that way."
2. List any points of agreement (especially if they are not matters of general or widespread agreement).
3. Mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
One immediate effect of following these rules is that your targets will be a receptive audience for your criticism: you have already shown that you understand their positions as well as they do, and have demonstrated good judgment (you agree with them on some important matters and have even been persuaded by something they said). Following Rapoport's rules is always, for me, something of a struggle…

3 THE "SURELY" KLAXON

When you're reading or skimming argumentative essays, especially by philosophers, here is a quick trick that may save you much time and effort, especially in this age of simple searching by computer: look for "surely" in the document and check each occurrence. Not always, not even most of the time, but often the word "surely" is as good as a blinking light locating a weak point in the argument.
Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn't be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and – because life is short – has decided in favour of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined "truism" that isn't true!
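Dennett's "quick trick" – search the document for "surely" and inspect each occurrence – is trivial to automate. Here is a minimal sketch in Python; the function name and the sample text are my own illustration, not anything from the book:

```python
import re

def surely_klaxon(text):
    """Flag every line containing the word 'surely' for manual inspection.

    Returns (line_number, line) pairs; each hit marks a spot where the
    author may have substituted bald assertion for argument.
    """
    hits = []
    for n, line in enumerate(text.splitlines(), start=1):
        # \b keeps us from matching words like 'leisurely'.
        if re.search(r"\bsurely\b", line, flags=re.IGNORECASE):
            hits.append((n, line.strip()))
    return hits

essay = (
    "Surely no reader will dispute the premise.\n"
    "The argument then proceeds in three steps.\n"
    "And surely the conclusion follows."
)
for n, line in surely_klaxon(essay):
    print(f"line {n}: {line}")
```

The klaxon only locates candidates; whether each "surely" really papers over a weak point is, as Dennett says, a judgment you still have to make yourself.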

4 ANSWER RHETORICAL QUESTIONS

Just as you should keep a sharp eye out for "surely", you should develop a sensitivity for rhetorical questions in any argument or polemic. Why? Because, like the use of "surely", they represent an author's eagerness to take a short cut. A rhetorical question has a question mark at the end, but it is not meant to be answered. That is, the author doesn't bother waiting for you to answer since the answer is so obvious that you'd be embarrassed to say it!
Here is a good habit to develop: whenever you see a rhetorical question, try – silently, to yourself – to give it an unobvious answer. If you find a good one, surprise your interlocutor by answering the question. I remember a Peanuts cartoon from years ago that nicely illustrates the tactic. Charlie Brown had just asked, rhetorically: "Who's to say what is right and wrong here?" and Lucy responded, in the next panel: "I will."

5 EMPLOY OCCAM'S RAZOR

Attributed to William of Ockham (or Occam), a 14th-century English logician and philosopher, this thinking tool is actually a much older rule of thumb. A Latin name for it is lex parsimoniae, the law of parsimony. It is usually put into English as the maxim "Do not multiply entities beyond necessity".
The idea is straightforward: don't concoct a complicated, extravagant theory if you've got a simpler one (containing fewer ingredients, fewer entities) that handles the phenomenon just as well. If exposure to extremely cold air can account for all the symptoms of frostbite, don't postulate unobserved "snow germs" or "Arctic microbes". Kepler's laws explain the orbits of the planets; we have no need to hypothesise pilots guiding the planets from control panels hidden under the surface. This much is uncontroversial, but extensions of the principle have not always met with agreement.
One of the least impressive attempts to apply Occam's razor to a gnarly problem is the claim (and provoked counterclaims) that postulating a God as creator of the universe is simpler, more parsimonious, than the alternatives. How could postulating something supernatural and incomprehensible be parsimonious? It strikes me as the height of extravagance, but perhaps there are clever ways of rebutting that suggestion.
I don't want to argue about it; Occam's razor is, after all, just a rule of thumb, a frequently useful suggestion. The prospect of turning it into a metaphysical principle or fundamental requirement of rationality that could bear the weight of proving or disproving the existence of God in one fell swoop is simply ludicrous. It would be like trying to disprove a theorem of quantum mechanics by showing that it contradicted the axiom "Don't put all your eggs in one basket".

6 DON'T WASTE YOUR TIME ON RUBBISH

Sturgeon's law is usually expressed thus: 90% of everything is crap. So 90% of experiments in molecular biology, 90% of poetry, 90% of philosophy books, 90% of peer-reviewed articles in mathematics – and so forth – is crap. Is that true? Well, maybe it's an exaggeration, but let's agree that there is a lot of mediocre work done in every field. (Some curmudgeons say it's more like 99%, but let's not get into that game.)
A good moral to draw from this observation is that when you want to criticise a field, a genre, a discipline, an art form …don't waste your time and ours hooting at the crap! Go after the good stuff or leave it alone. This advice is often ignored by ideologues intent on destroying the reputation of analytic philosophy, sociology, cultural anthropology, macroeconomics, plastic surgery, improvisational theatre, television sitcoms, philosophical theology, massage therapy, you name it.
Let's stipulate at the outset that there is a great deal of deplorable, second-rate stuff out there, of all sorts. Now, in order not to waste your time and try our patience, make sure you concentrate on the best stuff you can find, the flagship examples extolled by the leaders of the field, the prize-winning entries, not the dregs. Notice that this is closely related to Rapoport's rules: unless you are a comedian whose main purpose is to make people laugh at ludicrous buffoonery, spare us the caricature.

7 BEWARE OF DEEPITIES

A deepity (a term coined by the daughter of my late friend, computer scientist Joseph Weizenbaum) is a proposition that seems both important and true – and profound – but that achieves this effect by being ambiguous. On one reading, it is manifestly false, but it would be earth-shaking if it were true; on the other reading, it is true but trivial. The unwary listener picks up the glimmer of truth from the second reading, and the devastating importance from the first reading, and thinks, Wow! That's a deepity.
Here is an example (better sit down: this is heavy stuff): Love is just a word.
Oh wow! Cosmic. Mind-blowing, right? Wrong. On one reading, it is manifestly false. I'm not sure what love is – maybe an emotion or emotional attachment, maybe an interpersonal relationship, maybe the highest state a human mind can achieve – but we all know it isn't a word. You can't find love in the dictionary!
We can bring out the other reading by availing ourselves of a convention philosophers care mightily about: when we talk about a word, we put it in quotation marks, thus: "love" is just a word. "Cheeseburger" is just a word. "Word" is just a word. But this isn't fair, you say. Whoever said that love is just a word meant something else, surely. No doubt, but they didn't say it.
Not all deepities are quite so easily analysed. Richard Dawkins recently alerted me to a fine deepity by Rowan Williams, the then archbishop of Canterbury, who described his faith as "a silent waiting on the truth, pure sitting and breathing in the presence of the question mark".
I leave the analysis of this as an exercise for you.

Saturday 27 April 2013

Want to boost the economy? Ban all meetings



David Cameron has had the cabinet table extended so more spads can fit around it. Wave goodbye to productivity at No 10
'Nobody really believes cabinet meetings affect anything. Their sole impact on the economy is driving sales of those plastic document folders that hide the text beneath them.' Illustration by Jas
If you want a sense of just how big David Cameron and his ideas are, then know this: a carpenter was recently ordered to build an extension to the cabinet table. A piece of furniture that has seen governments through for more than half a century has now been made even bigger, the better to accommodate the increasing number of people who don't make decisions around it.
To the annals of things that sound like rejected Thick of It plotlines, then, let us add the cabinet table thing. (One troubling irony of The Thick of It's success is what a crutch of Westminster life the show has become. The sheer volume of defeatist politicos who now explain away their days by saying "It was like an episode of The Thick of It" should really be satirised by an episode of The Thick of It.)
Anyway, a cabinet maker – appropriately – really has created a 4ft table extension to make room for all the extra ministers given attendance privileges, and all the special advisers and press officers and other bods who pitch up at 9am every Tuesday looking like they've won a competition to attend a cabinet meeting. (Second prize, to give the old joke a run-out, is attending two.)
It's going to be agony waiting out the 30-year rule to discover what someone's spad said about something that had been decided by some other people somewhere else some other time – but in the interim, I hope No 10 will embrace further advances in the modern science of meetingology. They could start having cabinet off-sites and cabinet awaydays. Perhaps one of Cameron's gurus could appropriate the word iCabinet and fashion a new governance gimmick around it.
It's hardly a new point, but nobody really believes cabinet meetings affect anything. Their sole impact on the economy is driving sales of those plastic document folders that hide the text beneath them. If cabinet attendees let photographers see their cabinet notes, then they might not be allowed to come to cabinet any more.
Yet the cabinet is merely a Westminster example of the meeting malaise that increasingly grips the world. Last month an admittedly non-scientific survey claimed that the average office worker spends more than 16 hours a week in meetings. The average civil servant spends 22, with both public and private sector respondents deeming significant percentages of these meetings to be utterly pointless.
In olden times, one of the best things about print journalism was that there were scarcely any meetings at all, because it seemed to everyone there wasn't time. There was one in the morning, where people took a briskly critical look through that morning's paper before deciding what to put in tomorrow's. And then everyone went off and did it. For most of my colleagues, alas, those days are gone. My own weird job – lancing my brain with a keyboard, basically – is one of the few that requires almost no meetings at all. I count myself one of the luckiest people alive. Friends with non-media occupations often tell me that they are required to attend so many meetings that they wonder when on earth they're supposed to do their actual job. In a surreal non-productive way, the meeting has almost become the job.
Indeed, I'm told by some that the higher up you get in the world of meetings, the more stage-managed they are. Decisions aren't made there: they're just ratified. The old "information sharing" justification is apparently cobblers too, because if you have to wait till the meeting to get the information, then you're really not relevant enough to be at the meeting.
What the vast majority of meetings do is confer status on those blowhards "leading" them, or attendees who really should find other ways to validate themselves. Even Cobra – the snazzy-sounding Whitehall crisis response meeting – is widely griped about, with Scotland Yard's formerly most senior anti-terrorism officer complaining it was "cumbersome and bureaucratic", full of people "jockeying for position", and slowed everything down.
But on people go. Gazillions of meetings are held every day, with every one presumably regarded as an indispensable step toward something worth attaining. What would winning the game of meetings even look like? I suppose you'd battle up all the levels, and finally ascend to the ultimate meeting: one to which you'd actually want to go. Maybe you'd get in on the Meeting of Meetings, which would be something like that meeting where Obama and Hillary and the joint chiefs watched Osama bin Laden's compound being stormed live. But was that really a meeting? In the photos it looked so passive as to be more like a movie night.
As a last word on meetings, I keep thinking of that radical Dutch urban planner who did away with all traffic lights in various towns, and found road safety dramatically improved. If only, instead of making fatuous interventions on some footballer's disciplinary breach, Cameron did something similarly useful with his time. Imagine if he could announce that for one week – in fact, make it a month – all meetings in all workplaces in all Britain were to be banned. People would simply have to muddle through, reclaiming the civilised mores of a time before the answer to everything was to have a meeting. Who knows, a meetingless Britain might even prove that holy of holies for George Osborne – the entirely free initiative that would significantly boost the economy.

Friday 7 December 2012

What good luck to miss out on a £64m lottery win

 

'If I fantasise about winning the lottery, it doesn’t take long before all sorts of worrisome potential consequences occur to me.' Photograph: Danny Lawson/PA
 
What would you do if you discovered that you were the owner of the lottery ticket that won £64m, but you didn't claim it in time?

It would be nice to think you were big enough simply to be pleased that you had become the country's biggest philanthropist of the year by mistake, since at 11pm on Wednesday night, the unclaimed prize was handed over to charity. But I think most of us would be haunted by thoughts of what might have been for the rest of our lives.

Perhaps, but you don't need to have just missed out on a fortune to have dreams of "maybe … ". Life is full of what-ifs, many of which could easily have been realities, had just a few things been different.

Bitter regret is the consequence of being more confident than we should be about where those alternative paths would have led us. The truth is that we will never know. What looks like good fortune can easily turn out to be an incredible stroke of bad luck, and vice versa. Albert Camus got into a car going back to Paris instead of taking the train, only to be killed in a road crash. Then there are the passengers who were running late on the day of the London bombings of July 2005 and missed the tube or train journey that would have killed them.

If we find it hard to believe that winning millions might not be so lucky after all, we just don't have a good enough imagination. If I fantasise about winning the lottery, it doesn't take long before all sorts of worrisome potential consequences occur to me. I think about how I might spread the love, and worry that it would take away the incentive for someone to work at what might really give them satisfaction; or that they might spend the cash on things like cosmetic surgery or drugs that are no good for them in the long run. Meanwhile, of course, I trust myself to spend wisely, unaware of all the ways in which I too might screw up my life by making bad choices.

The point is not to convince ourselves that wealth brings misery, which is just an idea the rest of us cling to in order to make ourselves feel better. It's simply that we don't know what would happen in any given case, and so we should not mourn for an alternative future (or past), the outcome of which is mysterious.

If we really do want to turn our near-miss into a positive, then we should take it as a lesson about how fickle fate is. We often don't notice how many of the things that have gone right for us depended on chance events that could have been otherwise. If I think about my choice of university and subject, or meeting my business and life partners, things that set the course of my life, it is frightening how easily none of them could have happened at all.

My greatest consolation would come from an article I researched several years ago, in which I chased up seven members of two rock bands that had very nearly, but not quite, made the big time. For one, it probably came down to no more than a technical hiccup, meaning Simon Bates never played their single on the then biggest radio show in the country. All accepted that it would have been great to have broken through, although one was convinced it would have killed him, so seduced was he by the rock'n'roll life. But all had made their peace with their near misses and could see now that what really mattered was continuing to do what they loved. I thought I had written a great piece, but it never got used. Another "nearly" moment.

We can't control whether we are rewarded for our endeavours, with cash or recognition. It is not up to us how much cash or time we get on Earth, but it is down to us how we spend it.

Sunday 25 November 2012

Good Length and Right Speed

Will Rhodes on Good Length for a bowler

A good length is the shortest length where a batsman is obliged to play forward.

The Right Speed is when a batsman beaten by flight does not have the time to play a second shot.


Monday 22 October 2012

Ugly is the new Beautiful



At the launch tonight of Design Museum co-founder Stephen Bayley's new book, Ugly: the Aesthetics of Everything, guests will be served ugly canapés and ugly cocktails.

In attendance will be Mugly, an eight-year-old hairless Chinese Crested dog from Peterborough, who is the recent winner of the Ugliest Dog in the World contest, held annually in California, as well as models from the Ugly Model agency, including one woman credited with "looking like a fish".

At what is billed as London's first "ugly party", a grand café will be decked out with "ghoulish objects" and "revolting curios", including a stuffed pug giving birth to a flying pig and blown-up images from Bayley's book, including one of Myra Hindley. "My barman is working on a grey-coloured cocktail and Martinis with gherkins in them," says Bayley. "Talking about beauty is boring – when you get talking about ugliness it gets interesting."

His book Ugly explores the complexities of ugliness and makes the point that without ugliness, there would be no beauty. He has cherry-picked items for his book, including kitsch flying ducks and hideous pink-haired troll dolls – even the postmodernist architecture of the Sainsbury Wing of the National Gallery gets singled out. Ugliness is fascinating, he claims – take Quentin Massys's repugnant The Ugly Duchess: "It's one of the most popular postcards sold in London's National Gallery shop and rivals the sales of Monet's tranquil Water-Lilies," he says.

There are also images of the Eiffel Tower and the Albert Memorial: "In 1887 leading Paris intellectuals ganged up and said the Eiffel Tower, which was being built, was a 'hateful column of bolted tin… useless and monstrous'", he says. "Now the Eiffel Tower is regarded as one of the most touching, romantic French monuments. The Albert Memorial was loathed and detested – now it is charming, delightful and evocative."

There are no chapters in Ugly, which is Bayley's sequel to Taste, published in 1991; instead it's full of long paragraphs of ideas exploring ugliness – a subject not many people have written about.
"I'm not being prescriptive about what is ugly – I'm just provoking ideas about our assumptions of ugliness," says Bayley.

"I'm not looking for agreement. When we talk about design, it is this attempt to introduce beauty by the Modern movement. They told us that if things were functional they would be beautiful – but as soon as you investigate what is beauty – I would say the evidence is mixed. A bomb-dropping Boeing B-52 is extraordinarily functional, but is it beautiful even though it is morally repugnant? What about a gun?

"Our view of what is and what isn't beautiful changes over time. Maybe there are no permanent values in the world of art. It is certainly a question that needs to be asked. If the whole world was beautiful it would in fact be extremely boring. We need a measure of ugliness to understand beauty. You can only understand heaven if you have a concept of hell. "

Bayley focuses on Ernö Goldfinger's Trellick Tower in west London: "If there ever was a test for taste, it's this," he says. The tall housing block built in 1972 was listed by English Heritage in 1998. "It was deplored by many as a brutalist horror. Now half the world regards it as an eyesore – the other half regards it as heroic and uplifting. Maybe they are both right. Any minute now Prince Charles will come to admire it."

Gebrüder Thonet's mass-produced Model No. 14 chair (1859), the original café chair, was revered by Le Corbusier as "the ultimate in elegant design".

"I like the chair – I like clean, unfussy, undecorated things – but I don't think it's inevitably, timelessly perfect," says Bayley, who also includes an image of an Amorphophallus titanum, known as the corpse flower, which "smells of death" and looks phallic. "Can nature be ugly? Personally, I think it can," he says.

There is no end to the fascination of ugliness for Bayley, whose book opens with a photograph of a pig and then Frankenstein. He adds: "If you are talking to architecture students and you ask them to deliberately design something ugly, it is very difficult. It is very difficult to create ugliness – what we call ugly seems to be accidental."

But whether you would want Matthias Grunewald's oil painting The Isenheim Altarpiece (1516) of a man with skin disease on your wall is quite another matter. Or indeed Hieronymus Bosch's triptych The Garden of Earthly Delights (c.1490-1510) depicting Hell, and full of disfigurements and mutations.

There is an image, too, of John Constable's Windmill among Houses and Rainbow – not because it is ugly. "I want to make the point that while we are all worried about the industrialisation of the countryside, this is what Constable's idyllic scenes of the countryside were often about."

Bayley also includes gargoyles from Notre-Dame de Paris, and anti-Jewish Nazi propaganda posters, in which Jews are depicted as ugly caricatures.

One section of the book, "The problem with hair", has images of the monster in I Was a Teenage Werewolf (1957), which shows, he says, "how abnormal hair retains a disturbing power".

"Firstly, if you take a long view of the history of art, ideas about beauty are not permanent – and secondly, things that are ugly can be fascinating and perversely attractive," says Bayley. "No matter what your views, you couldn't read this book and not either come out lacerated, stimulated, annoyed or in total agreement with my genius. It's not a historical narrative but a collection of consistent and interesting and stimulating ideas."
'Ugly: the Aesthetics of Everything', by Stephen Bayley, is published by Goodman Fiell (£25)

Tuesday 6 December 2011

Why Is Economic Growth So Popular?


By Ugo Bardi
26 November, 2011
Cassandra's legacy

When the new Italian Prime Minister, Mr Mario Monti, gave his acceptance speech to the Senate a few days ago, he used the term "growth" 28 times, and not even once terms such as "natural resources" or "energy". He is not alone in neglecting the physical basis of the world's economy: the chorus of economic pundits everywhere in the world revolves around this magic word, "growth". But why? What is it that makes this single parameter so special and so beloved?
 
During the past few years, the financial system gave the world a clear signal when the prices of all natural commodities spiked to levels never seen before. If prices are high, then there is a supply problem. Since most of the commodities we use are non-renewable - crude oil, for instance - it is at least reasonable to suppose that we have a depletion problem. Yet the reaction of leaders, decision makers, and economic pundits of all kinds was - and still is - to ignore the physical basis of the economic system and to promote economic growth as the solution to all our problems; the more, the better. But if depletion is the real problem, it should be obvious that growth can only make it worse: if we grow, we consume more resources, and that accelerates depletion. So why are our leaders so fixated on growth? Can't they understand that it is a colossal mistake? Are they stupid or what?

Things are not so simple, as usual. One of the most common mistakes we can make in life is to assume that people who don't agree with our ideas are stupid. No: as a rule, for everything that exists there is a reason. So there has to be a reason why growth is touted as the universal cure for all problems. And if we look into the matter in depth, we may find the reason in the fact that people (leaders as well as everybody else) tend to privilege short-term gains over long-term ones. Let me try to explain.

Let's start with observing that the world's economy is an immense, multiple-path reaction driven by the thermodynamic potentials of the natural resources it uses. Mainly, these resources are non-renewable fossil fuels that we burn in order to power the whole system. We have good models that describe the process; the earliest ones go back to the 1970s with the first version of "The Limits to Growth" study. These models are based on the method known as "system dynamics" and consider highly aggregated stocks of resources (that is, averaged over many different kinds). Already in 1972, the models showed that the gradual depletion of high grade ores and the increase of persistent pollution would cause the economy to stop growing and then decline; most likely during the first decades of the 21st century. Later studies of the same kind generated similar results. The present crisis seems to vindicate these predictions.
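The qualitative behaviour of such models can be sketched in a few lines of code. The toy below is mine, with made-up parameters; it is emphatically not the World3 model, only an illustration of the aggregated stock-flow logic: capital grows by extracting a finite resource whose yield falls as the stock depletes.

```python
# Toy stock-flow model in the spirit of "The Limits to Growth"
# (illustrative parameters only; NOT the actual World3 model).
# Capital grows by extracting a finite resource, but the yield per
# unit of effort falls as the resource stock is drawn down.
def simulate(steps=300, resource=1000.0, capital=1.0):
    history = []
    for _ in range(steps):
        efficiency = resource / 1000.0                 # depletion lowers yield
        extraction = min(resource, 0.5 * capital * efficiency)
        resource -= extraction
        capital += 0.2 * extraction - 0.05 * capital   # income minus depreciation
        history.append(capital)
    return history

caps = simulate()
peak = caps.index(max(caps))   # growth, then a peak, then decline
```

Even this crude sketch reproduces the 1972 result qualitatively: capital peaks and turns down while a large part of the resource is still in the ground, because falling yields choke off growth long before outright exhaustion.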

So, these models tell us that depletion and pollution are at the root of the problems we have, but they tell us little about the financial turmoil that we are seeing. They don't contain a stock called "money" and they make no attempt to describe how the crisis will affect different regions of the world and different social categories. Given the nature of the problem, that is the only possible choice to make modelling manageable, but it is also a limitation. The models can't tell us, for instance, how policy makers should act in order to avoid the bankruptcy of entire states. However, the models can be understood in the context of the forces that move the system. The fact that the world's economic system is complex doesn't mean that it doesn't follow the laws of physics. On the contrary, it is by looking at these laws that we can gain insight into what's happening and how we could act on the system.

There are good reasons, grounded in thermodynamics, why economies consume resources at the fastest possible rate and at the highest possible efficiency (see this paper by Arto Annila and Stanley Salthe). So the industrial system will try to exploit first the resources which provide the largest return. For energy-producing resources (such as crude oil) the return can be measured in terms of energy return on energy invested (EROEI). Strictly speaking, decisions within the system are taken not in terms of energy but in terms of monetary profit, but the two concepts can be considered to coincide as a first approximation. Now, what happens as non-renewable resources are consumed is that the EROEI of what is left dwindles and the system becomes less efficient; that is, profits go down. The economy tends to shrink, while the system tries to concentrate the flow of resources where they can be processed at the highest degree of efficiency and provide the highest profits; something that is usually related to economies of scale. In practice, the contraction of the economy is not the same everywhere: peripheral sections of the system, both in geographical and in social terms, cannot process resources with sufficient efficiency; they tend to be cut off from the resource flow, shrink, and eventually disappear. An economic system facing a reduction in the inflow of natural resources is like a man dying of cold: the extremities are the first to freeze and die off.
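The "dwindling EROEI means dwindling profits" argument is easy to put into numbers. The figures below are purely hypothetical, chosen for illustration: of each unit of energy extracted, 1/EROEI must be reinvested in the extraction process itself, so the surplus left for the rest of the economy is 1 - 1/EROEI.

```python
# Hypothetical illustration of the net-energy arithmetic: the surplus
# energy available to society per unit of gross extraction.
def net_energy_fraction(eroei):
    return 1.0 - 1.0 / eroei   # 1/EROEI is reinvested in extraction

for eroei in (100, 20, 10, 5, 2):
    print(f"EROEI {eroei:3d}: {net_energy_fraction(eroei):.0%} of gross output is surplus")
```

Note the nonlinearity: falling from an EROEI of 100 to 20 barely matters (99% surplus down to 95%), while falling from 5 to 2 slashes the surplus from 80% to 50%, the so-called net energy cliff.
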

Then, what is the role of the financial system - that is, simply, "money"? Money is not a physical entity; it is not a natural resource. It has, however, a fundamental role in the system as a catalyst. In a chemical reaction, a catalyst doesn't change the chemical potentials that drive the reaction, but it can speed it up and change the preferred pathway of the reactants. For the economic system, money doesn't change the availability of resources or their energy yield, but it can direct the flow of natural resources to the areas where they are exploited fastest and most efficiently. This allocation of the flow usually generates more money, and therefore we have a typical positive (or "enhancing") feedback. As a result, all the effects described before go faster. Depletion can be temporarily masked, although usually at the expense of more pollution. Then we may see the abrupt collapse of entire regions, as may be the case with Spain, Italy, Greece and others. This effect can spread to other regions as the depletion of non-renewable resources continues and the cost of pollution increases.

We can't go against thermodynamics, but we could at least avoid some of the most unpleasant effects that come from attempting to overcome the limits of natural resources. This point was already examined in 1972 by the authors of the first "Limits to Growth" study on the basis of their models but, ultimately, it is just a question of common sense. To avoid, or at least mitigate, collapse, we must stop growth; in this way non-renewable resources will last longer and we can use them to develop and deploy renewable ones. The problem is that curbing growth does not generate profits and that, at present, renewables don't yet provide profits as large as those of the remaining fossil fuels. So the system doesn't like to go in that direction - it tends, rather, to go towards the highest short-term yields, with the financial system easing the way. That is, the system tends to keep using non-renewable resources, even at the cost of destroying itself. Forcing the system to change direction could be achieved only by some form of centralized control but that, obviously, is complex, expensive, and unpopular. No wonder our leaders don't seem enthusiastic about this strategy.

Let's look, instead, at another option open to leaders: "stimulating growth". What does that mean, exactly? In general, it seems to mean using the taxation system to transfer financial resources to the industrial system. With more money, industries can afford higher prices for natural resources. As a consequence, the extractive industry can maintain its profits - actually increase them - and keep extracting even from expensive resources. But money, as we said, is not a physical entity; in this case it only catalyzes the transfer of human and material resources to the extractive system at the expense of subsystems such as social security, health care, education, etc. That's not painless, of course, but it may give the public the impression that the problems are being solved. It may improve economic indicators, and it may keep resource flows large enough to prevent the complete collapse of peripheral regions, at least for a while. But the real attraction of stimulating growth is that it is the easy way: it pushes the system in the direction it wants to go. The system is geared to exploit natural resources at the fastest possible rate, and this strategy gives it fresh resources to do exactly that. Our leaders may not understand exactly what they are doing, but surely they are not stupid - they are not going against the grain.

The problem is that the growth-stimulating strategy only buys time (and buys it at a high price). Nothing that governments or financial traders do can change the thermodynamics of the world system - all they can do is shuffle resources from here to there, and that doesn't change the hard reality of depletion and pollution. So pushing economic growth is only a short-term solution that worsens the problem in the long run. It can postpone collapse, but at the price of making it more abrupt, in the form known as the Seneca Cliff. Unfortunately, it seems that we are headed exactly that way.

[This post was inspired by an excellent post on the financial situation written by Antonio Turiel with the title "Before the Wave" (in Spanish). ]

Ugo Bardi is a professor of Chemistry at the Department of Chemistry of the University of Firenze, Italy. He also has a more general interest in energy questions and is the founder and president of ASPO Italia.
 

Thursday 27 October 2011

Is modern science Biblical or Greek?


By Spengler

The "founders of modern science", writes David Curzon in Jewish Ideas Daily [1] of October 18, "were all believers in the truths of the opening chapter in the Hebrew Bible. The belief implicit in Genesis, that nature was created by a law-giving God and so must be governed by 'laws of nature', played a necessary role in the emergence of modern science in 17th-century Europe. Equally necessary was the belief that human beings are made in the image of God and, as a consequence, can understand these 'laws of nature'."

Curzon argues that the modern idea of "laws of nature" stems from the Bible rather than classical Greece, for "ancient Greeks certainly believed that nature was intelligible and that its regularities could be made explicit. But Greek gods such as Zeus were not understood to have created the processes of nature; therefore, they could not have given the laws governing these processes."

Is this just a matter of semantics? Is there a difference between the "Greek" concept of intelligibility and what Curzon calls the biblical concept of laws of nature? After all, the achievements of Greek science remain a monument to the human spirit. The Greek geometer Eratosthenes, in the third century BCE, calculated the tilt of the earth's axis, the circumference of the earth, and (possibly) the earth's distance from the sun. Archimedes used converging infinite series to calculate the area of conic sections, approximating the calculus that Newton and Leibniz discovered in the 17th century.
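Archimedes' reasoning can be replayed directly. In his quadrature of the parabola he fills the segment with triangles whose combined area shrinks by a factor of four at each stage, so the partial sums converge to 4/3 of the first inscribed triangle. The function below is a modern sketch of that argument:

```python
# Archimedes' quadrature of the parabola as a converging series:
# each stage of refinement adds triangles totalling 1/4 the area of
# the previous stage, so the sum tends to 4/3 of the first triangle.
def parabola_segment_area(triangle_area, stages):
    total, piece = 0.0, float(triangle_area)
    for _ in range(stages):
        total += piece
        piece /= 4.0   # the next stage contributes a quarter as much
    return total

approx = parabola_segment_area(1.0, 30)   # partial sum 1 + 1/4 + 1/16 + ...
# approx is within rounding error of 4/3
```

Archimedes, lacking a concept of a completed infinite sum, stopped at the partial sums and bounded the remainder geometrically; the limit itself is the step he could not take.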

An enormous leap of mind, though, separates Archimedes' approximations from the new mathematics of the 17th century, which opened a path to achievements undreamed of by the Greeks. Something changed in the way that the moderns thought about nature. But does the rubric "laws of nature" explain that change? Curzon is on to something, but the biblical roots of modern science go much deeper.

Before turning to the scientific issues as such, it is helpful to think about the differences in the way Greeks and Hebrews saw the world. The literary theorist Erich Auerbach famously contrasted Greek and Hebrew modes of thought [2] by comparing two stories: the binding of Isaac in Genesis 22, and the story of Odysseus' scar told in flashback (Odyssey, Book 19).

Homer's hero has returned incognito to his home on the island of Ithaca, fearful that prospective usurpers will murder him. An elderly serving woman washes his feet and sees a scar he had received on a boar hunt two decades earlier, before leaving for the Trojan War, and recognizes him. Homer then provides a detailed account of the boar hunt before returning to his narrative.

Homer seeks to bring all to the surface, Auerbach explained in his classic essay. "The separate elements of a phenomenon are most clearly placed in relation to one another; a large number of conjunctions, adverbs, particles, and other syntactical tools, all clearly circumscribed and delicately differentiated in meaning, delimit persons, things, and portions of incidents in respect to one another, and at the same time bring them together in a continuous and ever flexible connection; like the separate phenomena themselves, their relationships - their temporal, local, causal, final, consecutive, comparative, concessive, antithetical, and conditional limitations - are brought to light in perfect fullness; so that a continuous rhythmic procession of phenomena passes by, and never is there a form left fragmentary or half-illuminated, never a lacuna, never a gap, never a glimpse of unplumbed depths."

Auerbach adds, "And this procession of phenomena takes place in the foreground - that is, in a local and temporal present which is absolute. One might think that the many interpolations, the frequent moving back and forth, would create a sort of perspective in time and place; but the Homeric style never gives any such impression."

Stark and spare, by contrast, is the story of God's summons to Abraham to sacrifice his beloved son Isaac. Where Homer tells us everything, the Bible tells us very little. God speaks to Abraham, and Abraham says, "Here I am." Auerbach observes, "Where are the two speakers? We are not told. The reader, however, knows that they are not normally to be found together in one place on earth, that one of them, God, in order to speak to Abraham, must come from somewhere, must enter the earthly realm from some unknown heights or depths. Whence does he come, whence does he call to Abraham? We are not told."

Abraham and Isaac travel together. Auerbach writes, "Thus the journey is like a silent progress through the indeterminate and the contingent, a holding of the breath, a process which has no present, which is inserted, like a blank duration, between what has passed and what lies ahead, and which yet is measured: three days!" Auerbach concludes:
On the one hand, externalized, uniformly illuminated phenomena, at a definite time and in a definite place, connected together without lacunae in a perpetual foreground; thoughts and feeling completely expressed; events taking place in leisurely fashion and with very little of suspense. On the other hand, the externalization of only so much of the phenomena as is necessary for the purpose of the narrative, all else left in obscurity; the decisive points of the narrative alone are emphasized, what lies between is nonexistent; time and place are undefined and call for interpretation; thoughts and feeling remain unexpressed, are only suggested by the silence and the fragmentary speeches; the whole, permeated with the most unrelieved suspense and directed toward a single goal (and to that extent far more of a unity), remains mysterious and "fraught with background".
Literary analysis may seem an unlikely starting-point for a discussion of science. But the Hebrew Bible's embodiment of what Auerbach called "the indeterminate and the contingent" has everything to do with the spirit of modern science. This emerges most vividly in the difference between the Greek and Hebrew understanding of time, the medium through which we consider infinity and eternity.

What separates Archimedes' approximation from Leibniz' calculus? The answer lies in the concept of infinity itself. Infinity was a stumbling-block for the Greeks, for the concept was alien to what Auerbach called their "perpetual foreground." Aristotle taught that whatever was in the mind was first in the senses. But by definition infinity is impossible to perceive. In the very large, we can never finish counting it; in the very small (for example infinitely diminishing quantities), we cannot perceive it. Infinity and eternity are inseparable concepts, for we think of infinity as a count that never ends.

For the Greeks, time is merely the demarcation of events. Plato understands time as an effect of celestial mechanics in Timaeus, while Aristotle in the Physics thinks of time as nothing more than the faucet-drip of events. That is Homer's time, in Auerbach's account. Biblical time is an enigma. That is implicit in Genesis, as Auerbach notes, but explicit in the Book of Ecclesiastes. Greek time is an "absolute temporal present."

In Hebrew time, it is the moment itself that remains imperceptible. Here is Ecclesiastes 3:15 in the Koren translation (by the 19th-century rabbi Michael Friedländer): "That which is, already has been; and that which is to be has already been; and only God can find the fleeting moment." As I wrote in another context, [3] Rabbi Friedländer's translation probably drew upon the celebrated wager that Faust offered the Devil in Goethe's drama. Faust would lose his soul if he attempted to hold on to the passing moment, that is, to try to grasp what only God can find. The impulse to grab the moment and hold onto it is idolatrous; it is an attempt to cheat eternity, to make ourselves into gods.

A red thread connects the biblical notion of time to modern science, and it is spun by St Augustine of Hippo, the 4th-century Church father and polymath. His reflection on time as relative rather than absolute appears in Book 11 of his Confessions, and his speculation on the nature of number in time takes us eventually to the modern conceptual world of Leibniz and the calculus. Aristotle's description of time as a sequence of moments, in Augustine's view, leads to absurdities.

To consider durations in time, we must measure what is past, for the moment as such has no duration. Events that have passed no longer exist, which means that measuring past time is an attempt to measure something that is not there at all. Augustine argues instead that we measure the memory of past events rather than the past itself: "It is in you, my mind, that I measure times," he writes. Our perception of past events thus depends on memory, and our thoughts about future events depend on expectation. Memory and expectation are linked by "consideration." For "the mind expects, it considers, it remembers; so that which it expects, through that which it considers, passes into that which it remembers."

Time is not independent of the intellect in Augustine's reading. Expectation and memory, Augustine adds, determine our perception of distant past and future: "It is not then future time that is long, for as yet it is not: but a long future is 'a long expectation of the future'; nor is it time past, which now is not, that is long, but a long past is 'a long memory of the past.'" This is the insight that allows Augustine to link perception of time to the remembrance of revelation and the expectation of redemption.

A glimpse of what Augustine's theory of time implies for mathematics appears in his later book, Six Books on Music. I argued in a 2009 essay for First Things: [4]
In De Musica, Augustine seeks to portray "consideration" as a form of musical number, that is, numeri judiciales, "numbers of judgment." These "numbers of judgment" bridge eternity and mortal time; they are eternal in character and lie outside of rhythm itself, but act as an ordering principle for all other rhythms. They stand at the head of a hierarchy of numbers that begins with "sounding rhythms" - the sounds as such - which are in turn inferior to "memorized rhythms."

Only the "numbers of judgment" are immortal, for the others pass away instantly as they sound, or fade gradually from memory over time. They are, moreover, a gift from God, for "from where should we believe that the soul is given what is eternal and unchangeable, if not from the one, eternal, and unchangeable God?" For that reason the "numbers of judgment," by which the lower-order rhythms are ordered, do not exist in time but order time itself and are superior in beauty; without them there could be no perception of time. Memory and expectation are linked by the "numbers of judgment," which themselves stand outside of time, are eternal, and come from God.
That is an intimation of a higher order of number. Because it is buried in a treatise on musical time, Augustine's idea about "numbers of judgment" has elicited scant scholarly interest. But it is clear that his "numbers of judgment" are consistent with his much-discussed theory of "divine illumination." He wrote in Confessions, "The mind needs to be enlightened by light from outside itself, so that it can participate in truth, because it is not itself the nature of truth. You will light my lamp, Lord."

Descartes' "innate ideas" and Kant's "synthetic reason" descend from Augustine, although Kant recast the concept in terms of the hard-wiring of the brain rather than divine assistance. The founder of neo-Kantian philosophy, Hermann Cohen (1842-1918), built his career on the insight that the fact that the infinitesimals of the calculus add up to a definite sum proves the existence of something like synthetic reason. That is why Kant triumphed in philosophy and the Aristotelians were reduced to a grumpy band of exiled irredentists.

Augustine's idea finds its way into modern science through Cardinal Nicholas of Cusa (1401-1464). Theologian and mathematician, Cusa noticed that musicians were tuning their instruments to ratios that corresponded to irrational numbers. The "natural" intervals of music tuning clashed with the new counterpoint of the Renaissance, so the musicians adjusted (or "tempered") the intervals to fit their requirements.

The Greeks abhorred the notion of irrational number because they abhorred infinity. Aristotle understood that infinity lurked in the irrational numbers, for we can come infinitely close to an irrational number through an infinite series of approximations, but never quite get there. And the notion of an "actual infinity" offended the Greek notion of intelligibility. To medieval mathematicians, the irrationals were surds, or "deaf" numbers, that is, numbers that could not be heard in audible harmonic ratios. The association of rational numbers with musical tones was embedded so firmly in medieval thinking that the existence of an irrational harmonic number was unthinkable.

The practice of musicians, Cusa argued, overthrew Aristotle's objections. The human mind, he held, could not perceive such numbers through reason (ratio), i.e. the measuring and categorizing faculty of the mind, but only through the intellect (intellectus), which depended on participation (participatio) in the Mind of God.

Cusa's use of Augustinian terminology to describe the irrationals - numbers "too simple for our mind to understand" - heralded a problem that took four centuries to solve (and, according to the few remaining "Aristotelian realists," remains unsolved to this day).

Not until the 19th century did mathematicians arrive at a rigorous definition of irrational number, as the limit of an infinite converging sequence of rational numbers. That is simple, but our mind cannot understand it directly. Sense-perception fails us; instead, we require an intellectual leap to the seemingly paradoxical concept of a convergent infinite series of rational numbers whose limit is an irrational number.
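That definition can be made concrete. In the sketch below (the setup is mine, for illustration), the Babylonian iteration for the square root of 2 produces an exact rational number at every step, yet the sequence converges to a limit that no iterate ever reaches:

```python
from fractions import Fraction

# An irrational number as the limit of a convergent sequence of
# rationals: every iterate of x -> (x + 2/x) / 2 is an exact rational,
# but the limit of the sequence is the irrational sqrt(2).
x = Fraction(1)
for _ in range(6):
    x = (x + 2 / x) / 2          # stays inside the rational numbers

error = abs(x * x - 2)           # exact rational distance of x**2 from 2
# x agrees with sqrt(2) to dozens of decimal places, yet x * x never
# equals 2: the limit itself lies outside the sequence
```

This is exactly the paradox the passage describes: every object we can write down is rational, and the irrational number exists only as the limit the whole infinite sequence points to.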

The irrational numbers thus lead us out of the mathematics of sense-perception, the world of Euclid and Aristotle, into the higher mathematics foreshadowed by Augustine (see my article, "Nicholas of Cusa's Contribution to Music Theory," in Rivista Internazionale di Musica Sacra, Vol 10, July-December 1989).

Once irrational numbers had forced their way into Western thinking, the agenda had changed. Professor Peter Pesic [5] recently published an excellent account of the impact of irrational numbers in musical tuning on mathematics and philosophy. [6]

Another two centuries passed before Leibniz averred, "I am so in favor of the actual infinite that instead of admitting that nature abhors it, as is commonly said, I hold that nature makes frequent use of it everywhere, in order to show more effectively the perfections of its author." Theological concerns, one might add, also motivated Leibniz' work, as I sought to show in "The God of the Mathematicians" (First Things, August-September 2010).
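The series that bears Leibniz's name is a small emblem of that attitude: each partial sum is a finite rational computation, while the completed ("actual") infinite sum is the irrational π/4. A brief sketch:

```python
# Leibniz's alternating series: pi/4 = 1 - 1/3 + 1/5 - 1/7 + ...
def leibniz_pi(terms):
    total = 0.0
    for k in range(terms):
        total += (-1) ** k / (2 * k + 1)   # alternating odd reciprocals
    return 4 * total

approx = leibniz_pi(1_000_000)
# converges very slowly: roughly five correct decimal places here
```

No finite truncation is π; the number exists only as the limit of the whole infinite process, which is precisely the "actual infinite" Leibniz was defending.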

Unlike Archimedes, who still thought in terms of approximations using rational numbers, Leibniz believed that he had discovered a new kind of calculation that embodied the infinite. Leibniz' infinitesimals (as I reported in "God and the Mathematicians") lead us eventually to Georg Cantor's discovery of different orders of infinity and the transfinite numbers that designate them; Cantor cited Cusa as well as Leibniz as his antecedents, explaining: "Transfinite integers themselves are, in a certain sense, new irrationalities. Indeed, in my opinion, the method for the definition of finite irrational numbers is quite analogous, I can say, is the same one as my method for introducing transfinite integers. It can be certainly said: transfinite integers stand and fall together with finite irrational numbers."

Gilles Deleuze (in The Fold: Leibniz and the Baroque) reports that Leibniz "took up in detail" Cusa's idea of "the most simple" number: "The question of harmonic unity becomes that of the 'most simple' number, as Nicolas of Cusa states, for whom the number is irrational. But, although Leibniz also happens to relate the irrational to the existent, or to consider the irrational as a number of the existent, he feels he can discover an infinite series of rationals enveloped or hidden in the incommensurable." Leibniz thus stands between Cusa in the fifteenth century and the flowering of the mathematics of infinite series in the nineteenth century. That is a triumph of the biblical viewpoint in modern science.

We can thus draw a red line from the Hebrew Bible (most clearly from Ecclesiastes) to Augustine, and through Nicholas of Cusa to G W Leibniz and the higher mathematics and physics of the modern world. The Hebrew Bible remains a force in modern science, despite the best efforts of rationalists and materialists to send it into exile.

Kurt Goedel, perhaps the greatest mathematician of the 20th century, approached all his work with the conviction that no adequate account of nature was possible without the presence of God. Inspired by Leibniz, Goedel destroyed all hope of a mechanistic ontology through his two Incompleteness Theorems, as well as through his work (complemented by Paul Cohen's) on the undecidability of the Continuum Hypothesis, as I reported in a recent First Things essay. [7]

There is always a temptation to offer simple homilies in honor of the Bible, for example, "intelligent design" theory, which in my view tells us nothing of real importance. An atheist like Spinoza also would contend that God designed the world, because in his philosophy God is the same thing as nature. Design contains no information about the unique and personal God of the Bible.

Curzon's discussion of the laws of nature is by no means wrong, but it would be wrong to leave the matter there. "The fear of God is the beginning of wisdom." As Ecclesiastes said: "I have observed the task which God has given the sons of man to be concerned with: He made everything beautiful in its time; He also put an enigma [sometimes "eternity"] into their minds so that man cannot comprehend what God has done from beginning to end" (Ecclesiastes 3:11, Artscroll translation). Eternity is in our minds but the whole of creation is hidden from us. Stephen Hawking has gone so far as to conjecture that something like Goedel's Incompleteness Principle might apply to physics as well as mathematics.

What divides Hebrews from Greeks, above all, is a sense of wonder at the infinitude of creation and at human limitation. The Odyssey is intended to be heard and enjoyed; Genesis 22 is to be searched and searched again for layers of meaning that are withheld from the surface. The Greek gods were like men, only stronger, better-looking and longer-lived, immortal but not eternal, and the Greeks emulated them by seeking to become masters of a nature infested by gods. The Hebrews sought to be a junior partner in the unending work of creation. With due honor to the great achievements of the Greeks, modernity began at Mount Sinai.