Showing posts with label myth.

Saturday, 13 April 2024

The myth of the second chance

Janan Ganesh in The FT


In the novels of Ian McEwan, a pattern recurs. The main character makes a mistake — just one — which then hangs over them forever. A girl misidentifies a rapist, and in doing so shatters three lives, including her own (Atonement). A man exchanges a lingering glance with another, who becomes a tenacious stalker (Enduring Love). A just-married couple fail to have sex, or rather have it badly, and aren’t themselves again, either as individuals or as a pair (On Chesil Beach). Often, the mistake reverberates over much of the 20th century.  

This plot trick is said to be unbecoming of a serious artist. McEwan is accused of an obsession with incident that isn’t true to the gradualism and untidiness of real life. Whereas Proust luxuriates in the slow accretion of human experience, McEwan homes in on the singular event. It is too neat. It is written to be filmed. 

Well, I am old enough now to observe peers in their middle years, including some disappointed and hurt ones. I suggest it is McEwan who gets life right. The surprise of middle age, and the terror of it, is how much of a person’s fate can boil down to one misjudgement.  

Such as? What in particular should the young know? If you marry badly — or marry at all, when it isn’t for you — don’t assume the damage is recoverable. If you make the wrong career choice, and realise it as early as age 30, don’t count on a way back. Even the decision to go down a science track at school, when the humanities turn out to be your bag, can mangle a life. None of these errors need consign a person to eternal and acute distress. But life is path-dependent: each mistake narrows the next round of choices. A big one, or just an early one, can foreclose all hope of the life you wanted. 

There should be more candour about this from the people who are looked to (and paid) for guidance. The rise of the advice-industrial complex — the self-help podcasts, the chief executive coaches, the men’s conferences — has been mostly benign. But much of the content is American, and reflects the optimism of that country. The notion of an unsalvageable mistake is almost transgressive in the land of second chances.  

Also, for obvious commercial reasons, the audience has to be told that all is not lost, that life is still theirs to shape deep into adulthood. No one is signing up to the Ganesh Motivational Bootcamp (“You had kids without thinking it through? It’s over, son”) however radiant the speaker. 

A mistake, in the modern telling, is not a mistake but a chance to “grow”, to form “resilience”. It is a mere bridge towards ultimate success. And in most cases, quite so. But a person’s life at 40 isn’t the sum of most decisions. It is skewed by a disproportionately important few: sometimes professional, often romantic. Get these wrong, and the scope for retrieving the situation is, if not zero, then overblown by a culture that struggles to impart bad news.  

Martin Amis, that peer of McEwan’s, once attempted an explanation of the vast international appeal of football. “It’s the only sport which is usually decided by one goal,” he theorised, “so the pressure on the moment is more intense in football than any other sport.” His point is borne out across Europe most weekends. A team hogs the ball, creates superior chances, wins more duels — and loses the game to one error. It is, as the statisticians say, a “stupid” sport.   

But it is also the one that most approximates life outside the stadium. I am now roughly midway through that other low-scoring game. Looking around at the distress and regret of some peers, I feel sympathy, but also amazement at the casualness with which people entered into big life choices. Perhaps this is what happens when ideas of redemption and resurrection — the ultimate second chance — are encoded into the historic faith of a culture. It takes a more profane cast of mind to see through it.

Sunday, 7 April 2024

Never meet your hero!

Nadeem F Paracha in The Dawn

The German writer Johann Wolfgang von Goethe once quipped, “Blessed is the nation that doesn’t need heroes.” As if to expand upon Goethe’s words, the British philosopher Herbert Spencer wrote, “Hero-worship is strongest where there is least regard for human freedom.”

There is every likelihood that Goethe was viewing societies as collectives, in which self-interest was the primary motivation but where the creation and worship of ‘heroes’ are acts to make people feel virtuous.

Heroes can’t become heroes without an audience. A segment of society holds up an individual and presents his or her actions or traits as ‘heroic’. If these receive enough applause, a hero is created. But then no one is really interested in knowing the actual person who has been turned into a hero. Only the mythologised sides are to be viewed.

The mythologising is done to quench a yearning in society — a yearning that cannot be fulfilled because it might be too impractical, utopian, irrational and, therefore, against self-interest. So, the mythologised individual becomes an alter ego of a society conscious of its inherent flaws. Great effort is thus invested in hiding the actual from the gaze of society, so that only the mythologised can be viewed.

One often comes across social media videos of ordinary people doing virtuous deeds, such as helping an old person cross a busy road or rescuing an animal. These helpers are exhibited as ‘heroes’, even though they might not even be aware that they are being filmed.

What if they weren’t? What if they remain unaware of the applause that their ‘viral video’ has attracted? Will they stop being helpful without an audience? They certainly won’t be hailed as heroes. They are often exhibited as heroes by those who want to use them to signal their own appreciative attitude towards ‘goodness’.

This is a harmless ploy. But since self-interest is rampant in almost every society, this can push some people to mould themselves as heroes. There have been cases in which men and women have actually staged certain ‘heroic’ acts, filmed them, and then put them out for all to view. The purpose is to generate praise and accolades for themselves and, when possible, even monetary gains.

But it is also possible that they truly want to be seen as heroes in an unheroic age, despite displaying forged heroism. Then there are those who are so smitten by the romanticised notions of a ‘heroic age’ that they actually plunge into real-life scenarios to quench their intense yearning to be seen as heroes.

For example, a person who voluntarily sticks his neck out for a cause that may lead to his arrest. He knows this. But he also knows that there will be many on social and electronic media who will begin to portray him as a hero. But the applauders often do this to signal their own disposition towards a ‘heroic’ cause.

We apparently live in an unheroic age — an age that philosophers such as Søren Kierkegaard, Friedrich Nietzsche or, for that matter, Muhammad Iqbal, detested. Each had their own understanding of a bygone heroic age.

To Nietzsche, the heroic age existed in some pre-modern period in history, when the Germanic people were fearless. To Iqbal, the heroic age was when early Muslims were powered by an unadulterated faith and passion to conquer the world. There are multiple periods in time that are referred to as ‘heroic ages’, depending on one’s favourite ideology or professed faith.

The yearning for heroes, and the penchant for creating them to be revered so that societies can feel better about themselves, is as old as the first major civilisations, which began to appear thousands of years ago. So when these philosophers spoke of heroic ages, what period of history were they reminiscing about — the Stone Age?

Humans are naturally pragmatic. From hunter-gatherers, we became scavenger-survivalists. The image may be off-putting but the latter actually requires one to be more rational, clever and pragmatic. This is how we have survived and progressed.

That ancient yearning for a heroic age has remained, though. An age that never was — an age that was always an imagined one. That’s why we even mythologise known histories, because the actual in this regard can be awkward to deal with. But it can be uncovered.

America’s ‘founding fathers’ were revered for over two centuries as untainted heroes, until some historians decided to demystify them by exploring their lives outside their mythologised imaginings. Many of these heroes turned out to be slave-owners and not very pleasant people.

Mahatma Gandhi, revered as a symbol of tolerance, turned out to also be a man who disliked black South Africans. The founder of Pakistan MA Jinnah is mythologised as a man who supposedly strived to create an ‘Islamic state’, yet the fact is that he was a declared liberal and loved his wine. Martin Luther King Jr, the revered black rights activist, was also a prolific philanderer.

When freed from mythology, the heroes become human — still important men and women, but with various flaws. This is when they become real and more relatable. They become ‘anti-heroes.’

But there is always an urge in societies to keep the flaws hidden. The flaws can damage the emotions that are invested in revering ‘heroes’, both dead and living. The act of revering provides an opportunity to feel bigger than a scavenger-survivor, even if this requires forged memories and heavily mythologised men and women.

Therefore, hero-worship can also make one blurt out even the most absurd things to keep a popular but distorted memory of a perceived hero intact. For example, this is exactly what one populist former Pakistani prime minister did when he declared that the terrorist Osama bin Laden was a martyr.

By doing this, the former PM was signalling his own ‘heroism’ as well — that of a proud fool who saw greatness in a mass murderer to signal his own ‘greatness’ in an unheroic age.

The French philosopher Voltaire viewed this tendency as a chain that one has fallen in love with. Voltaire wrote, “It is difficult to free fools from the chains they revere.”

Monday, 7 June 2021

Just don’t do it: 10 exercise myths

We all believe we should exercise more. So why is it so hard to keep it up? Daniel E Lieberman, Harvard professor of evolutionary biology, explodes the most common and unhelpful workout myths in The Guardian


Yesterday at an outdoor coffee shop, I met my old friend James in person for the first time since the pandemic began. Over the past year on Zoom, he looked just fine, but in 3D there was no hiding how much weight he’d gained. As we sat down with our cappuccinos, I didn’t say a thing, but the first words out of his mouth were: “Yes, yes, I’m now 20lb too heavy and in pathetic shape. I need to diet and exercise, but I don’t want to talk about it!”

If you feel like James, you are in good company. With the end of the Covid-19 pandemic now plausibly in sight, 70% of Britons say they hope to eat a healthier diet, lose weight and exercise more. But how? Every year, millions of people vow to be more physically active, but the vast majority of these resolutions fail. We all know what happens. After a week or two of sticking to a new exercise regime we gradually slip back into old habits and then feel bad about ourselves.

Clearly, we need a new approach because the most common ways we promote exercise – medicalising and commercialising it – aren’t widely effective. The proof is in the pudding: most adults in high-income countries, such as the UK and US, don’t get the minimum of 150 minutes per week of physical activity recommended by most health professionals. Everyone knows exercise is healthy, but prescribing and selling it rarely works.

I think we can do better by looking beyond the weird world in which we live to consider how our ancestors, as well as people in other cultures, manage to be physically active. This kind of evolutionary anthropological perspective reveals 10 unhelpful myths about exercise. Rejecting them won’t suddenly transform you into an Olympic athlete, but it might help you turn over a new leaf without feeling bad about yourself.

Myth 1: It’s normal to exercise

Whenever you move to do anything, you’re engaging in physical activity. In contrast, exercise is voluntary physical activity undertaken for the sake of fitness. You may think exercise is normal, but it’s a very modern behaviour. Instead, for millions of years, humans were physically active for only two reasons: when it was necessary or rewarding. Necessary physical activities included getting food and doing other things to survive. Rewarding activities included playing, dancing or training to have fun or to develop skills. But no one in the stone age ever went for a five-mile jog to stave off decrepitude, or lifted weights whose sole purpose was to be lifted.

Myth 2: Avoiding exertion means you are lazy

Whenever I see an escalator next to a stairway, a little voice in my brain says, “Take the escalator.” Am I lazy? Although escalators didn’t exist in bygone days, that instinct is totally normal because physical activity costs calories that until recently were always in short supply (and still are for many people). When food is limited, every calorie spent on physical activity is a calorie not spent on other critical functions, such as maintaining our bodies, storing energy and reproducing. Because natural selection ultimately cares only about how many offspring we have, our hunter-gatherer ancestors evolved to avoid needless exertion – exercise – unless it was rewarding. So don’t feel bad about the natural instincts that are still with us. Instead, accept that they are normal and hard to overcome.


‘For most of us, telling us to “Just do it” doesn’t work’: exercise needs to feel rewarding as well as necessary. Photograph: Dan Saelinger/trunkarchive.com


Myth 3: Sitting is the new smoking

You’ve probably heard scary statistics that we sit too much and it’s killing us. Yes, too much physical inactivity is unhealthy, but let’s not demonise a behaviour as normal as sitting. People in every culture sit a lot. Even hunter-gatherers who lack furniture sit about 10 hours a day, as much as most westerners. But there are more and less healthy ways to sit. Studies show that people who sit actively by getting up every 10 or 15 minutes wake up their metabolisms and enjoy better long-term health than those who sit inertly for hours on end. In addition, leisure-time sitting is more strongly associated with negative health outcomes than work-time sitting. So if you work all day in a chair, get up regularly, fidget and try not to spend the rest of the day in a chair, too.

Myth 4: Our ancestors were hard-working, strong and fast

A common myth is that people uncontaminated by civilisation are incredible natural-born athletes who are super-strong, super-fast and able to run marathons easily. Not true. Most hunter-gatherers are reasonably fit, but they are only moderately strong and not especially fast. Their lives aren’t easy, but on average they spend only about two to three hours a day doing moderate-to-vigorous physical activity. It is neither normal nor necessary to be ultra-fit and ultra-strong.

Myth 5: You can’t lose weight walking

Until recently just about every weight-loss programme involved exercise. These days, however, we keep hearing that we can’t lose weight from exercise because most workouts don’t burn that many calories and just make us hungry so we eat more. The truth is that you can lose more weight, much faster, through diet than through exercise, especially moderate exercise such as 150 minutes a week of brisk walking. However, longer durations and higher intensities of exercise have been shown to promote gradual weight loss. Regular exercise also helps prevent weight gain, or weight regain after dieting. Every diet benefits from including exercise.

Myth 6: Running will wear out your knees

Many people are scared of running because they’re afraid it will ruin their knees. These worries aren’t totally unfounded, since knees are indeed the most common site of runners’ injuries. But knees and other joints aren’t like a car’s shock absorbers, which wear out with overuse. Instead, running, walking and other activities have been shown to keep knees healthy, and numerous high-quality studies show that runners are, if anything, less likely to develop knee osteoarthritis. The way to avoid knee pain is to learn to run properly and train sensibly (which means not increasing your mileage by too much too quickly).

Myth 7: It’s normal to be less active as we age

After many decades of hard work, don’t you deserve to kick up your heels and take it easy in your golden years? Not so. Despite rumours that our ancestors’ lives were nasty, brutish and short, hunter-gatherers who survive childhood typically live about seven decades, and they continue to work moderately as they age. The truth is we evolved to be active grandparents in order to provide food for our children and grandchildren. In turn, staying physically active as we age stimulates myriad repair and maintenance processes that keep our bodies humming. Numerous studies find that exercise is healthier the older we get.

Myth 8: There is an optimal dose/type of exercise

One consequence of medicalising exercise is that we prescribe it. But how much and what type? Many medical professionals follow the World Health Organisation’s recommendation of at least 150 minutes a week of moderate or 75 minutes a week of vigorous exercise for adults. In truth, this is an arbitrary prescription because how much to exercise depends on dozens of factors, such as your fitness, age, injury history and health concerns. Remember this: no matter how unfit you are, even a little exercise is better than none. Just an hour a week (eight minutes a day) can yield substantial dividends. If you can do more, that’s great, but very high doses yield no additional benefits. It’s also healthy to vary the kinds of exercise you do, and do regular strength training as you age.

Myth 9: ‘Just do it’ works


Let’s face it, most people don’t like exercise and have to overcome natural tendencies to avoid it. For most of us, telling us to “just do it” doesn’t work any better than telling a smoker or a substance abuser to “just say no!” To promote exercise, we typically prescribe it and sell it, but let’s remember that we evolved to be physically active for only two reasons: it was necessary or rewarding. So let’s find ways to do both: make it necessary and rewarding. Of the many ways to accomplish this, I think the best is to make exercise social. If you agree to meet friends to exercise regularly you’ll be obliged to show up, you’ll have fun and you’ll keep each other going.

Myth 10: Exercise is a magic bullet

Finally, let’s not oversell exercise as medicine. Although we never evolved to exercise, we did evolve to be physically active just as we evolved to drink water, breathe air and have friends. Thus, it’s the absence of physical activity that makes us more vulnerable to many illnesses, both physical and mental. In the modern, western world we no longer have to be physically active, so we invented exercise, but it is not a magic bullet that guarantees good health. Fortunately, just a little exercise can slow the rate at which you age and substantially reduce your chances of getting a wide range of diseases, especially as you age. It can also be fun – something we’ve all been missing during this dreadful pandemic.

Monday, 25 January 2021

As Joe Biden moves to double the US minimum wage, Australia can't be complacent

Van Badham in The Guardian

When I was writing about minimum wages for the Guardian six years ago, the United States only guaranteed workers US$7.25 an hour as a minimum rate of pay, dropping to a shocking US$2.13 for workers in industries that expect customers to tip (some states have higher minimum wages).

It is now 2021, and yet those federal rates remain exactly the same.

They’ve not moved since 2009. In real terms, America’s minimum wage has been in decline since its purchasing power peaked in 1968. Meanwhile, America’s cost of living has kept going up; the minimum wage is worth less now than it was half a century ago.

Now, new president Joe Biden’s $1.9tn pandemic relief plan proposes a doubling of the US federal minimum wage to $15 an hour.


 
It’s a position advocated both by economists who have studied the comprehensive, positive effects of minimum wage increases across the world, and by the American unions of the “Fight for 15” campaign, who’ve been organising minimum-wage workplaces demanding better for their members.

The logic of these arguments has been accepted across the ideological spectrum of leadership in Biden’s Democratic party. The majority of Biden’s rivals for the Democratic nomination – Bernie Sanders, Elizabeth Warren, Kamala Harris, Pete Buttigieg, Amy Klobuchar, Cory Booker and even billionaire capitalist Mike Bloomberg – are all on record supporting it and in very influential positions to advance it now.

In a 14 January speech, Biden made a simple and powerful case. “No one working 40 hours a week should live below the poverty line,” he said. “If you work for less than $15 an hour and work 40 hours a week, you’re living in poverty.”

And yet the forces opposed to minimum wage increases retain the intensity with which they first fought attempts at its introduction, as far back as the 1890s. America did not adopt the policy until 1938 – 31 years after Australia’s Harvester Decision legislated an explicit right for a family of four “to live in frugal comfort” within our wage standards.

As an Australian, it’s easy to feel smug about our framework. The concept is so ingrained within our basic industrial contract we consume it almost mindlessly, in the manner our cousins might gobble a hotdog in the stands of a Sox game.

But in both cases, the appreciation of the taste depends on your level of distraction from the meat. While wage-earning Australians may tut-tut an American framework that presently allows 7 million people to both hold jobs and live in poverty, local agitation persists for the Americanisation of our own established standards.

When I wrote about minimum wages six years ago, it was in the context of Australia’s Liberal government attempting to erode and compromise them. That government is still in power, and that activism from the Liberals and their spruikers is still present. The Australian Chamber of Commerce and Industry campaigned against minimum wage increases last year. So did the federal government – using the economic downturn of coronavirus as a foil to repeat American mythologies about higher wages causing unemployment increases.


 

They don’t. The “supply side” insistence is that labour is a transactable commodity, and therefore subject to a law of demand in which better-paid jobs equate to fewer employment opportunities … but a neoclassical economic model is not real life.

We know this because some American districts have independently increased their minimum wages over the past few years, and data from places like New York and Seattle has reaffirmed what’s been observed in the UK and internationally. There is no discernible impact on employment when the minimum wage is increased. Any impact on prices is also fleeting.

As Biden presses his case, economists, sociologists and even health researchers have years of additional data to back him up. Repeated studies have found that increasing the minimum wage results in communities having less crime, less poverty, less inequality and more economic growth. One study suggested it helped bring down the suicide rate. Conversely, with greater wage suppression comes more smoking, drinking, eating of fatty foods and poorer health outcomes overall.

Only the threadbare counter-argument remains that improving the income of “burger-flippers” somehow devalues the labour of qualified paramedics, teachers and ironworkers. This is both classist and weak. Removing impediments to collective bargaining and unionisation is actually what enables workers – across all industries – to negotiate an appropriate pay level.

Australians have been living with the comparative benefits of these assumptions for decades, and have been spared the vicissitudes of America’s boom-bust economic cycles in that time.

But after seven years of Liberal government policy actively corroding standards into historic wage stagnation, if Biden’s proposals pass, the American minimum wage will suddenly leapfrog Australia’s, in both real dollar terms and purchasing power.

It’ll be a sad day of realisation for Australia to see the Americans overtake us, while we try to comprehend just why we decided to get left behind.

Friday, 19 June 2020

Why you should go animal-free: 18 arguments for eating meat debunked

Unpalatable as it may be for those wedded to producing and eating meat, the environmental and health evidence for a plant-based diet is clear, writes Damian Carrington in The Guardian

Whether you are concerned about your health, the environment or animal welfare, scientific evidence is piling up that meat-free diets are best. Millions of people in wealthy nations are already cutting back on animal products.

Livestock farmers and meat lovers are, unsurprisingly, fighting back, and it can get confusing. Are avocados really worse than beef? What about bee-massacring almond production?

The coronavirus pandemic has added another ingredient to that mix. The rampant destruction of the natural world is seen as the root cause of diseases leaping into humans and is largely driven by farming expansion. The world’s top biodiversity scientists say even more deadly pandemics will follow unless the ecological devastation is rapidly halted.

Food is also a vital part of our culture, while the affordability of food is an issue of social justice. So there isn’t a single perfect diet. But the evidence is clear: whichever healthy and sustainable diet you choose, it is going to have much less red meat and dairy than today’s standard western diets, and quite possibly none. That’s for two basic reasons.






First, the over-consumption of meat is causing an epidemic of disease, with about $285bn spent every year around the world treating illness caused by eating red meat alone.

Second, eating plants is simply a far more efficient use of the planet’s stretched resources than feeding the plants to animals and then eating them. The global livestock herd and the grain it consumes take up 83% of global farmland, but produce just 18% of food calories.

So what about all those arguments in favour of meat-eating and against vegan diets? Let’s start with the big beef about red meat.

Meaty matters

Claim: Grass-fed beef is low carbon

This is true only when compared to intensively reared beef linked to forest destruction. The UK’s National Farmers Union says UK beef has only half the emissions of the world average. But a lot of research shows grass-fed beef uses more land and produces more – or at best similar – emissions, because grain is easier for cows to digest and intensively reared cows live shorter lives. Both factors mean less methane. Either way, the emissions from even the best beef are still many times those from beans and pulses.

There’s more. If all the world’s pasture lands were returned to natural vegetation, it would remove greenhouse gases equivalent to about 8 bn tonnes of carbon dioxide per year from the atmosphere, according to Joseph Poore at Oxford University. That’s about 15% of the world’s total greenhouse gas emissions. Only a small fraction of that pasture land would be needed to grow food crops to replace the lost beef. So overall, if tackling the climate crisis is your thing, then beef is not.

Claim: Cattle are actually neutral for the climate, because methane is a relatively short-lived greenhouse gas


Cattle graze in a pasture against a backdrop of wind turbines that are part of the 155-turbine Smoky Hill Wind Farm near Vesper, Kan. Photograph: Charlie Riedel/AP

Methane is a very powerful greenhouse gas and ruminants produce a lot of it. But it only remains in the atmosphere for a relatively short time: half is broken down in nine years. This leads some to argue that maintaining the global cattle herd at current levels – about 1 billion animals – is not heating the planet. The burping cows are just replacing the methane that breaks down as time goes by.

But this is simply “creative accounting”, according to Pete Smith at the University of Aberdeen and Andrew Balmford at the University of Cambridge. We shouldn’t argue that cattle farmers can continue to pollute just because they have done so in the past, they say: “We need to do more than just stand still.” In fact, the short-lived nature of methane actually makes reducing livestock numbers a “particularly attractive target”, given that we desperately need to cut greenhouse gas emissions as soon as possible to avoid the worst impacts of the climate crisis.

In any case, just focusing on methane doesn’t make the rampant deforestation by cattle ranchers in South America go away. Even if you ignore methane completely, says Poore, animal products still produce more CO2 than plants. Even one proponent of the methane claim says: “I agree that intensive livestock farming is unsustainable.”

Claim: In many places the only thing you can grow is grass for cattle and sheep

NFU president, Minette Batters, says: “Sixty-five percent of British land is only suitable for grazing livestock and we have the right climate to produce high-quality red meat and dairy.”

“But if everybody were to make the argument that ‘our pastures are the best and should be used for grazing’, then there would be no way to limit global warming,” says Marco Springmann at the University of Oxford. His work shows that a transition to a predominantly plant-based flexitarian diet would free up both pasture and cropland.

The pasture could instead be used to grow trees and lock up carbon, to provide land for rewilding and the restoration of nature, and to grow bio-energy crops to displace fossil fuels. The crops no longer being fed to animals could instead become food for people, increasing a nation’s self-sufficiency in grains.


The Wild Ken Hill project on the Norfolk coast, which is turning around 1,000 acres of marginal farmland and woodland back over to nature. Photograph: Graeme Lyons/Wild Ken Hill/PA

Claim: Grazing cattle help store carbon from the atmosphere in the soil

This is true. The problem is that even in the very best cases, this carbon storage offsets only 20%-60% of the total emissions from grazing cattle. “In other words, grazing livestock – even in a best-case scenario – are net contributors to the climate problem, as are all livestock,” says Tara Garnett, also at the University of Oxford.

Furthermore, research shows this carbon storage reaches its limit in a few decades, while the problem of methane emissions continues. The stored carbon is also vulnerable – a change in land use or even a drought can see it released again. Proponents of “holistic grazing” to trap carbon are also criticised for unrealistic extrapolation of local results to global levels.

Claim: There is much more wildlife in pasture than in monoculture cropland

That is probably true but misses the real point. A huge driver of the global wildlife crisis is the past and continuing destruction of natural habitat to create pasture for livestock. Herbivores do have an important role in ecosystems, but the high density of farmed herds means pasture is worse for wildlife than natural land. Eating less meat means less destruction of wild places and cutting meat significantly would also free up pasture and cropland that could be returned to nature. Furthermore, a third of all cropland is used to grow animal feed.

Claim: We need animals to convert feed into protein humans can eat

There is no lack of protein, despite the claims. In rich nations, people commonly eat 30-50% more protein than they need. All protein needs can easily be met from plant-based sources, such as beans, lentils, nuts and whole grains.

But animals can play a role in some parts of Africa and Asia where, in India for example, waste from grain production can feed cattle that produce milk. In the rest of the world, where much of the cropland that could be used to feed people is actually used to feed animals, a cut in meat eating is still needed for agriculture to be sustainable.

“What about … ?”

Claim: What about soya milk and tofu that is destroying the Amazon?

It’s not. Well over 96% of soy from the Amazon region is fed to cows, pigs and chickens eaten around the world, according to data from the UN Food and Agriculture Organization, says Poore. Furthermore, 97% of Brazilian soy is genetically modified, which is banned for human consumption in many countries and is rarely used to make tofu and soya milk in any case.

Soya milk also has much smaller emissions and land and water use than cow’s milk. If you are worried about the Amazon, not eating meat remains your best bet.

Claim: Almond milk production is massacring bees and turning land into desert

Some almond production may well cause environmental problems. But that is because rising demand has driven rapid intensification in specific places, like California, which could be addressed with proper regulation. It is nothing to do with what almonds need to grow. Traditional almond production in Southern Europe uses no irrigation at all. It is also perhaps worth noting that the bees that die in California are not wild, but raised by farmers like six-legged livestock.

Like soya milk, almond milk still has lower carbon emissions and land and water use than cow’s milk. But if you are still worried, there are plenty of alternatives, with oat milk usually coming out with the lowest environmental footprint.

Claim: Avocados are causing droughts in places

Again, the problem here is the rapid growth of production in specific regions that lack prudent controls on water use, like Peru and Chile. Avocados generate three times less emissions than chicken, four times less than pork, and 20 times less than beef.

If you are still worried about avocados, you can of course choose not to eat them. But it’s not a reason to eat meat instead, which has a much bigger water and deforestation footprint.

The market is likely to solve the problem, as the high demand from consumers for avocados and almonds incentivises farmers elsewhere to grow the crops, thereby alleviating the pressure on current production hotspots.

Claim: Quinoa boom is harming poor farmers in Peru and Bolivia

Quinoa is an amazing food and has seen a boom. But the idea that this took food from the mouths of poor farmers is wrong. “The claim that rising quinoa prices were hurting those who had traditionally produced and consumed it is patently false,” said researchers who studied the issue.

Quinoa was never a staple food, representing just a few percent of the food budget for these people. The quinoa boom has had no effect on their nutrition. The boom also significantly boosted the farmers’ income.

There is an issue with falling soil quality, as the land is worked harder. But quinoa is now planted in China, India and Nepal, as well as in the US and Canada, easing the burden. The researchers are more worried now about the loss of income for South American farmers as the quinoa supply rises and the price falls.

Claim: What about palm oil destroying rainforests and orangutans?

Palm oil plantations have indeed led to terrible deforestation. But that is an issue for everybody, not only vegans: it’s in about half of all products on supermarket shelves, both food and toiletries. The International Union for the Conservation of Nature argues that choosing sustainably produced palm oil is actually positive, because other oil crops take up more land.

But Poore says: “We are abandoning millions of acres a year of oilseed land around the world, including rapeseed and sunflower fields in the former Soviet regions, and traditional olive plantations.” Making better use of this land would be preferable to using palm oil, he says.

Healthy questions

Claim: Vegans don’t get enough B12, making them stupid

A vegan diet is generally very healthy, but doctors have warned about the potential lack of B12, an important vitamin for brain function that is found in meat, eggs and cows’ milk. This is easily remedied by taking a supplement.

However, a closer look reveals some surprises. B12 is made by bacteria in soil and the guts of animals, and free-range livestock ingest the B12 as they graze and peck the ground. But most livestock are not free-range, and pesticides and antibiotics widely used on farms kill the B12-producing bugs. The result is that most B12 supplements – 90% according to one source – are fed to livestock, not people.

So there’s a choice here between taking a B12 supplement yourself, or eating an animal that has been given the supplement. Algae are a plant-based source of B12, although the degree of bio-availability is not settled yet. It is also worth noting that a significant number of non-vegans are B12 deficient, especially older people. Among vegans the figure is only about 10%.

Claim: Plant-based alternatives to meat are really unhealthy 

The rapid rise of the plant-based burger has prompted some to criticise them as ultra-processed junk food. A plant-based burger could be unhealthier if the salt levels are very high, says Springmann, but it is most likely to still be healthier than a meat burger when all nutritional factors are considered, particularly fibre. Furthermore, replacing a beef burger with a plant-based alternative is certain to be less damaging to the environment.

There is certainly a strong argument to be made that overall we eat far too much processed food, but that applies just as much to meat eaters as to vegetarians and vegans. And given that most people are unlikely to give up their burgers and sausages any time soon, the plant-based options are a useful alternative.

‘Catching out’ vegans

Claim: Fruit and vegetables aren’t vegan because they rely on animal manure as fertiliser

Most vegans would say it’s just silly to call fruit and veg animal products, and plenty are produced without animal dung. In any case, there is no reason for horticulture to rely on manure at all. Synthetic fertiliser is easily made from the nitrogen in the air, and there is plenty of organic fertiliser available – in the form of human faeces – if we chose to use it more widely. Over-application of fertiliser does cause water pollution problems in many parts of the world, but that applies to both synthetic fertiliser and manure, and results from bad management.

Claim: Vegan diets kill millions of insects

Piers Morgan is among those railing against “hypocrite” vegans because commercially kept bees die while pollinating almonds and avocados and combine harvesters “create mass murder of bugs” and small mammals while bringing in the grain harvest. But almost everyone eats these foods, not just vegans.

It is true that insects are in a terrible decline across the planet. But the biggest drivers of this are the destruction of wild habitat, largely for meat production, and widespread pesticide use. If it is insects that you are really worried about, then eating a plant-based organic diet is the option to choose.

Claim: Telling people to eat less meat and dairy is denying vital nutrition to the world’s poorest

A “planetary health diet” published by scientists to meet both global health and environmental needs was criticised by journalist Joanna Blythman: “When ideologues living in affluent countries pressurise poor countries to eschew animal foods and go plant-based, they are displaying crass insensitivity, and a colonial White Saviour mindset.”

In fact, says Springmann, who was part of the team behind the planetary health diet, it would improve nutritional intake in all regions, including poorer regions where starchy foods currently dominate diets. The big cuts in meat and dairy are needed in rich nations. In other parts of the world, many healthy, traditional diets are already low in animal products.

On the road

Claim: Transport emissions mean that eating plants from all over the world is much worse than local meat and dairy

“‘Eating local’ is a recommendation you hear often [but] is one of the most misguided pieces of advice,” says Hannah Ritchie, at the University of Oxford. “Greenhouse gas emissions from transportation make up a very small amount of the emissions from food and what you eat is far more important than where your food traveled from.”

Beef and lamb have many times the carbon footprint of most other foods, she says. So whether the meat is produced locally or shipped from the other side of the world, plants will still have much lower carbon footprints. Transport emissions for beef are about 0.5% of the total and for lamb it’s 2%.

The reason for this is that almost all food transported long distances is carried by ships, which can accommodate huge loads and are therefore fairly efficient. For example, the shipping emissions for avocados crossing the Atlantic are about 8% of their total footprint. Air freight does of course result in high emissions, but very little food is transported this way; it accounts for just 0.16% of food miles.

Claim: All the farmers who raise livestock would be unemployed if the world went meat-free

Livestock farming is massively subsidised with taxpayers’ money around the world – unlike vegetables and fruit. That money could be used to support more sustainable foods such as beans and nuts instead, and to pay for other valuable services, such as capturing carbon in woodlands and wetlands, restoring wildlife, cleaning water and reducing flood risks. Shouldn’t your taxes be used to provide public goods rather than harms?

So, food is complicated. But however much we might wish to continue farming and eating as we do today, the evidence is crystal clear that consuming less meat and more plants is very good for both our health and the planet. The fact that some plant crops have problems is not a reason to eat meat instead.

In the end, you will choose what you eat. If you want to eat healthily and sustainably, you don’t have to stop eating meat and dairy altogether. The planetary health diet allows for a beef burger, some fish and an egg each week, and a glass of milk or some cheese each day.

Food writer Michael Pollan foreshadowed the planetary health diet in 2008 with a simple seven-word rule: “Eat food. Not too much. Mostly plants.” But if you want to have the maximum impact on fighting the climate and wildlife crisis, then it is going to be all plants.

Thursday, 14 March 2019

Meritocracy is a myth invented by the rich

The college admissions scandal is a reminder that wealth, not talent, is what determines the opportunities you have in life, writes Nathan Robinson in The Guardian

 
‘There can never be such a thing as a meritocracy, because there’s never going to be fully equal opportunity.’ Photograph: Dan Kitwood/Getty Images


The US college admissions scandal is fascinating, if not surprising. Over 30 wealthy parents have been criminally charged over a scheme in which they allegedly paid a company large sums of money to get their children into top universities. The duplicity involved was extreme: everything from paying off university officials to inventing learning disabilities to facilitate cheating on standardized tests. One father even faked a photo of his son pole vaulting in order to convince admissions officers that the boy was a star athlete.

It’s no secret that wealthy people will do nearly anything to get their kids into good schools. But this scandal only begins to reveal the lies that sustain the American idea of meritocracy. William “Rick” Singer, who admitted to orchestrating the scam, explained that there are three ways in which a student can get into the college of their choice: “There is a front door which is you get in on your own. The back door is through institutional advancement, which is ten times as much money. And I’ve created this side door.” The “side door” he’s referring to is outright crime, literally paying bribes and faking test scores. It’s impossible to know how common that is, but there’s reason to suspect it’s comparatively rare. Why? Because for the most part, the wealthy don’t need to pay illegal bribes. They can already pay perfectly legal ones.


In his 2006 book, The Price of Admission: How America’s Ruling Class Buys Its Way into Elite Colleges, Daniel Golden exposes the way that the top schools favor donors and the children of alumni. A Duke admissions officer recalls being given a box of applications she had intended to reject, but which were returned to her for “special” reconsideration. In cases where parents are expected to give very large donations upon a student’s admission, the applicant may be described as an “institutional development” candidate – letting them in would help develop the institution. Everyone by now is familiar with the way the Kushner family bought little Jared a place at Harvard. It only took $2.5m to convince the school that Jared was Harvard material.

The inequality goes so much deeper than that, though. It’s not just donations that put the wealthy ahead. Children of the top 1% (and the top 5%, and the top 20%) have spent their entire lives accumulating advantages over their counterparts at the bottom. Even in first grade the differences can be stark: compare the learning environment at one of Detroit’s crumbling public elementary schools to that at a private elementary school that costs tens of thousands of dollars a year. There are high schools, such as Phillips Academy in Andover, Massachusetts, that have billion-dollar endowments. Around the country, the level of education you receive depends on how much money your parents have.


Even if we equalized public school funding, and abolished private schools, some children would be far more equal than others. Some 2.5m children go through homelessness every year in the United States. The chaotic living situation that comes with poverty makes it much, much harder to succeed. This means that even those who go through Singer’s “front door” have not “gotten in on their own.” They’ve gotten in partly because they’ve had the good fortune to have a home life conducive to their success.

People often speak about “equality of opportunity” as the American aspiration. But having anything close to equal opportunity would require a radical re-engineering of society from top to bottom. As long as there are large wealth inequalities, there will be colossal differences in the opportunities that children have. No matter what admissions criteria are set, wealthy children will have the advantage. If admissions officers focus on test scores, parents will pay for extra tutoring and test prep courses. If officers focus instead on “holistic” qualities, parents will pay to burnish those as well. It’s simple: wealth always confers greater capacity to give your children the edge over other people’s children. If we wanted anything resembling a “meritocracy,” we’d probably have to start by instituting full egalitarian communism.

In reality, there can never be such a thing as a meritocracy, because there’s never going to be fully equal opportunity. The main function of the concept is to assure elites that they deserve their position in life. It eases the “anxiety of affluence,” that nagging feeling that they might be the beneficiaries of the arbitrary “birth lottery” rather than the products of their own individual ingenuity and hard work.

There’s something perverse about the whole competitive college system. But we can imagine a different world. If everyone was guaranteed free, high-quality public university education, and a public school education matched the quality of a private school education, there wouldn’t be anything to compete for.

Instead of the farce of the admissions process, by which students have to jump through a series of needless hoops in order to prove themselves worthy of being given a good education, just admit everyone who meets a clearly-established threshold for what it takes to do the coursework. It’s not as if the current system is selecting for intelligence or merit. The school you went to mostly tells us what economic class your parents were in. But it doesn’t have to be that way.

Friday, 19 October 2018

The myth of meritocracy: who really gets what they deserve?

Kwame Anthony Appiah in The Guardian


Michael Young was an inconvenient child. His father, an Australian, was a musician and music critic, and his mother, who grew up in Ireland, was a painter of a bohemian bent. They were hard-up, distractible and frequently on the outs with each other. Michael, born in 1915 in Manchester, soon found that neither had much time for him. Once when his parents had seemingly forgotten his birthday, he imagined that he was in for a big end-of-day surprise. But no, they really had forgotten his birthday, which was no surprise at all. He overheard his parents talk about putting him up for adoption and, by his own account, never fully shed his fear of abandonment.

Everything changed for him when, at the age of 14, he was sent to an experimental boarding school at Dartington Hall in Devon. It was the creation of the great progressive philanthropists Leonard and Dorothy Elmhirst, and it sought to change society by changing souls. There it was as if he had been put up for adoption, because the Elmhirsts treated him as a son, encouraging and supporting him for the rest of their lives. Suddenly he was a member of the transnational elite: dining with President Roosevelt, listening in on a conversation between Leonard and Henry Ford.

Young, who has been called the greatest practical sociologist of the past century, pioneered the modern scientific exploration of the social lives of the English working class. He did not just aim to study class, though; he aimed to ameliorate the damage he believed it could do. The Dartington ideal was about the cultivation of personality and aptitudes whatever form they took, and the British class structure plainly impeded this ideal. What would supplant the old, caste-like system of social hierarchy? For many today, the answer is “meritocracy” – a term that Young himself coined 60 years ago. Meritocracy represents a vision in which power and privilege would be allocated by individual merit, not by social origins.

Inspired by the meritocratic ideal, many people these days are committed to a view of how the hierarchies of money and status in our world should be organised. We think that jobs should go not to people who have connections or pedigree, but to those best qualified for them, regardless of their background. Occasionally, we will allow for exceptions – for positive discrimination, say, to help undo the effects of previous discrimination. But such exceptions are provisional: when the bigotries of sex, race, class and caste are gone, the exceptions will cease to be warranted. We have rejected the old class society. In moving toward the meritocratic ideal, we have imagined that we have retired the old encrustations of inherited hierarchies. As Young knew, that is not the real story.

Young hated the term “welfare state” – he said that it smelled of carbolic – but before he turned 30 he had helped create one. As the director of the British Labour party’s research office, he drafted large parts of the manifesto on which the party won the 1945 election. The manifesto, “Let Us Face the Future”, called for “the establishment of the Socialist Commonwealth of Great Britain – free, democratic, efficient, progressive, public-spirited, its material resources organised in the service of the British people”. Soon the party, as it promised, raised the school-leaving age to 15, increased adult education, improved public housing, made public secondary school education free, created a national health service and provided social security for all.

As a result, the lives of the English working class were beginning to change radically for the better. Unions and labour laws reduced the hours worked by manual labourers, increasing their possibilities of leisure. Rising incomes made it possible for them to buy televisions and refrigerators. And changes, partly driven by new estate taxes, were going on at the top of the income hierarchy, too. In 1949, the Labour chancellor of the exchequer, Stafford Cripps, introduced a tax that rose to 80% on estates of £1m and above, or about £32m in contemporary inflation-adjusted terms. (Disclosure: I’m a grandson of his.) For a couple of generations afterward, these efforts at social reform both protected members of the working classes and allowed more of their children to make the move up the hierarchy of occupations and of income, and so, to some degree, of status. Young was acutely conscious of these accomplishments; he was acutely conscious, too, of their limitations.

Just as happened in the US, college attendance shot up in Britain after the second world war, and one of the main indicators of class was increasingly whether you had been to university. The middle-class status of meagerly compensated librarians reflected a vocational requirement for an education beyond secondary school; that the better-paid assembly-line workers were working-class reflected the absence of such a requirement. Working-class consciousness – legible in the very name of the Labour party, founded in 1900 – spoke of class mobilisation, of workers securing their interests. The emerging era of education, by contrast, spoke of class mobility – blue collars giving way to white. Would mobility undermine class consciousness?

These questions preyed on Young. Operating out of a community studies institute he set up in Bethnal Green, he helped create and nurture dozens and dozens of programmes and organisations, all attending to social needs he had identified. The Consumers’ Association was his brainchild, along with its magazine, Which?. So was the Open University, which has taught more than 2 million students since Young founded it in 1969, making it the largest academic institution in the UK by enrolment. Yet education mattered to him not just as a means of mobility, but as a way to make people more forceful as citizens, whatever their station – less easily bulldozed by commercial developers or the government planners of Whitehall. Late in life, he even set up the School for Social Entrepreneurs. Over the decades, he wanted to strengthen the social networks – the “social capital”, as social scientists say these days – of communities that get pushed around by those who were increasingly claiming a lion’s share of society’s power and wealth.

What drove him was his sense that class hierarchies would resist the reforms he helped implement. He explained how it would happen in a 1958 satire, his second best-seller, entitled The Rise of the Meritocracy. Like so many phenomena, meritocracy was named by an enemy. Young’s book was ostensibly an analysis written in 2033 by a historian looking back at the development over the decades of a new British society. In that distant future, riches and rule were earned, not inherited. The new ruling class was determined, the author wrote, by the formula “IQ + effort = merit”. Democracy would give way to rule by the cleverest – “not an aristocracy of birth, not a plutocracy of wealth, but a true meritocracy of talent.” This is the first published appearance of the word “meritocracy”, and the book aimed to show what a society governed on this principle would look like.

 
‘Education mattered to Young not just as a means of mobility, but as a way to make people more forceful as citizens, whatever their station.’ Photograph: Getty

Young’s vision was decidedly dystopian. As wealth increasingly reflects the innate distribution of natural talent, and the wealthy increasingly marry one another, society sorts into two main classes, in which everyone accepts that they have more or less what they deserve. He imagined a country in which “the eminent know that success is a just reward for their own capacity, their own efforts”, and in which the lower orders know that they have failed every chance they were given. “They are tested again and again … If they have been labelled ‘dunce’ repeatedly they cannot any longer pretend; their image of themselves is more nearly a true, unflattering reflection.”

But one immediate difficulty was that, as Young’s narrator concedes, “nearly all parents are going to try to gain unfair advantages for their offspring”. And when you have inequalities of income, one thing people can do with extra money is to pursue that goal. If the financial status of your parents helped determine your economic rewards, you would no longer be living by the formula that “IQ + effort = merit”.

Those cautions have, of course, proved well founded. In the US, the top fifth of households enjoyed a $4tn increase in pretax income between 1979 and 2013 – $1tn more than came to all the rest. When increased access to higher education was introduced in the US and Britain, it was seen as a great equaliser. But a couple of generations later, researchers tell us that higher education is now a great stratifier. Economists have found that many elite US universities – including Brown, Dartmouth, Penn, Princeton, and Yale – take more students from the top 1% of the income distribution than from the bottom 60%. To achieve a position in the top tier of wealth, power and privilege, in short, it helps enormously to start there. “American meritocracy,” the Yale law professor Daniel Markovits argues, has “become precisely what it was invented to combat: a mechanism for the dynastic transmission of wealth and privilege across generations.”

Young, who died in 2002 at the age of 86, saw what was happening. “Education has put its seal of approval on a minority,” he wrote, “and its seal of disapproval on the many who fail to shine from the time they are relegated to the bottom streams at the age of seven or before.” What should have been mechanisms of mobility had become fortresses of privilege. He saw an emerging cohort of mercantile meritocrats who can be insufferably smug, much more so than the people who knew they had achieved advancement not on their own merit but because they were, as somebody’s son or daughter, the beneficiaries of nepotism. The newcomers can actually believe they have morality on their side. So assured have the elite become that there is almost no block on the rewards they arrogate to themselves.

The carapace of “merit”, Young argued, had only inoculated the winners against shame and reproach.

Americans, unlike the British, don’t talk much about working-class consciousness; it is sometimes said that all Americans are, by self-conception, middle class. But this, it turns out, is not currently what Americans themselves think. In a 2014 National Opinion Research Center survey, more Americans identified as working-class than as middle-class. One (but only one) strand of the populism that tipped Donald Trump into power expressed resentment toward a class defined by its education and its values: the cosmopolitan, degree-laden people who dominate the media, the public culture and the professions in the US. Clinton swept the 50 most educated counties, as Nate Silver noted shortly after the 2016 election; Trump swept the 50 least. Populists think that liberal elites look down on ordinary Americans, ignore their concerns and use their power to their own advantage. They may not call them an upper class, but the indices that populists use to define them – money, education, connections, power – would have picked out the old upper and upper-middle classes of the last century.

And many white working-class voters feel a sense of subordination derived from a lack of formal education, and that sense can play a part in their politics. Back in the early 1970s, the sociologists Richard Sennett and Jonathan Cobb recorded these attitudes in a study memorably titled The Hidden Injuries of Class. This sense of vulnerability is perfectly consistent with feeling superior in other ways. Working-class men often think that middle-class and upper-class men are unmanly or undeserving. Still, a significant portion of what we call the American white working class has been persuaded that, in some sense, they do not deserve the opportunities that have been denied to them.

They may complain that minorities have unfair advantages in the competition for work and the distribution of government benefits. Nevertheless, they do not think it is wrong either that they do not get jobs for which they believe they are not qualified, or that the jobs for which they are qualified are typically less well paid. They think minorities are getting “handouts” – and men may feel that women are getting unfair advantages, too – but they don’t think the solution is to demand handouts for themselves. They are likely to regard the treatment of racial minorities as an exception to the right general rule: they think the US mostly is and certainly should be a society in which opportunities belong to those who have earned them.

 
A still from an Open University maths lecture, first broadcast in January 1971. Photograph: Open University/PA

If a new dynastic system is nonetheless taking shape, you might conclude that meritocracy has faltered because, as many complain, it isn’t meritocratic enough. If talent is capitalised efficiently only in high tax brackets, you could conclude that we have simply failed to achieve the meritocratic ideal. Maybe it is not possible to give everyone equally good parenting, but you could push more rigorously for merit, making sure every child has the educational advantages and is taught the social tricks that successful families now hoard for their children. Why isn’t that the right response?

Because, Young believed, the problem was not just with how the prizes of social life were distributed; it was with the prizes themselves. A system of class filtered by meritocracy would, in his view, still be a system of class: it would involve a hierarchy of social respect, granting dignity to those at the top, but denying respect and self-respect to those who did not inherit the talents and the capacity for effort that, combined with proper education, would give them access to the most highly remunerated occupations. This is why the authors of his fictional Chelsea Manifesto – which, in The Rise of the Meritocracy, is supposed to serve as the last sign of resistance to the new order – ask for a society that “both possessed and acted upon plural values”, including kindliness, courage and sensitivity, so all had a chance to “develop his own special capacities for leading a rich life”. Even if you could somehow uphold “IQ + effort = merit”, the equation would still be sponsoring a larger inequality.

This alternative vision, in which each of us takes our allotment of talents and pursues a distinctive set of achievements and the self-respect they bring, was one that Young had learned from his schooling at Dartington Hall. And his profound commitment to social equality can seem, in the mode of schoolhouse utopias, quixotic. Yet it draws on a deeper philosophical picture. The central task of ethics is to ask what it is for a human life to go well. A plausible answer is that living well means meeting the challenge set by three things: your capacities, the circumstances into which you were born, and the projects that you yourself decide are important. Because each of us comes equipped with different talents and is born into different circumstances, and because people choose their own projects, each of us faces his or her own challenge. There is no comparative measure that would enable an assessment of whether your life or my life is better; Young was right to protest the idea that “people could be put into rank order of worth”. What matters in the end is not how we rank against others. We do not need to find something that we do better than anyone else; what matters, to the Dartingtonians, is simply that we do our best.

The ideal of meritocracy, Young understood, confuses two different concerns. One is a matter of efficiency; the other is a question of human worth. If we want people to do difficult jobs that require talent, education, effort, training and practice, we need to be able to identify candidates with the right combination of aptitude and willingness and provide them incentives to train and practice.


 Michael Young in 1949. Photograph: Getty

Because there will be a limited supply of educational and occupational opportunities, we will have to have ways of allocating them – some principles of selection to match people to positions, along with appropriate incentives to ensure the necessary work gets done. If these principles of selection have been reasonably designed, we can say, if we like, that the people who meet the criteria for entering the schools or getting the jobs “merit” those positions. This is, to enlist some useful philosophers’ jargon, a matter of “institutional desert”. People deserve these positions in the sense in which people who buy winning lottery tickets deserve their winnings: they got them by a proper application of the rules.

Institutional desert, however, has nothing to do with the intrinsic worthiness of the people who get into college or who get the jobs, any more than lottery winners are people of special merit and losers are somehow less worthy. Even on the highest levels of achievement, there is enormous contingency at play. If Einstein had been born a century earlier, he might have made no momentous contributions to his field; a Mozart who came of age in the early 20th century and trained on 12-tone rows might not have done so either. Neither might have made much use of their aptitudes had they grown up among the Amazonian Nukak.

And, of course, the capacity for hard work is itself the result of natural endowments and upbringing. So neither talent nor effort, the two things that would determine rewards in the world of the meritocracy, is itself something earned. People who have, as The Rise of the Meritocracy bluntly put it, been repeatedly “labelled ‘dunce’” still have capacities and the challenge of making a meaningful life. The lives of the less successful are not less worthy than those of others, but not because they are as worthy or more worthy. There is simply no sensible way of comparing the worth of human lives.

Put aside the vexed notion of “merit”, and a simpler picture emerges. Money and status are rewards that can encourage people to do the things that need doing. A well-designed society will elicit and deploy developed talent efficiently. The social rewards of wealth and honour are inevitably going to be unequally shared, because that is the only way they can serve their function as incentives for human behaviour. But we go wrong when we deny not only the merit but the dignity of those whose luck in the genetic lottery and in the historical contingencies of their situation has left them less rewarded.

Yes, people will inevitably want to share both money and status with those they love, seeking to get their children financial and social rewards. But we should not secure our children’s advantages in a way that denies a decent life to the children of others. Each child should have access to a decent education, suitable to her talents and her choices; each should be able to regard him- or herself with self-respect. Further democratising the opportunities for advancement is something we know how to do, even if the state of current politics in Britain and the US has made it increasingly unlikely that it will be done anytime soon. But such measures were envisaged in Young’s meritocratic dystopia, where inheritance was to hold little sway. His deeper point was that we also need to apply ourselves to something we do not yet quite know how to do: to eradicate contempt for those who are disfavoured by the ethic of effortful competition.

“It is good sense to appoint individual people to jobs on their merit,” Young wrote. “It is the opposite when those who are judged to have merit of a particular kind harden into a new social class without room in it for others.” The goal is not to eradicate hierarchy and to turn every mountain into a salt flat; we live in a plenitude of incommensurable hierarchies, and the circulation of social esteem will always benefit the better novelist, the more important mathematician, the savvier businessman, the faster runner, the more effective social entrepreneur. We cannot fully control the distribution of economic, social and human capital, or eradicate the intricate patterns that emerge from these overlaid grids. But class identities do not have to internalise those injuries of class. It remains an urgent collective endeavour to revise the ways we think about human worth in the service of moral equality.

This can sound utopian, and, in its fullest conception, it undoubtedly is. Yet nobody was more practical-minded than Young, institution-builder par excellence. It is true that the stirrings of Young’s conscience responded to the personal as well as the systemic; dying of cancer in a hospital ward, he worried whether the contractor-supplied African immigrants who wheeled around the food trolleys were getting minimum wage. But his compassion was welded to a sturdy sense of the possible. He did not merely dream of reducing inherited privilege; he devised concrete measures to see that it happened, in the hope that all citizens could have the chance to develop their “own special capacities for leading a rich life”. He had certainly done exactly that himself. In the imaginary future of The Rise of the Meritocracy, there was still a House of Lords, but it was occupied solely by people who had earned their places there through distinguished public service. If anyone had merited a place in that imaginary legislature, it would have been him.


That was far from true of the House of Lords he grew up with, which was probably one reason why his patron Leonard Elmhirst declined a peerage when offered one in the 1940s; in the circles he moved in, he made clear, “acceptance would neither be easy for me to explain nor easy for my friends to comprehend”. So it is more than a little ironic that when Young, the great egalitarian, was offered a peerage in 1978, he took it. Naturally, he chose for himself the title Baron Young of Dartington, honouring the institution he had served as a trustee since the age of 27. As you would expect, he used the opportunity to speak about the issues that moved him in the upper house of the British parliament. But there is a further, final irony. A major reason he had accepted the title (“guardedly”, as he told his friends) was that he was having difficulties meeting the expense of travelling up to London from his home in the country. Members of the Lords not only got a daily allowance if they attended the house; they got a pass to travel free on the railways. Michael Young entered the aristocracy because he needed the money.

Saturday, 15 September 2018

The myth of freedom

Yuval Noah Harari in The Guardian


Should scholars serve the truth, even at the cost of social harmony? Should you expose a fiction even if that fiction sustains the social order? In writing my latest book, 21 Lessons for the 21st Century, I had to struggle with this dilemma with regard to liberalism.

On the one hand, I believe that the liberal story is flawed, that it does not tell the truth about humanity, and that in order to survive and flourish in the 21st century we need to go beyond it. On the other hand, at present the liberal story is still fundamental to the functioning of the global order. What’s more, liberalism is now attacked by religious and nationalist fanatics who believe in nostalgic fantasies that are far more dangerous and harmful. 

So should I speak my mind openly, risking that my words could be taken out of context and used by demagogues and autocrats to further attack the liberal order? Or should I censor myself? It is a mark of illiberal regimes that they make free speech more difficult even outside their borders. Due to the spread of such regimes, it is becoming increasingly dangerous to think critically about the future of our species.

I eventually chose free discussion over self-censorship, thanks to my belief both in the strength of liberal democracy and in the necessity to revamp it. Liberalism’s great advantage over other ideologies is that it is flexible and undogmatic. It can sustain criticism better than any other social order. Indeed, it is the only social order that allows people to question even its own foundations. Liberalism has already survived three big crises – the first world war, the fascist challenge in the 1930s, and the communist challenge in the 1950s-70s. If you think liberalism is in trouble now, just remember how much worse things were in 1918, 1938 or 1968.


The main challenge liberalism faces today comes not from fascism or communism but from the laboratories


In 1968, liberal democracies seemed to be an endangered species, and even within their own borders they were rocked by riots, assassinations, terrorist attacks and fierce ideological battles. If you happened to be amid the riots in Washington on the day after Martin Luther King was assassinated, or in Paris in May 1968, or at the Democratic party’s convention in Chicago in August 1968, you might well have thought that the end was near. While Washington, Paris and Chicago were descending into chaos, Moscow and Leningrad were tranquil, and the Soviet system seemed destined to endure for ever. Yet 20 years later it was the Soviet system that collapsed. The clashes of the 1960s strengthened liberal democracy, while the stifling climate in the Soviet bloc presaged its demise.

So we hope liberalism can reinvent itself yet again. But the main challenge it faces today comes not from fascism or communism, and not even from the demagogues and autocrats that are spreading everywhere like frogs after the rains. This time the main challenge emerges from the laboratories.

Liberalism is founded on the belief in human liberty. Unlike rats and monkeys, human beings are supposed to have “free will”. This is what makes human feelings and human choices the ultimate moral and political authority in the world. Liberalism tells us that the voter knows best, that the customer is always right, and that we should think for ourselves and follow our hearts.



Unfortunately, “free will” isn’t a scientific reality. It is a myth inherited from Christian theology. Theologians developed the idea of “free will” to explain why God is right to punish sinners for their bad choices and reward saints for their good choices. If our choices aren’t made freely, why should God punish or reward us for them? According to the theologians, it is reasonable for God to do so, because our choices reflect the free will of our eternal souls, which are independent of all physical and biological constraints.

This myth has little to do with what science now teaches us about Homo sapiens and other animals. Humans certainly have a will – but it isn’t free. You cannot decide what desires you have. You don’t decide to be introvert or extrovert, easy-going or anxious, gay or straight. Humans make choices – but they are never independent choices. Every choice depends on a lot of biological, social and personal conditions that you cannot determine for yourself. I can choose what to eat, whom to marry and whom to vote for, but these choices are determined in part by my genes, my biochemistry, my gender, my family background, my national culture, etc – and I didn’t choose which genes or family to have.

 
Hacked … biometric sensors could allow corporations direct access to your inner world. Photograph: Alamy Stock Photo

This is not abstract theory. You can witness this easily. Just observe the next thought that pops up in your mind. Where did it come from? Did you freely choose to think it? Obviously not. If you carefully observe your own mind, you come to realise that you have little control of what’s going on there, and you are not choosing freely what to think, what to feel, and what to want.

Though “free will” was always a myth, in previous centuries it was a helpful one. It emboldened people who had to fight against the Inquisition, the divine right of kings, the KGB and the KKK. The myth also carried few costs. In 1776 or 1945 there was relatively little harm in believing that your feelings and choices were the product of some “free will” rather than the result of biochemistry and neurology.

But now the belief in “free will” suddenly becomes dangerous. If governments and corporations succeed in hacking the human animal, the easiest people to manipulate will be those who believe in free will.

In order to successfully hack humans, you need two things: a good understanding of biology, and a lot of computing power. The Inquisition and the KGB lacked this knowledge and power. But soon, corporations and governments might have both, and once they can hack you, they can not only predict your choices, but also reengineer your feelings. To do so, corporations and governments will not need to know you perfectly. That is impossible. They will just have to know you a little better than you know yourself. And that is not impossible, because most people don’t know themselves very well.

If you believe in the traditional liberal story, you will be tempted simply to dismiss this challenge. “No, it will never happen. Nobody will ever manage to hack the human spirit, because there is something there that goes far beyond genes, neurons and algorithms. Nobody could successfully predict and manipulate my choices, because my choices reflect my free will.” Unfortunately, dismissing the challenge won’t make it go away. It will just make you more vulnerable to it.

It starts with simple things. As you surf the internet, a headline catches your eye: “Immigrant gang rapes local women”. You click on it. At exactly the same moment, your neighbour is surfing the internet too, and a different headline catches her eye: “Trump prepares nuclear strike on Iran”. She clicks on it. Both headlines are fake news stories, generated perhaps by Russian trolls, or by a website keen on increasing traffic to boost its ad revenues. Both you and your neighbour feel that you clicked on these headlines out of your free will. But in fact you have been hacked.


If governments succeed in hacking the human animal, the easiest people to manipulate will be those who believe in free will

Propaganda and manipulation are nothing new, of course. But whereas in the past they worked like carpet bombing, now they are becoming precision-guided munitions. When Hitler gave a speech on the radio, he aimed at the lowest common denominator, because he couldn’t tailor his message to the unique weaknesses of individual brains. Now it has become possible to do exactly that. An algorithm can tell that you already have a bias against immigrants, while your neighbour already dislikes Trump, which is why you see one headline while your neighbour sees an altogether different one. In recent years some of the smartest people in the world have worked on hacking the human brain in order to make you click on ads and sell you stuff. Now these methods are being used to sell you politicians and ideologies, too.
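To make that mechanism concrete, here is a minimal, hypothetical sketch of the kind of preference-based targeting described above. Everything in it – the user profiles, the headline pool, the scoring rule – is invented for illustration and stands in for the far more elaborate models real platforms use.

```python
# Toy illustration of preference-based headline targeting.
# All profiles, headlines and weights are invented for the example.

from dataclasses import dataclass, field


@dataclass
class UserProfile:
    name: str
    # Inferred interest scores per topic (e.g. from past clicks), 0.0-1.0.
    biases: dict = field(default_factory=dict)


HEADLINES = [
    {"text": "Immigrant gang rapes local women", "topics": {"immigration": 1.0}},
    {"text": "Trump prepares nuclear strike on Iran", "topics": {"anti-trump": 1.0}},
    {"text": "Local bakery wins national award", "topics": {"human-interest": 1.0}},
]


def score(headline, profile):
    # A headline scores highly when its topics match the user's inferred biases.
    return sum(profile.biases.get(topic, 0.0) * weight
               for topic, weight in headline["topics"].items())


def pick_headline(profile):
    # Show each user the headline most likely to provoke a click.
    return max(HEADLINES, key=lambda h: score(h, profile))


you = UserProfile("you", biases={"immigration": 0.9})
neighbour = UserProfile("neighbour", biases={"anti-trump": 0.8})

print(pick_headline(you)["text"])        # "Immigrant gang rapes local women"
print(pick_headline(neighbour)["text"])  # "Trump prepares nuclear strike on Iran"
```

Even a toy scorer like this shows why two neighbours can end up looking at entirely different front pages: the selection is driven by what the system has already inferred about each of them, not by any choice they experience themselves making.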

And this is just the beginning. At present, the hackers rely on analysing signals and actions in the outside world: the products you buy, the places you visit, the words you search for online. Yet within a few years biometric sensors could give hackers direct access to your inner world, and they could observe what’s going on inside your heart. Not the metaphorical heart beloved by liberal fantasies, but rather the muscular pump that regulates your blood pressure and much of your brain activity. The hackers could then correlate your heart rate with your credit card data, and your blood pressure with your search history. What would the Inquisition and the KGB have done with biometric bracelets that constantly monitor your moods and affections? Stay tuned.

Liberalism has developed an impressive arsenal of arguments and institutions to defend individual freedoms against external attacks from oppressive governments and bigoted religions, but it is unprepared for a situation when individual freedom is subverted from within, and when the very concepts of “individual” and “freedom” no longer make much sense. In order to survive and prosper in the 21st century, we need to leave behind the naive view of humans as free individuals – a view inherited from Christian theology as much as from the modern Enlightenment – and come to terms with what humans really are: hackable animals. We need to know ourselves better. 

Of course, this is hardly new advice. From ancient times, sages and saints repeatedly advised people to “know thyself”. Yet in the days of Socrates, the Buddha and Confucius, you didn’t have real competition. If you neglected to know yourself, you were still a black box to the rest of humanity. In contrast, you now have competition. As you read these lines, governments and corporations are striving to hack you. If they get to know you better than you know yourself, they can then sell you anything they want – be it a product or a politician.

It is particularly important to get to know your weaknesses. They are the main tools of those who try to hack you. Computers are hacked through pre-existing faulty code lines. Humans are hacked through pre-existing fears, hatreds, biases and cravings. Hackers cannot create fear or hatred out of nothing. But when they discover what people already fear and hate it is easy to push the relevant emotional buttons and provoke even greater fury.

If people cannot get to know themselves by their own efforts, perhaps the same technology the hackers use can be turned around to protect us. Just as your computer has an antivirus program that screens for malware, maybe we need an antivirus for the brain. Your AI sidekick will learn by experience that you have a particular weakness – whether for funny cat videos or for infuriating Trump stories – and will block them on your behalf.
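A hedged sketch of what such a personal filter might look like in its simplest form, assuming a feed of items already tagged by topic; the class name, the topic labels and the "compulsion threshold" are all invented for illustration.

```python
# A toy "antivirus for the brain": a personal filter that learns which
# topics a user compulsively clicks on and starts blocking them.
# Topic names and the threshold are invented for illustration.

from collections import Counter


class AttentionGuard:
    def __init__(self, compulsion_threshold=3):
        self.clicks = Counter()              # observed clicks per topic
        self.threshold = compulsion_threshold

    def record_click(self, topic):
        """Learn from experience: count how often each topic hooks the user."""
        self.clicks[topic] += 1

    def allow(self, item):
        """Block items on topics the user has shown a compulsive pattern for."""
        return self.clicks[item["topic"]] < self.threshold


guard = AttentionGuard()
for _ in range(3):
    guard.record_click("infuriating-trump-story")   # the user keeps clicking

feed = [
    {"title": "Yet another infuriating Trump story", "topic": "infuriating-trump-story"},
    {"title": "Long read on soil ecology", "topic": "science"},
]

visible = [item for item in feed if guard.allow(item)]
print([item["title"] for item in visible])   # only the soil ecology piece survives
```

It is, of course, the same targeting machinery as before, simply pointed in the other direction: instead of exploiting an inferred weakness, it screens for it on the user's behalf.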


You feel that you clicked on these headlines out of your free will, but in fact you have been hacked. Photograph: Getty images

But all this is really just a side issue. If humans are hackable animals, and if our choices and opinions don’t reflect our free will, what should the point of politics be? For 300 years, liberal ideals inspired a political project that aimed to give as many individuals as possible the ability to pursue their dreams and fulfil their desires. We are now closer than ever to realising this aim – but we are also closer than ever to realising that this has all been based on an illusion. The very same technologies that we have invented to help individuals pursue their dreams also make it possible to re-engineer those dreams. So how can I trust any of my dreams?

From one perspective, this discovery gives humans an entirely new kind of freedom. Previously, we identified very strongly with our desires, and sought the freedom to realise them. Whenever any thought appeared in the mind, we rushed to do its bidding. We spent our days running around like crazy, carried by a furious rollercoaster of thoughts, feelings and desires, which we mistakenly believed represented our free will. What happens if we stop identifying with this rollercoaster? What happens when we carefully observe the next thought that pops up in our mind and ask: “Where did that come from?”

For starters, realising that our thoughts and desires don’t reflect our free will can help us become less obsessive about them. If I see myself as an entirely free agent, choosing my desires in complete independence from the world, it creates a barrier between me and all other entities. I don’t really need any of those other entities – I am independent. It simultaneously bestows enormous importance on my every whim – after all, I chose this particular desire out of all possible desires in the universe. Once we give so much importance to our desires, we naturally try to control and shape the whole world according to them. We wage wars, cut down forests and unbalance the entire ecosystem in pursuit of our whims. But if we understood that our desires are not the outcome of free choice, we would hopefully be less preoccupied with them, and would also feel more connected to the rest of the world.


If we understood that our desires are not the outcome of free choice, we would hopefully be less preoccupied with them

People sometimes imagine that if we renounce our belief in “free will”, we will become completely apathetic, and just curl up in some corner and starve to death. In fact, renouncing this illusion can have two opposite effects: first, it can create a far stronger link with the rest of the world, and make you more attentive to your environment and to the needs and wishes of others. It is like when you have a conversation with someone. If you focus on what you want to say, you hardly really listen. You just wait for the opportunity to give the other person a piece of your mind. But when you put your own thoughts aside, you can suddenly hear other people.

Second, renouncing the myth of free will can kindle a profound curiosity. If you strongly identify with the thoughts and desires that emerge in your mind, you don’t need to make much effort to get to know yourself. You think you already know exactly who you are. But once you realise “Hi, this isn’t me. This is just some changing biochemical phenomenon!” then you also realise you have no idea who – or what – you actually are. This can be the beginning of the most exciting journey of discovery any human can undertake.



There is nothing new about doubting free will or about exploring the true nature of humanity. We humans have had this discussion a thousand times before. But we never had the technology before. And the technology changes everything. Ancient problems of philosophy are now becoming practical problems of engineering and politics. And while philosophers are very patient people – they can argue about something inconclusively for 3,000 years – engineers are far less patient. Politicians are the least patient of all.

How does liberal democracy function in an era when governments and corporations can hack humans? What’s left of the beliefs that “the voter knows best” and “the customer is always right”? How do you live when you realise that you are a hackable animal, that your heart might be a government agent, that your amygdala might be working for Putin, and that the next thought that emerges in your mind might well be the result of some algorithm that knows you better than you know yourself? These are the most interesting questions humanity now faces.

Unfortunately, these are not the questions most humans ask. Instead of exploring what awaits us beyond the illusion of “free will”, people all over the world are now retreating to find shelter with even older illusions. Instead of confronting the challenge of AI and bioengineering, many are turning to religious and nationalist fantasies that are even less in touch with the scientific realities of our time than liberalism. Instead of fresh political models, what’s on offer are repackaged leftovers from the 20th century or even the middle ages.

When you try to engage with these nostalgic fantasies, you find yourself debating such things as the veracity of the Bible and the sanctity of the nation (especially if you happen, like me, to live in a place like Israel). As a scholar, I find this a disappointment. Arguing about the Bible was hot stuff in the age of Voltaire, and debating the merits of nationalism was cutting-edge philosophy a century ago – but in 2018 it seems a terrible waste of time. AI and bioengineering are about to change the course of evolution itself, and we have just a few decades to figure out what to do with them. I don’t know where the answers will come from, but they are definitely not coming from a collection of stories written thousands of years ago.

So what to do? We need to fight on two fronts simultaneously. We should defend liberal democracy, not only because it has proved to be a more benign form of government than any of its alternatives, but also because it places the fewest limitations on debating the future of humanity. At the same time, we need to question the traditional assumptions of liberalism, and develop a new political project that is better in line with the scientific realities and technological powers of the 21st century.

Greek mythology tells that Zeus and Poseidon, two of the greatest gods, competed for the hand of the goddess Thetis. But when they heard the prophecy that Thetis would bear a son more powerful than his father, both withdrew in alarm. Since gods plan on sticking around for ever, they don’t want a more powerful offspring to compete with them. So Thetis married a mortal, King Peleus, and gave birth to Achilles. Mortals do like their children to outshine them. This myth might teach us something important. Autocrats who plan to rule in perpetuity don’t like to encourage the birth of ideas that might displace them. But liberal democracies inspire the creation of new visions, even at the price of questioning their own foundations.