
Thursday 28 December 2017

I used to think people made rational decisions. But now I know I was wrong

Deborah Orr in The Guardian

It’s been coming on for a while, so I can’t claim any eureka moment. But something did crystallise this year. What I changed my mind about was people. More specifically, I realised that people cannot be relied upon to make rational choices. We would have fixed global warming by now if we were rational. Instead, there’s a stubborn refusal to let go of the idea that environmental degradation is a “debate” in which there are “two sides”.

Debating is a great way of exploring an issue when there is real room for doubt and nuance. But when a conclusion can be reached simply by assembling a mountain of known facts, debating is just a means of pitting the rational against the irrational. 

Humans like to think we are rational. Some of us are more rational than others. But, essentially, we are all slaves to our feelings and emotions. The trick is to realise this, and be sure to guard against it. It’s something that, in the modern world, we are not good at. Authentic emotions are valued over dry, dull authentic evidence at every turn.

I think that as individuality has become fetishised, our belief in our right to make half-formed snap judgments, based on little more than how we feel, has become problematically unchallengeable. When Uma Thurman declared that she would wait for her anger to abate before she spoke about Harvey Weinstein, it was, I believe, in recognition of this tendency to speak first and think later.

Good for her. The value of calm reasoning is not something that one sees acknowledged very often at the moment. Often, the feelings and emotions that form the basis of important views aren’t so very fine. Sometimes humans understand and control their emotions so little that these emotions sooner or later coagulate into a roiling soup of anxiety, fear, sadness, self-loathing, resentment and anger, which expresses itself however it can, finding objects to project its hurt and confusion on to. Like immigrants. Or transsexuals. Or liberals. Or Tories. Or women. Or men.

Even if the desire to find living, breathing scapegoats is resisted, untrammelled emotion can result in unwise and self-defeating decisions, devoid of any rationality. Rationality is a tool we have created to govern our emotions. That’s why education, knowledge, information is the cornerstone of democracy. And that’s why despots love ignorance.

Sometimes we can identify and harness the emotions we need to get us through the thing we know, rationally, that we have to do. It’s great when you’re in the zone. Even negative emotions can be used rationally. I, for example, use anger a lot in my work. I’m writing on it at this moment, just as much as I’m writing on a computer. I’ll stop in a moment. I’ll reach for facts to calm myself. I’ll reach for facts to make my emotions seem rational. Or maybe that’s just me. Whatever that means.


“‘Consciousness’ involves no executive or causal relationship with any of the psychological processes attributed to it” – David Oakley and Peter Halligan


It’s a fact that I can find some facts to back up my feelings about people. Just writing that down helps me to feel secure and in control. The irrationality of humans has been considered a fact since the 1970s, when two psychologists, Amos Tversky and Daniel Kahneman, showed that human decisions were often completely irrational, not at all in their own interests and based on “cognitive biases”. Their ideas were a big deal, and also formed the basis of Michael Lewis’s book, The Undoing Project.

More recent research – or more recent theory, to be precise – has rendered even Tversky and Kahneman’s ideas about the unreliability of the human mind overly rational.

Chasing the Rainbow: The Non-Conscious Nature of Being is a research paper from University College London and Cardiff University. Its authors, David Oakley and Peter Halligan, argue “that ‘consciousness’ contains no top-down control processes and that ‘consciousness’ involves no executive, causal, or controlling relationship with any of the familiar psychological processes conventionally attributed to it”.

Which can only mean that even when we think we’re being rational, we’re not even really thinking. That thing we call thinking – we don’t even know what it really is.

When I started out in journalism, opinion columns weren’t a big thing. Using the word “I” in journalism was frowned upon. The dispassionate dissemination of facts was the goal to be reached for.

Now so much opinion is published, in print and online, and so many people offer their opinions about the opinions, that people in our government feel comfortable in declaring that experts are overrated, and the president of the United States regularly says that anything he doesn’t like is “fake news”.

So, people. They’re a problem. That’s what I’ve decided. I’m part of a big problem. All I can do now is get my message out there.

Thursday 9 February 2017

How three students caused a global crisis in economics - A review of The Econocracy

Aditya Chakrabortty in The Guardian

In the autumn of 2011, as the world’s financial system lurched from crash to crisis, the authors of this book began, as undergraduates, to study economics. While their lectures took place at the University of Manchester, the eurozone was in flames. The students’ first term would last longer than the Greek government. Banks across the west were still on life support. And David Cameron was imposing on Britons year after year of swingeing spending cuts.

Yet the bushfires those teenagers saw raging each night on the news got barely a mention in the seminars they sat through, they say: the biggest economic catastrophe of our times “wasn’t mentioned in our lectures and what we were learning didn’t seem to have any relevance to understanding it”, they write in The Econocracy. “We were memorising and regurgitating abstract economic models for multiple-choice exams.”

Part of this book describes what happened next: how the economic crisis turned into a crisis of economics. It deserves a good account, since the activities of these Manchester students rank among the most startling protest movements of the decade.

After a year of being force-fed irrelevancies, say the students, they formed the Post-Crash Economics Society, with a sympathetic lecturer giving them evening classes on the events and perspectives they weren’t being taught. They lobbied teachers for new modules, and when that didn’t work, they mobilised hundreds of undergraduates to express their disappointment in the influential National Student Survey. The economics department ended up with the lowest score of any at the university: the professors had been told by their pupils that they could do better.

The protests spread to other economics faculties – in Glasgow, Istanbul, Kolkata. Working at speed, students around the world published a joint letter to their professors calling for nothing less than a reformation of their discipline.

Economics has been challenged by would-be reformers before, but never on this scale. What made the difference was the crash of 2008. Students could now argue that their lecturers hadn’t called the biggest economic event of their lifetimes – so their commandments weren’t worth the stone they were carved on. They could also point to the way in which the economic model in the real world was broken and ask why the models they were using had barely changed.

The protests found an attentive audience among fellow undergraduates – the sort who in previous years would have kept their heads down and waited for the “milk round” to deliver an accountancy traineeship, but were now facing the prospect of hiring freezes, moving back home and paying off their giant student debt with poor wages.

I covered this uprising from the outset, and later served as an unpaid trustee for the network now called Rethinking Economics. To me, it has two key features in common with other social movements that sprang up in the aftermath of the banking crash. Like the Occupy protests, it was ultimately about democracy: who gets to have a say, and who gets silenced. It also shared with the student fees protests of 2010 deep discomfort at the state of modern British universities. What are supposed to be forums for speculative thought more often resemble costly finishing schools for the sons of Chinese communist party cadres and the daughters of wealthy Russians.

Much of the post-crash dissent has disintegrated into trace elements. A line can be drawn from Occupy to Bernie Sanders and Black Lives Matter; some of those undergraduates who were kettled by the police in 2010 are now signed-up Corbynistas. But the economics movement remains remarkably intact. Rethinking Economics has grown to 43 student campaigns across 15 countries, from America to China. Some of its alumni went into the civil service, where they have established an Exploring Economics network to push for alternative approaches to economics in policy making. There are evening classes, and then there is this book, which formalises and expands the case first made five years ago.


 Joe Earle, centre, with the Post-Crash Economics Society at Manchester University. Photograph: Jon Super

The Econocracy makes three big arguments. First, economics has shoved its way into all aspects of our public life. Flick through any newspaper and you’ll find it is not enough for mental illness to cause suffering, or for people to enjoy paintings: both must have a specific cost or benefit to GDP. It is as if Gradgrind had set up a boutique consultancy, offering mandatory but spurious quantification for any passing cause.

Second, the economics being pushed is narrow and of recent invention. It sees the economy “as a distinct system that follows a particular, often mechanical logic” and believes this “can be managed using a scientific criteria”. It would not be recognised by Keynes or Marx or Adam Smith.

In the 1930s, economists began describing the economy as a unitary entity. For decades, Treasury officials produced forecasts in English. That changed only in 1961, when they moved to formal equations and reams of numbers. By the end of the 1970s, 99 organisations were generating projections for the UK economy. Forecasting had become a numerical alchemy: turning base human assumptions and frailty into the marketable gold of rigorous-seeming science.
By making their discipline all-pervasive, and pretending it is the physics of social science, economists have turned much of our democracy into a no-go zone for the public. This is the authors’ ultimate charge: “We live in a nation divided between a minority who feel they own the language of economics and a majority who don’t.”

This status quo works well for the powerful and wealthy and it will be fiercely defended. As Ed Miliband and Jeremy Corbyn have found, suggest policies that challenge the narrow orthodoxy and you will be branded an economic illiterate – even if they add up. Academics who follow different schools of economic thought are often exiled from the big faculties and journals.
The most devastating evidence in this book concerns what goes into making an economist. The authors analysed 174 economics modules at seven Russell Group universities, making this the most comprehensive curriculum review I know of. Focusing on the exams that undergraduates were asked to prepare for, they found a heavy reliance on multiple choice. The vast bulk of the questions asked students either to describe a model or theory, or to show how economic events could be explained by them. Rarely were they asked to assess the models themselves. In essence, they were being tested on whether they had memorised the catechism and could recite it under invigilation.

Critical thinking is not necessary to win a top economics degree. Of the core economics papers, only 8% of marks awarded asked for any critical evaluation or independent judgment. At one university, the authors write, 97% of all compulsory modules “entailed no form of critical or independent thinking whatsoever”.

The high priests of economics still hold power, but they no longer have legitimacy

Remember that these students shell out £9,000 a year for what is an elevated form of rote learning. Remember, too, that some of these graduates will go on to work in the City, handle multimillion pound budgets at FTSE businesses, head Whitehall departments, and set policy for the rest of us. Yet, as the authors write: “The people who are entrusted to run our economy are in almost no way taught to think about it critically.”

They aren’t the only ones worried. Soon after Earle and co started at university, the Bank of England held a day-long conference titled “Are Economics Graduates Fit for Purpose?” Interviewing Andy Haldane, the Bank’s chief economist, in 2014, I asked: what was the answer? There was an audible gulp, and a pause that lasted most of a minute. Finally, an answer limped out: “Not yet.”

The Manchester undergraduates were told by an academic that alternative approaches were as much use as a tobacco-smoke enema. Which is to say, he was as likely to take Friedrich Hayek or Joseph Schumpeter seriously as he was to blow smoke up someone’s arse.

The students’ entrepreneurialism is evident in this book. Packed with original research, it comes with pages of endorsements, evidently harvested by the students themselves, from Vince Cable to Noam Chomsky. Yet the text is rarely angry. Its tone is of a strained politeness, as if the authors were talking politics with a putative father-in-law.

More thoughtful academics have accepted the need for change – but strictly on their own terms, within the limits only they decide. That professional defensiveness has done them no favours. When Michael Gove compared economists to the scientists who worked for Nazi Germany and declared the “people of this country have had enough of experts”, he was shamelessly courting a certain type of Brexiter. But that he felt able to say it at all says a lot about how low the standing of economists has sunk.

The high priests of economics still hold power, but they no longer have legitimacy. In proving so resistant to serious reform, they have sent the message to a sceptical public that they are unreformable. Which makes The Econocracy a case study for the question we should all be asking since the crash: how, after all that, have the elites – in Westminster, in the City, in economics – stayed in charge?

The Econocracy is published by Manchester University Press.

Wednesday 4 January 2017

Thinking in Stories

The so-called post-truth society is not primarily the result of our inability to focus on facts; it is due to our failure to read stories deeply

Tabish Khair in The Hindu

Say the word ‘thinking’, and the image evoked is that of abstract ideas, facts, numbers and data. But what if I say that this is our first and most common error about the nature of thinking? As religions have always known, human thinking is conducted primarily in stories, not facts or numbers.

Human beings might be the only living animals that can think in stories. Facts and information of some sort exist for a deer and a wolf too, but fiction, and thinking in fiction?

Now, stories are celebrated for many things: as repositories of folk knowledge or accumulated wisdom, as relief from the human condition, as entertainment, as enabling some cognitivist processes, even as the best way to get yourself and your children to fall asleep! But all this misses the main point about stories: they are the most common, most pervasive, and probably the oldest way for humans to think. 


Problem of a fundamentalist reading

Having missed this point, we then proceed to reduce stories — and their most complex enunciation, literature — to much less than what they are or should be. For instance, a good story is not just a narrative. It does not simply take us from point A to point Z, with perhaps an easy moral appended. Religious fundamentalists who see stories only in those terms end up destroying the essence of their religions.

Let us take one example: the Book of Job. The fundamentalist reading of the Book of Job stresses Job’s faith. In this version, the story is simple: Job is a prosperous, God-fearing man, and God is very proud of him. Satan, however, argues that Job is such a good man only because God has been kind to him. Give him adversity and you will see his faith waver, says Satan. God allows Satan to test Job, by depriving him of prosperity, family, health. But Job’s faith does not waver, and finally all is restored to him. The fundamentalist reading — which reduces the story to a narrative — is simple: this is a parable about true faith.

To leave the Book of Job there is to stop thinking about it. Because the narrative of Job is secondary to its problematic. One can even argue that the narrative is misleading: in the restoration of Job’s children, health and wealth, we have a resolution that fails in our terms. We do not expect such miracles in real life. Hence, it is not the narrative of Job that is significant.

What is significant and useful are the problems of the story. For instance, when the righteous, believing Job is afflicted with death and suffering, such questions are raised (in the story and by Job’s friends): Who is to be blamed? Is God unjust or uncaring? Has Job sinned in hiding (or ignorance) and is therefore being punished? Does it all make any sense?

Job adopts a difficult position throughout the story: among other things, he neither blames God, nor does he blame himself, but he demands an answer. When one thinks of this, one comes to the kernel of the thought of this story: how does one live best in a world where undeserved suffering sometimes befalls the good? It is not the unbelievable narrative which makes this a significant story; it is the way Job’s reactions, his friends’ prescriptions and the problematic of the entire story make us think. Moreover, as God’s incomplete ‘answers’ to Job indicate, stories can make us think in very complex ways.

Religions have always known that human beings think best and most easily in stories. That is why religions consciously think through stories: the ‘facts’ and ‘details’ of these stories change with changing human circumstances, but what does not change is the bid and ability to make us contemplate, imagine, reason, induce, examine — in other words, think.

Strangely, politicians have also known this. All major political movements have depended on the power of stories. In the decades when the Left was in the ascendancy, it had a powerful story to tell — of human exploitation, human resistance and eventually human achievement in the shape of a ‘classless’ society. In recent years, the Right has managed to tell us stories that, for various reasons, seem more convincing to many: inevitable state-aided neo-liberalism, for instance. Narendra Modi’s victory in India, Recep Tayyip Erdogan’s in Turkey, and Donald Trump’s in the U.S. — all three are driven by powerful narratives that explain the ‘past’ and promise a ‘future.’

Failure of academics

Unfortunately, the one area where thinking in stories was taken seriously — and not just reduced to mechanistic explanations — has lost confidence in itself. The Humanities have been too busy trying to justify stories in all possible terms — entertainment, discourse, narratology, cognitivist structures, reader response, etc. — instead of working on how to best think in stories. The total failure of academics, publishers and editors to talk of literature as literature — not just what sells, or a set of ‘reader responses’, or a soporific, or passing politics, or ageless ‘Darwinism,’ etc. — is an index of this failure.

The so-called post-truth society is not primarily the result of our inability to focus on facts; it is due to our failure to read stories deeply. Just as there are ways in which facts can be used positively or negatively, there are ways in which stories can be read — to make us think or to prevent us from thinking. Literature — even in the days when it was written with a capital ‘L’ — was the one area of the Humanities where this was a serious endeavour. This has changed at great cost to human civilisation.

Humans still think primarily in stories. But the failure of standards in education and literary criticism has combined with the rise of fundamentalism (which is not piety or religious thought), scientism (which is not science) and numerical neo-liberalism (which is not even capitalism) to deprive more and more people of the ability to think critically, deeply and sensitively in stories. This explains many of our current political and economic woes.

Tuesday 27 December 2016

What is Strategic Thinking

Ron Carucci in Harvard Business Review


It’s a common complaint among top executives: “I’m spending all my time managing trivial and tactical problems, and I don’t have time to get to the big-picture stuff.” And yet when I ask my executive clients, “If I cleared your calendar for an entire day to free you up to be ‘more strategic,’ what would you actually do?” most have no idea. I often get a shrug and a blank stare in response. Some people assume that thinking strategically is a function of thinking up “big thoughts” or reading scholarly research on business trends. Others assume that watching TED talks or lectures by futurists will help them think more strategically.

How can we implement strategic thinking if we’re not even sure what it looks like?

In our 10-year longitudinal study of over 2,700 newly appointed executives, 67% of them said they struggled with letting go of work from previous roles. More than half (58%) said they were expected to know details about work and projects they believed were beneath their level, and more than half also felt they were involved in decisions that those below them should be making. This suggests that the problem of too little strategic leadership may be as much a function of doing as of thinking.

Rich Horwath, CEO of the Strategic Thinking Institute, found in his research that 44% of managers spent most of their time firefighting in cultures that rewarded reactivity and discouraged thoughtfulness. Nearly all leaders (96%) claimed they lacked time for strategic thinking, again, because they were too busy putting out fires. Both issues appear to be symptoms masking a fundamental issue. In my experience helping executives succeed at the top of companies, the best content for great strategic thinking comes right from one’s own job.

Here are three practical ways I’ve helped executives shift their roles to assume the appropriate strategic focus required by their jobs.

Identify the strategic requirements of your job. One chief operating officer I worked with was appointed to her newly created role with the expressed purpose of integrating two supply chain organizations resulting from an acquisition. Having risen through the supply chain ranks, she spent most of her time reacting to operational missteps and customer complaints. Her adept problem-solving skill had trained the organization to look to her for quick decisions to resolve issues. I asked her, “What’s the most important thing your CEO and board want you to accomplish in this role?” She answered readily, “To take out duplicate costs from redundant work and to get the organization on one technology platform to manage our supply chain.” Her succinct clarity surprised even her, though she quickly realized how little she was engaged in activities that would reach that outcome. We broke the mandate into four focus areas for her organization, realigned her team to include leaders from both organizations, and ensured all meetings and decisions she was involved in directly connected to her mandate.
 



Unfortunately, for many executives, the connection between their role and the strategic contribution they should make is not so obvious. As quoted in Horwath’s study, Harvard Business School professor David Collis says, “It’s a dirty little secret: Most executives cannot articulate the objective, scope, and advantage of their business in a simple statement. If they can’t, neither can anyone else.” He also cites Roger Martin’s research, which found that 43% of managers cannot state their own strategy. Executives with less clarity must work harder to etch out the line of sight between their role and its impact on the organization’s direction. In some cases, shedding the collection of bad habits that have come to define how they fill their role will be their greatest challenge in embodying strategic thinking.

Uncover patterns to focus resource investments. Once a clear line of sight is drawn to a leader’s strategic contribution, resources must be aligned to focus on that contribution. For many new executives, the large pile of resources they now get to direct has far greater consequence than anything they’ve allocated before. Aligning budgets and bodies around a unified direction is much harder when there’s more of them, especially when reactionary decision making has become the norm. Too often, immediate crises cause executives to whiplash people and money.

This is a common symptom of missing insights. Without a sound fact and insight base on which to prioritize resources, squeaky wheels get all the grease. Great strategic executives know how to use data to generate new insights about how they and their industries make money. Examining patterns of performance over time — financial, operational, customer, and competitive data — will reveal critical foresight about future opportunities and risks.

For some, the word insight may conjure up notions of breakthrough ideas or “aha moments.” But studying basic patterns within available data gives simple insights that pinpoint what truly sets a company apart. In the case of the supply chain executive above, rather than a blanket cost reduction, she uncovered patterns within her data that identified and protected the most competitive work of her organization: getting products to customers on time and accurately. She isolated those activities from work that added little value or was redundant, which is where she focused her cost-cutting efforts. She was able to dramatically reduce costs while improving the customer’s experience.

Such focus helps leaders allocate money and people with confidence. They know they are working on the right things without reacting to impulsive ideas or distracting minutiae.


Invite dissent to build others’ commitment. Strategic insight is as much a social capability as it is an intellectual one. No executive’s strategic brilliance will ever be acted upon alone. An executive needs those she leads to translate strategic insights into choices that drive results. For people to commit to carrying out an executive’s strategic thinking, they have to both understand and believe in it.

That’s far more difficult than it sounds. One study found that only 14% of people understood their company’s strategy and only 24% felt the strategy was linked to their individual accountabilities. Most executives mistakenly assume that repeated explanations through dense PowerPoint presentations are what increases understanding and ownership of strategy.

To the contrary, people’s depth of commitment increases when they, not their leader, are talking. One executive I work with habitually takes his strategic insights to his team and intentionally asks for dueling fact bases to both support and refute his thinking. As the debate unfolds, flawed assumptions are surfaced and replaced with shared understanding, ideas are refined, and ownership for success spreads.

Sound strategic thinking doesn’t have to remain an abstract mystery only a few are able to realize. Despite the common complaint, it’s not the result of making time for it. Executives must extract themselves from day-to-day problems and do the work that aligns their job with the company’s strategy. They need to be armed with insights that predict where best to focus resources. And they need to build a coalition of support by inviting those who must execute to disagree with and improve their strategic thinking. Taking these three practical steps will raise the altitude of executives to the appropriate strategic work of the future, freeing those they lead to direct the operational activities of today.

Tuesday 1 December 2015

Why blame culture is toxic for sport

Ed Smith in Cricinfo

Is ranting at players during team talks like bloodletting in the age of quack doctors?


Shouting at players: Satisfying? Yes. Effective? No © AFP



The subversive in me would love to whitewash over the usual clichés and catchphrases that are splashed on dressing-room walls and replace them with a more cynical message:

The six phases of a project:

1. Enthusiasm
2. Disillusionment
3. Panic
4. Search for the guilty
5. Punishment of the innocent
6. Rewards for the uninvolved

Not very cheering, I admit, but a salutary warning about our obsession with blame - a preoccupation sustained by dodgy narratives about "causes" that leads not to institutional improvement but to self-serving politics. Having pinned the blame on someone - rightly or, more likely, wrongly - the next task is "moving on". Sound familiar?

The "six phases" were attached to an office wall by an employee at the Republic Bank of New York. The story appears in Black Box Thinking, Matthew Syed's new book. Syed (a leading sports columnist and double Olympian) argues that our preoccupation with convenient blame - rather than openness to learning from failure - is a central factor holding teams and individuals back from improving. I think he is right.

Syed expresses admiration for the airline industry and its commitment to learning from failure - especially from "black boxes", the explosion-proof devices that record the conversations of pilots and other data. If the plane's wreckage is found, lessons - no matter how painful - must be learned. In the jargon, learning inside the aviation industry is an "open loop". (An "open loop" leads to progress because the feedback is rationally acted on; a "closed loop" is where failure doesn't lead to progress because weaknesses are ignored or errors are misinterpreted.) Syed presents harrowing examples from hospital operating theatres, of "closed loops" costing lives. Indeed, with its recurrent plane crashes and botched operations, the book takes the search for transferrable lessons to harrowing extremes.

One question prompted by Black Box Thinking is why sport is not instinctively enthusiastic about evidence-based discussion. You might think that sports teams would be so keen to improve that they would rush to expose their ideas to rational and reflective scrutiny. But that's not always the case. As a player I often felt that insecure teams shrank from critical thinking, while more confident teams encouraged it.

The first problem sport has with critical thinking is the "narrative fallacy" (a concept popularised by Nassim Taleb). Consider this statement, thrown at me by a coach as I left the dressing room and walked onto the field after winning the toss and deciding to bowl first: "We need to have them five wickets down at lunch to justify the decision."

Hmm. First, even thinking about "justifying" a decision is an unnecessary distraction. Secondly, it's also irrational to think that the fact of taking five wickets, even if it happens, proves the decision was right. I might have misread the wicket, which actually suited batting first, but the opposition might have suffered a bad morning - five wickets could fall and yet the decision could still easily be wrong.

Alternatively, the wicket might suit bowling - and hence "justify" my decision - but we might bowl improbably badly and drop our catches. In other words, it could be the right decision even if there are no wickets down at lunch. What happened after the decision (especially when the sample of evidence is small or, as in this instance, solitary) does not automatically prove the rightness or wrongness of the decision.

Fancy theorising? Prefer practical realities? This kind of theorising, in fact, is bound up with very practical realities. Consider this example.

For much of medical history, bloodletting was a common and highly respected procedure. When a patient was suffering from a serious ailment and went to a leading doctor, the medical guru promptly drained significant amounts of blood from an already weak body. Madness? It happened for centuries.

And sometimes, if we don't think critically, it "works". As Syed points out, in a group of ten patients treated with bloodletting, five might die and five get better. So it worked for the five who survived, right?

Only, it didn't, of course. The five who were healed would have got better anyway (the body has great powers of self-recuperation). And some among the five who died were pushed from survival into death. Proving this fact, however, was more difficult - especially in a medical culture dominated by doctors who advocated and profited materially from bloodletting.

The challenge of demonstrating the real usefulness (or otherwise) of a procedure led to the concept of the "control group". Now imagine a group of 20 patients with serious illnesses - and split them into two groups, ten in each group. One group of ten patients gets a course of bloodletting, the other group of ten (the control group) does not. If we discover that five out of ten died in the bloodletting group and only three out of ten among the non-bloodletting group, then, at last, we have the beginnings of a proper evidence-based approach. The intervention (bloodletting) did more harm than simply doing nothing. It was iatrogenic.

Iatrogenic interventions are common in sport, too - such as when the coach tells a batsman to change his lifelong grip before making his Test debut. (Impossible? Exactly that happened to a friend of mine.) The angry team meeting is a classic iatrogenic intervention. Shouting at the team and vindictively blaming individual players, like bloodletting, provides the coach with the satisfying illusion that it works well sometimes. By "it works", we imply that the team in question played better after half-time or the following morning. Even having suffered an iatrogenic intervention, however, some teams - like some patients enduring bloodletting - inevitably play better afterwards. But on average, all taken together, teams would have played better still without the distraction of a raging coach. (This insight helped win Daniel Kahneman a Nobel Prize, as I learned when I interviewed him.)

The great difficulty of sport, of course, is the challenge of conducting a proper control group experiment - because the game situation, pressures and circumstances are seldom exactly the same twice over. However, merely being open to the logic of these ideas, constantly exposing judgements and intuitions to critical thinking, takes decision-makers a good step in the direction of avoiding huge errors of conventional thinking.

That is why much of what Syed calls "black box thinking" could, I think, be filed under "critical thinking" - the desire to refine and improve one's system of thought as you are exposed to new experiences and ideas. Here is a personal rule of thumb: critical thinkers are also the best company over the long term. Critical thinkers are not only better bets professionally, they are also more interesting friends. Who wants to listen to the same set of unexamined views and sacrosanct opinions for decades? If you believe that your ideas don't ever need to evolve and adapt, can we at least skip dinner?

It is hard to imagine how anyone who is interested in leadership, innovation or self-improvement could fail to find something new and challenging in this book. Rather than presenting a simplistic catch-all solution, Syed takes us on a modern and personal walk through the scientific method. The book makes an interesting contrast with Syed's first book, Bounce, which proposed that talent is a myth - an argument that can be summed up in a single, seductive phrase: genius is a question of practice.

Rather than presenting a single idea, Black Box Thinking circles around a main theme - illustrating and illuminating it by drawing on a dizzyingly wide and eclectic series of ideas, case studies and lines of philosophical enquiry. The reader finishes the book with a deeper understanding of how he might improve and grow over the long term, rather than the transient feeling of having all his problems solved. The author, we sense, has experienced a similar journey while writing the book. Syed doesn't just preach black-box thinking, he practises it.

Wednesday 2 July 2014

Why we need a Truth on the Clothes Label Act


We know more about the conditions of our battery hens than of our battery textile workers. A year after Rana Plaza, it's time we were given the facts
'Forcing businesses to admit exactly who is responsible for their economic success, and who reaps the profits, is a good start'. Illustration: Daniel Pudles
Somewhere in Swansea is a woman whose hand I want to shake. My guess is that she's the one responsible for giving Primark such a stonking headache over the past few days. You probably know her handiwork – at least, you will if you saw the stories about how two Swansea shoppers came back from the local Primark with bargain dresses mysteriously bearing extra labels. One read "'Degrading' sweatshop conditions"; another "Forced to work exhausting hours".
How did they get there? "Cries for help" from a production line in deepest Dhaka, claim the merchants of journalese. But surely no machinist could bunk off their punishing workload to script these complaints in pristine English, stitch them in and whisk them past a pin-sharp inspector. The much-more-likely scenario is an activist, holed up in a south Wales fitting room, hastily darning her protests.
In which case: well-needled, that woman. Not only has she gummed up the Primark publicity machine for days on end and brought back into discussion the costs of cheap fashion, she's also given pause to two shoppers. In the words of one: "I've never really thought much about how the clothes are made … I dread to think that my summer top may be made by some exhausted person toiling away for hours in some sweatshop."
In a mall, such thinking counts as disruptive activity. The lexicon for most retailers runs from impulse buy to splurge to treat; they prefer us to wander the aisles with our eyes wide open and our minds shut tight. The whole point of a shopping environment is to drown out those inconvenient headlines about dead textile workers in Rana Plaza with a bit of Ellie Goulding and a lot of advertising. Which is what makes the Primark protests, or the Tesco shelfie campaign, or the UK Uncut rallies so splendidly aggravating – because they undercut the multimillion-pound marketing with point of sale information about poverty pay for shop staff or high-street tax dodging.
They also underline how little we're told about what we're paying for. Look at the label sewn into your top: the only thing it must tell you under law is which fibres it's made out of – whether it's cotton or acrylic or whatever. Which country your shirt came from, or the accuracy of the sizing – such essentials are in the gift of the retailer. A similarly light-touch regime holds for food: after years of fighting between consumer groups and the (now eviscerated) Food Standards Agency on one side, and big-spending food manufacturers on the other, a new set of traffic-light labels will be introduced. Thanks to heavy industrial lobbying, it will still be completely voluntary.
How much sugar is in your bowl of Frosties: this is a basic fact, yet it remains up to the seller how they present it to you. By law you are entitled to more information about the production of your eggs than your underwear. Under current regulations, we know more about our battery hens than we do about our battery textile workers.

A ‘cry for help’ label in a top from Primark in Swansea. ‘Big retailers can also display prominently how much tax they pay, and what they pay both top staff and shopfloor employees.’ Photograph: Matthew Horwood/Wales News Service


Consider: just over a year has passed since the collapse of the Rana Plaza factory, which saw more than 1,100 staff crushed to death and another 2,500 injured, many permanently disabled. Those people and the thousands of others working in similarly precarious and punishing conditions make the garments we wear and the electronic goods we fiddle about with. Yet they rate barely a mention. Outsourcing and globalisation may have brought down the price of our shopping, but it has also enabled retailers to engage in a facade of blame-shifting and plausible deniability: for Apple to pass the buck for suicidal Chinese workers on to Foxconn and duck the questions about how much of a margin it pays suppliers.
So here's a modest proposal: a new law that mandates more, and more relevant information, on the products we buy. Call it the Truth on the Label Act, which will require shops to display where their goods are made, which chemicals were used in production, and whether the factory is unionised. Stick it on the shelves, print it on the clothes tags. Big retailers can also display prominently in each branch how much tax they pay, and what they pay both top staff and shopfloor employees.
That's because while queuing up for the self-service checkout, hungry commuters might want to know that the boss of Tesco's, Philip Clarke, is paid 135 times what his lowest-paid member of staff is. Or that George Weston, chief executive of Primark's parent company, received over £5m last year, while the young women who sew his firm's T-shirts get less than £30 a month.
Such information is not hard for the big retailers to provide. Long before Google was an algorithm in a programmer's eye, Tesco was in the data-collection business. This information in itself won't change an entire economic system. But forcing businesses to admit exactly who is responsible for their economic success, and who reaps the profits, is a good start. Otherwise, we're entirely dependent on activists in changing rooms.

Sunday 19 May 2013

Daniel Dennett's seven tools for thinking



Cognitive scientist and philosopher Daniel Dennett is one of America's foremost thinkers. In this extract from his new book, Intuition Pumps and Other Tools for Thinking, he reveals some of the lessons life has taught him
Daniel Dennett: 'Often the word "surely" is as good as a blinking light locating a weak point in the argument.' Photograph: Peter Yang/August

1 USE YOUR MISTAKES

We have all heard the forlorn refrain: "Well, it seemed like a good idea at the time!" This phrase has come to stand for the rueful reflection of an idiot, a sign of stupidity, but in fact we should appreciate it as a pillar of wisdom. Any being, any agent, who can truly say: "Well, it seemed like a good idea at the time!" is standing on the threshold of brilliance. We human beings pride ourselves on our intelligence, and one of its hallmarks is that we can remember our previous thinking and reflect on it – on how it seemed, on why it was tempting in the first place and then about what went wrong.
I know of no evidence to suggest that any other species on the planet can actually think this thought. If they could, they would be almost as smart as we are. So when you make a mistake, you should learn to take a deep breath, grit your teeth and then examine your own recollections of the mistake as ruthlessly and as dispassionately as you can manage. It's not easy. The natural human reaction to making a mistake is embarrassment and anger (we are never angrier than when we are angry at ourselves) and you have to work hard to overcome these emotional reactions.
Try to acquire the weird practice of savouring your mistakes, delighting in uncovering the strange quirks that led you astray. Then, once you have sucked out all the goodness to be gained from having made them, you can cheerfully set them behind you and go on to the next big opportunity. But that is not enough: you should actively seek out opportunities to make grand mistakes, just so you can then recover from them.
In science, you make your mistakes in public. You show them off so that everybody can learn from them. This way, you get the benefit of everybody else's experience, and not just your own idiosyncratic path through the space of mistakes. (Physicist Wolfgang Pauli famously expressed his contempt for the work of a colleague as "not even wrong". A clear falsehood shared with critics is better than vague mush.)
This, by the way, is another reason why we humans are so much smarter than every other species. It is not so much that our brains are bigger or more powerful, or even that we have the knack of reflecting on our own past errors, but that we share the benefits our individual brains have won by their individual histories of trial and error.
I am amazed at how many really smart people don't understand that you can make big mistakes in public and emerge none the worse for it. I know distinguished researchers who will go to preposterous lengths to avoid having to acknowledge that they were wrong about something. Actually, people love it when somebody admits to making a mistake. All kinds of people love pointing out mistakes.
Generous-spirited people appreciate your giving them the opportunity to help, and acknowledging it when they succeed in helping you; mean-spirited people enjoy showing you up. Let them! Either way we all win.

2 RESPECT YOUR OPPONENT

Just how charitable are you supposed to be when criticising the views of an opponent? If there are obvious contradictions in the opponent's case, then you should point them out, forcefully. If there are somewhat hidden contradictions, you should carefully expose them to view – and then dump on them. But the search for hidden contradictions often crosses the line into nitpicking, sea-lawyering and outright parody. The thrill of the chase and the conviction that your opponent has to be harbouring a confusion somewhere encourages uncharitable interpretation, which gives you an easy target to attack.
But such easy targets are typically irrelevant to the real issues at stake and simply waste everybody's time and patience, even if they give amusement to your supporters. The best antidote I know for this tendency to caricature one's opponent is a list of rules promulgated many years ago by social psychologist and game theorist Anatol Rapoport.
How to compose a successful critical commentary:
1. Attempt to re-express your target's position so clearly, vividly and fairly that your target says: "Thanks, I wish I'd thought of putting it that way."
2. List any points of agreement (especially if they are not matters of general or widespread agreement).
3. Mention anything you have learned from your target.
4. Only then are you permitted to say so much as a word of rebuttal or criticism.
One immediate effect of following these rules is that your targets will be a receptive audience for your criticism: you have already shown that you understand their positions as well as they do, and have demonstrated good judgment (you agree with them on some important matters and have even been persuaded by something they said). Following Rapoport's rules is always, for me, something of a struggle…

THE "SURELY" KLAXON

When you're reading or skimming argumentative essays, especially by philosophers, here is a quick trick that may save you much time and effort, especially in this age of simple searching by computer: look for "surely" in the document and check each occurrence. Not always, not even most of the time, but often the word "surely" is as good as a blinking light locating a weak point in the argument.
Why? Because it marks the very edge of what the author is actually sure about and hopes readers will also be sure about. (If the author were really sure all the readers would agree, it wouldn't be worth mentioning.) Being at the edge, the author has had to make a judgment call about whether or not to attempt to demonstrate the point at issue, or provide evidence for it, and – because life is short – has decided in favour of bald assertion, with the presumably well-grounded anticipation of agreement. Just the sort of place to find an ill-examined "truism" that isn't true!

4 ANSWER RHETORICAL QUESTIONS

Just as you should keep a sharp eye out for "surely", you should develop a sensitivity for rhetorical questions in any argument or polemic. Why? Because, like the use of "surely", they represent an author's eagerness to take a short cut. A rhetorical question has a question mark at the end, but it is not meant to be answered. That is, the author doesn't bother waiting for you to answer since the answer is so obvious that you'd be embarrassed to say it!
Here is a good habit to develop: whenever you see a rhetorical question, try – silently, to yourself – to give it an unobvious answer. If you find a good one, surprise your interlocutor by answering the question. I remember a Peanuts cartoon from years ago that nicely illustrates the tactic. Charlie Brown had just asked, rhetorically: "Who's to say what is right and wrong here?" and Lucy responded, in the next panel: "I will."

5 EMPLOY OCCAM'S RAZOR

Attributed to William of Ockham (or Occam), a 14th-century English logician and philosopher, this thinking tool is actually a much older rule of thumb. A Latin name for it is lex parsimoniae, the law of parsimony. It is usually put into English as the maxim "Do not multiply entities beyond necessity".
The idea is straightforward: don't concoct a complicated, extravagant theory if you've got a simpler one (containing fewer ingredients, fewer entities) that handles the phenomenon just as well. If exposure to extremely cold air can account for all the symptoms of frostbite, don't postulate unobserved "snow germs" or "Arctic microbes". Kepler's laws explain the orbits of the planets; we have no need to hypothesise pilots guiding the planets from control panels hidden under the surface. This much is uncontroversial, but extensions of the principle have not always met with agreement.
One of the least impressive attempts to apply Occam's razor to a gnarly problem is the claim (and the counterclaims it has provoked) that postulating a God as creator of the universe is simpler, more parsimonious, than the alternatives. How could postulating something supernatural and incomprehensible be parsimonious? It strikes me as the height of extravagance, but perhaps there are clever ways of rebutting that suggestion.
I don't want to argue about it; Occam's razor is, after all, just a rule of thumb, a frequently useful suggestion. The prospect of turning it into a metaphysical principle or fundamental requirement of rationality that could bear the weight of proving or disproving the existence of God in one fell swoop is simply ludicrous. It would be like trying to disprove a theorem of quantum mechanics by showing that it contradicted the axiom "Don't put all your eggs in one basket".

6 DON'T WASTE YOUR TIME ON RUBBISH

Sturgeon's law is usually expressed thus: 90% of everything is crap. So 90% of experiments in molecular biology, 90% of poetry, 90% of philosophy books, 90% of peer-reviewed articles in mathematics – and so forth – is crap. Is that true? Well, maybe it's an exaggeration, but let's agree that there is a lot of mediocre work done in every field. (Some curmudgeons say it's more like 99%, but let's not get into that game.)
A good moral to draw from this observation is that when you want to criticise a field, a genre, a discipline, an art form … don't waste your time and ours hooting at the crap! Go after the good stuff or leave it alone. This advice is often ignored by ideologues intent on destroying the reputation of analytic philosophy, sociology, cultural anthropology, macroeconomics, plastic surgery, improvisational theatre, television sitcoms, philosophical theology, massage therapy, you name it.
Let's stipulate at the outset that there is a great deal of deplorable, second-rate stuff out there, of all sorts. Now, in order not to waste your time and try our patience, make sure you concentrate on the best stuff you can find, the flagship examples extolled by the leaders of the field, the prize-winning entries, not the dregs. Notice that this is closely related to Rapoport's rules: unless you are a comedian whose main purpose is to make people laugh at ludicrous buffoonery, spare us the caricature.

7 BEWARE OF DEEPITIES

A deepity (a term coined by the daughter of my late friend, computer scientist Joseph Weizenbaum) is a proposition that seems both important and true – and profound – but that achieves this effect by being ambiguous. On one reading, it is manifestly false, but it would be earth-shaking if it were true; on the other reading, it is true but trivial. The unwary listener picks up the glimmer of truth from the second reading, and the devastating importance from the first reading, and thinks, Wow! That's a deepity.
Here is an example (better sit down: this is heavy stuff): Love is just a word.
Oh wow! Cosmic. Mind-blowing, right? Wrong. On one reading, it is manifestly false. I'm not sure what love is – maybe an emotion or emotional attachment, maybe an interpersonal relationship, maybe the highest state a human mind can achieve – but we all know it isn't a word. You can't find love in the dictionary!
We can bring out the other reading by availing ourselves of a convention philosophers care mightily about: when we talk about a word, we put it in quotation marks, thus: "love" is just a word. "Cheeseburger" is just a word. "Word" is just a word. But this isn't fair, you say. Whoever said that love is just a word meant something else, surely. No doubt, but they didn't say it.
Not all deepities are quite so easily analysed. Richard Dawkins recently alerted me to a fine deepity by Rowan Williams, the then archbishop of Canterbury, who described his faith as "a silent waiting on the truth, pure sitting and breathing in the presence of the question mark".
I leave the analysis of this as an exercise for you.