

Friday 21 January 2022

Pakistan: Towards a modern Riyasat-e-Madina

Nadeem F Paracha in The Friday Times

On January 17, an article written by PM Imran Khan appeared in some English and Urdu dailies. In it, the Pakistani prime minister shared his thoughts on ‘Riyasat-e-Madinah,’ or the first ‘Islamic state’ that came into being in early 7th-century Arabia. PM Khan wrote that Pakistan will need to adopt the moral and spiritual tenor of that state if the country is to thrive.

Even before he came to power in 2018, Khan had been promising to turn Pakistan into a modern-day Riyasat-e-Madinah. He first began to formulate this as a political message in 2011. However, this idea is not a new one. It has been posited previously as well by some politicians, and especially, by certain Islamist ideologues. Neither is there any newness in the process used by Khan to arrive at this idea. Khan took the same route as ZA Bhutto’s Pakistan People’s Party (PPP) took decades ago. In 1967, when the PPP was formed, its ‘foundation documents’ — authored by Bhutto and the Marxist intellectual J.A. Rahim — described the party as a socialist entity. To neutralise the expected criticism from Islamist groups, the documents declared that democracy was the party’s policy, socialism was its economy, and Islam was its faith.

The documents then added that by “socialism” the party meant the kind of democratic-socialism practiced in Scandinavian countries such as Norway, Sweden, Denmark and Finland, through which these countries had constructed robust welfare states. But this did not impress Islamist outfits, especially the Jamat-i-Islami (JI). It declared the PPP a party of ‘atheists.’ In 1969, JI’s chief Abul Ala Maududi authored a fatwa declaring socialism an atheistic idea. The next year, when the PPP drafted its first ever manifesto, the party explained that its aim to strive for democracy, a “classless society,” economic equality and social justice “flows from the political and economic ethics of Islam.”

After coming to power in December 1971, the PPP began using the term “Musawat-e-Muhammadi” (social and economic equality preached and practiced by Islam’s holy Prophet [PBUH]). In 1973, a prominent member of the PPP, Sheikh Ahmed Rashid, declared that the economic system that Islam advocated and the one that was implemented in the earliest state of Islam was socialist. When a parliament member belonging to an Islamist party demanded that Islamic rituals be made compulsory by law “because Pakistan was made in the name of Islam,” Rashid responded by saying that the country was not made to implement rituals, but to adopt an “Islamic economy” which was “inherently socialist.”

Now let us see just how close all this is to the route that PM Khan and his Pakistan Tehreek-i-Insaf (PTI) took in formulating their concept of Riyasat-i-Madinah. In 2011, PTI and Khan suddenly rose to prominence as a party of urban middle-classes and the youth. In his speeches between 2011 and 2015, Khan was quite vocal in his appreciation of the Scandinavian welfare states. But, often, this appreciation was immediately followed by Khan declaring that the non-Muslim Scandinavians had uncannily followed Islamic ideals of social justice and economic equality better than the Muslims had (or do). Of course, he did not mention that Scandinavian countries are some of the most secular nation-states in the world, and that a strong secular-humanist disposition of their polities and politics played a major role in the construction of the welfare states that Khan was in such awe of.

As the 2018 elections drew near, Khan began to explain the concept of the European welfare state as a modern-day reflection of the 7th-century state that was formed in the city of Madinah. This notion was close to Bhutto’s Musawat-e-Muhammadi. But Bhutto and his PPP had claimed that the Islamic state in Madinah had a socialist economy, and that this alone should be adopted by Pakistan, because it was still relevant in the 20th century. This position had given the PPP enough space to remain secular in most other areas. To Khan, however, if the Scandinavian model of the welfare state were adopted and then supplemented by Islam’s moral, spiritual and political ethos in all fields and areas, the result would be a modern-day re-enactment of a 7th-century ‘Islamic state.’ Khan’s idea in this context is thus more theocratic in nature.

Khan’s concept seemed to be emerging from how Pakistan was imagined by some pro-Jinnah ulema during the 1946 elections in British India. To Mr. Jinnah’s party, the All India Muslim League (AIML), the culture of Indian Muslims largely mirrored the culture of Muslims outside South Asia, particularly in Arabia and even Persia. But the politics and economics of India’s Muslims were grounded in India and/or in the territory that they had settled in 500 years ago. Therefore, the Muslim-majority state that the League was looking to create was to be established in this territory. The League’s Muslim nationalism was thus territorial. It was not to be a universal caliphate or a theocracy with imperial and expansionist aims. It was to be a sovereign political enclave in South Asia where the Muslim minority of India would become a majority, thus benefiting from the economic advantages of majoritarianism.

However, whereas this narrative – more or less – worked in attracting the votes of the Muslims of Bengal and Sindh during the 1946 polls, the League found itself struggling in Punjab, which was a bastion of the multicultural Unionist Party. The Congress, too, was strong here. Various radical Islamist groups were also headquartered in Punjab. They had rejected the League’s call for a separate country. They believed that it would turn the remaining Muslims in India into an even more vulnerable minority. The Islamists viewed the League as a secular outfit with westernised notions of nationalism and an impious leadership.

This is when some ulema switched sides and decided to support the League in Punjab. This is also when overt Islamist rhetoric was, for the first time, used by the League through these ulema, mainly in Punjab’s rural areas. The ulema began to portray Jinnah as a ‘holy figure,’ even though very few rural Punjabis had actually seen him. The well-known Islamic scholar Shabbir Ahmed Usmani, who left the anti-Jinnah Jamiat Ulema Islam Hind (JUIH) to support the League, began to explain the yet-to-be-born Pakistan as a “naya Madinah,” or new Madinah.

By this, Usmani meant the creation of a state that would be based on the model of the 7th century state in Madina. But, much to the disappointment of the pro-League ulema, the model adopted by Pakistan was largely secular and the Islam that the state espoused was carved from the ideas of ‘Muslim modernists’ such as the reformer Sir Syed Ahmad Khan (d.1898) and the poet-philosopher Muhammad Iqbal (d.1938), who urged Muslims to look forward with the aid of an evolved and rational understanding of Islam, instead of looking backwards to a romanticised past.

***

 
Khan often spoke about Riyasat-i-Madinah before he became PM. But the frequency with which he does so has increased in the last year and a half – the period in which his government began to truly unravel. Today, it is in deep crisis and expected either to be eased out by the Parliament, or knocked out in a ruder manner before it completes its term in 2023. The economy is in shambles, inflation and unemployment rates are climbing, and so are tensions between the government and its erstwhile patrons, the military establishment.

Amidst the growing crises, Khan has spoken more about morality, ‘westernisation,’ and Islamophobia than about how his government is planning to address the mounting economic problems that the country is facing, and the consequent political quagmire into which his government has plunged. Yet, he still somehow found a reason – or for that matter, the audacity – to lecture the polity on the moral and spiritual principles of Riyasat-i-Madinah, or the kind of morally upright and pious state and society that he is dreaming of constructing. One wonders whether he plans to do this with the large IMF loans that his government has had to acquire to keep the country from going bankrupt.

Khan’s article on Riyasat-i-Madinah was censured by the opposition parties. They saw it as a political ploy by him to distract the people from the failures of his government. There is evidence that a PR company hired by Khan has been advising him to raise the frequency of his Islamic rhetoric. The purpose behind this could be what the opposition is claiming. It might also be about something personal. But for men such as Khan, the personal often becomes the political.

According to political scientist David O’Connell, it is crisis, not political convenience, that more often brings out religion in politicians. In his book God Wills It, O’Connell argues that when public opinion of political leaders begins to dwindle, or when a head of state or government is threatened, that is when one sees religious rhetoric appear.

In 1976, when the Bhutto regime was struggling to address economic problems caused by an international oil crisis that had pushed up inflation, and due to the regime’s own mismanagement of important economic sectors that it had nationalised, Bhutto decided to organise a grand ‘Seerat Conference’ in Karachi. The conference was organised to discuss and highlight the life and deeds of Islam’s Prophet (PBUH) and how these could be adopted to regenerate the lost glory of the Islamic civilisation. Khan did exactly the same, late last year.

Bhutto’s intended audience in this respect was the Islamists whom he had uncannily emboldened by agreeing to their demand of constitutionally ousting the Ahmadiyya community from the fold of Islam. He believed that this would neutralise the threat that the Islamists were posing to his ‘socialist’ government. The demand had arisen when Islamist groups in the Parliament asked the government to add a constitutional provision defining what or who was a Muslim. In 1973, the government had refused to add any such provision to the constitution. But the very next year, in 1974, when a clash between a group of Ahmadiyya youth and cadres of the student-wing of JI caused outrage amongst Islamist parties, they tabled a bill in the National Assembly which sought to constitutionally declare the Ahmadiyya a non-Muslim community.

Bhutto threatened to unleash the military against anti-Ahmadiyya agitators who had besieged various cities of Punjab. According to Rafi Raza, who, at the time, was a special assistant to the prime minister, Bhutto insisted that the Parliament was no place to discuss theological matters. In his book ZA Bhutto and Pakistan, Raza wrote that the Islamist parties retorted by reminding the PM that in 1973 the constitution had declared Pakistan an ‘Islamic republic’ – and therefore, parliamentarians in an Islamic republic had every right to discuss religious matters.

After much violence in Punjab and commotion in the National Assembly, Bhutto capitulated and allowed the bill to be passed. This also meant that parliamentarians now had the constitutional prerogative to define who was or wasn’t a Muslim. This would eventually lead to the 1985 amendments in Articles 62 and 63 of the constitution, proclaiming that only ‘pious’ Muslims can be members of the Parliament and heads of state and government. The man who had initiated this, the dictator Zia-ul-Haq, had already declared himself ‘Sadiq and Amin’ (honest and faithful).

In 1976, Bhutto’s Islamist opponents were deriding him as a ‘bad Muslim,’ because he had ‘loose morals,’ was an ‘alcoholic,’ and because his government was as bad at fixing the economy as it was at curbing the “rising trend of obscenity and immorality in the society.” So, with the Seerat Conference, Bhutto set out to exhibit his Islamic credentials and, perhaps, to demonstrate that while his regime might be struggling to fix the economy, it was at least being headed by a ‘true believer.’ But this didn’t save him from being toppled in a military coup that was triggered by his opponents who, in 1977, had poured out to agitate and demand a government based on Shariah laws.

Khan is on a similar path. He had been ‘reforming’ himself ever since he retired in 1992 as a cricketing star, a darling of the tabloid press, and a ‘playboy.’ From a lifestyle liberal who had spent much of his time stationed in the UK, playing cricket and hobnobbing with European and American socialites, he gradually began to refigure his image. After retirement from cricket at age 40, he was mostly seen with prominent military men such as General Hamid Gul, who had once been extremely close to the dictator Zia-ul-Haq.

Khan also began to have one-on-one meetings with certain ulema and Islamic evangelical groups. Khan’s aim was to bury his colourful past and re-emerge as an incorruptible born-again Muslim. But his past was not that easy to get rid of. It kept being brought up by the tabloids and also by Nawaz Sharif’s centre-right PML-N, which began to see him as a threat because Khan was trying to appeal to Sharif’s constituency. Sharif was a conservative and a protégé of Zia. In 1998, his second regime was struggling to fix an economy that had begun to spiral down after Pakistan conducted its first nuclear tests. This triggered economic sanctions against Pakistan, imposed by its major donors and trading partners, the US and European countries.

The crisis saw Nawaz formulate his own ‘Islamic’ shenanigans. His crusades against obscenity were coupled with his desire to be declared ‘the commander of the faithful’ (amir-ul-mominin). Instead, he was brought down by a coup in 1999. Unlike the coup against the Bhutto regime, which was planned by a reactionary general, the one against Nawaz (by General Musharraf) was apparently staged to fix the economy and roll back the influence that the Islamists had enjoyed, especially during the Zia and Nawaz regimes.

Khan’s party initially supported the coup against Nawaz. But it pulled back its support when the PTI was routed in the 2002 elections. Khan began criticising Musharraf as an “American stooge” and “fake liberal.” Musharraf responded by claiming that Khan had asked to be made prime minister. Musharraf then added that Khan’s ideas were “like those of a mullah.” One wonders whether this statement annoyed Khan or delighted him, because, remember, he was trying his best to bury his glitzy past and convince everyone that he was now a pious gentleman who wanted to employ ‘true Islamic principles’ in the country’s politics and polity.

---

But weren’t many of these ‘principles’ already made part of the country’s constitution and penal code by the likes of ZA Bhutto, Zia and Nawaz?

From 1974 onwards, Pakistan started to become what the Canadian political scientist Ran Hirschl described as a “constitutional theocracy.” The phrase was initially coined by the French political scientist Olivier Roy for Iran’s post-revolution constitution. Hirschl expanded it in a 2010 book in which he studied the increasing Islamisation of constitutions in certain Muslim countries, and the problems these constitutions were facing in coming to terms with various contemporary political, legislative and social challenges.

Constitutional theocracies empower the Islamists even if they are a minority in the Parliament. This is quite apparent in Pakistan. According to Syed Adnan Hussain, Associate Professor of Religious Studies at Saint Mary’s University, even though most Islamists scoff at democracy, there were also some prominent Islamist ideologues who posited that constitutions, judicial review, legal codes and a form of democratic election could be integrated into an Islamic state. Abul Ala Maududi and Maulana Taqi Usmani were two such ideologues. They agreed to use whatever means were available to turn Pakistan into an Islamic state. And these included democratic institutions, processes and the constitution.

Involvement of the ulema in drafting the 1956 constitution was nominal, even though the constitution did declare the country an Islamic republic. Their contribution to the drafting of the 1962 constitution was minimal. And even though there were just 18 members of various Islamist parties in the National Assembly that came into being after 1971, their input increased during the drafting of the 1973 constitution. Their influence continued to grow. By 1991, the constitution had been greatly Islamised.

Therefore, even the more electorally strong non-Islamist parties have had to add various Islamist ideas to their armoury because, as the American author Shadi Hamid wrote: “Private religious devotion (to Islamists) is inseparable from political action. Islam is to be applied in daily life, including in the public realm. And to fail to do so is to shirk one’s obligations towards God. Faith, or at least their faith, gives (the Islamists) a built-in political advantage.” It is this advantage that the non-Islamist politicians want to usurp. They frequently find themselves pressed to continue positioning themselves as equally pious champions of Islam. Khan is doing exactly that. Bhutto did so in the second half of his rule, and Nawaz during his second stint as PM. But, of course, this does not come naturally to non-Islamists. And the Islamists are never convinced. In fact, they see it as a way for non-Islamists to neutralise the political influence of the Islamists. Yet, in times of crisis, many non-Islamist heads of government in the country have curiously leaned towards religion, believing that by adopting an ‘Islamic’ demeanour, they would be able to pacify public anger towards their failing regimes.

This can be a desperate and last-ditch ploy to survive a fall. But on occasion, it can also be about a personal existentialist crisis – which makes it even worse. I believe Khan is a case in point. Indeed, there is an element of political amorality in his increasingly fervent moral rhetoric and religious exhibitionism. According to the British journalist and documentary filmmaker Adam Curtis, as the world continues to become more complicated than ever, political leaders are increasingly struggling to comprehend today’s complexities and, thus, failing to formulate and provide a coherent vision of the future. They are attempting to define the complexity of today’s realities in an overtly simple manner.

Driven by a demand to simplify modern-day complexities, many leaders, instead of trying to figure out new ways forward, have begun to look backwards, promising to bring forth a past that was apparently better and less complicated. But the recollection of such pasts is often not very accurate, because it involves a nostalgia referred to as ‘anemoia’, or a nostalgia for a time one has never known. A past that is not a lived experience. A past that is largely imagined.

Khan likes to talk about the 7th-century state in Madinah. But as the anthropologist Irfan Ahmad and the historian Patricia Crone have demonstrated, there was no clear concept of a state anywhere in pre-modern times, east or west. The idea of the state as we know it today began to emerge after the 17th century and matured from the 19th century onwards. It is a European concept. What is more, according to Ahmad, the idea of an Islamic state is a 20th-century construct. It is derived from an imagined memory. Pre-modern states were vastly different from what they became from the 19th century onwards. States in pre-modern times had extremely limited capacity or resources to regulate every aspect of life.

They were impersonal and mostly erected to collect taxes from the subjects so that landed elites and monarchs could sustain standing armies, mount their wars, and retain power. A majority of the subjects were left to their own devices, as long as they did not rebel. Conquered areas were mostly put in the hands of local leaders on the condition that they would remain loyal to the conquerors. Ancient states in Muslim regions and in the regions that the Muslims conquered were no different. But 20th-century Islamic ideologues began to speak of creating Islamic states. According to Ahmad, the idea of an Islamic state was the result of how the concept of the modern state had begun to fascinate ideologues and politicians in India.

The Congress began to talk about an Indian state, the League began to work towards a Muslim-majority state, the socialists towards a socialist state, and Islamists like Maududi began musing about an Islamic state. Shabbir Usmani and Maududi projected the idea and reality of an all-encompassing modern state as a way to explain the functions of the 7th-century Madinah state, as if it had functioned like a modern state, regulating the lives of its subjects with coded laws, interventions, constitutions and other established state institutions. This was not the case. What is more, there was little or no scholarship in the premodern Muslim world on political ideas or philosophy; such scholarship would only begin to appear in the 14th century, in the works of the Arab scholar Ibn Khaldun.

PM Khan is thus dealing in anemoia. He, like 20th-century critics of modernity, is raging against its supposedly cold and mechanical disposition. But instead of offering something new, he is investing more effort in trying to revive romanticised pasts that never existed in the shape in which they are remembered. Khan’s failure to address the mounting problems of the here and now, and his insistence on creating a theocratic potpourri of schemes already exhausted by Islamist ideologues – and by heads of state and government such as Bhutto, Zia and Nawaz – may as well be the last nail in the coffin of a much-exploited idea that is almost entirely based on a politically motivated and largely imagined memory.

Friday 1 June 2018

I can make one confident prediction: my forecasts will fail

Tim Harford in The Financial Times 

I am not one of those clever people who claims to have seen the 2008 financial crisis coming, but by this time 10 years ago I could see that the fallout was going to be bad. Banking crises are always damaging, and this was a big one. The depth of the recession and the long-lasting hit to productivity came as no surprise to me. I knew it would happen. 


Or did I? This is the story I tell myself, but if I am honest I do not really know. I did not keep a diary, and so must rely on my memory — which, it turns out, is not a reliable servant. 

In 1972, the psychologists Baruch Fischhoff and Ruth Beyth conducted a survey in which they asked for predictions about Richard Nixon’s imminent presidential visit to China and Russia. How likely was it that Nixon and Mao Zedong would meet? What were the chances that the US would grant diplomatic recognition to China? Professors Fischhoff and Beyth wanted to know how people would later remember their forecasts. Since their subjects had taken the unusual step of writing down a specific probability for each of 15 outcomes, one might have hoped for accuracy. But no — the subjects flattered themselves hopelessly. The Fischhoff-Beyth paper was titled, “I knew it would happen”. 

This is a reminder of what a difficult task we face when we try to make big-picture macroeconomic and geopolitical forecasts. To start with, the world is a complicated place, which makes predictions challenging. For many of the subjects that interest us, there is a substantial delay between the forecast and the outcome, and this delayed feedback makes it harder to learn from our successes and failures. Even worse, as Profs Fischhoff and Beyth discovered, we systematically misremember what we once believed. 

Small wonder that forecasters turn to computers for help. We have also known for a long time — since work in the 1950s by the late psychologist Paul Meehl — that simple statistical rules often outperform expert intuition. Meehl’s initial work focused on clinical cases — for example, faced with a patient suffering chest pains, could a two or three-point checklist beat the judgment of an expert doctor? The experts did not fare well. However, Meehl’s rules, like more modern machine learning systems, require data to work. It is all very well for Amazon to forecast what impact a price drop may have on the demand for a book — and some of the most successful hedge funds use algorithmically-driven strategies — but trying to forecast the chance of Italy leaving the eurozone, or Donald Trump’s impeachment, is not as simple. Faced with an unprecedented situation, machines are no better than we are. And they may be worse. 
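As a rough illustration of the kind of actuarial rule Meehl studied – the checklist items, equal weights and cutoff below are invented for this sketch, not drawn from any clinical study – a simple rule just counts hits against a fixed list and applies a threshold:

```python
# A toy Meehl-style actuarial rule: a fixed checklist with equal weights
# and a cutoff. The items and the threshold are invented for illustration.

def checklist_risk(chest_pain_on_exertion: bool,
                   abnormal_ecg: bool,
                   prior_cardiac_event: bool) -> str:
    """Classify a patient as 'high risk' or 'low risk' by counting hits."""
    score = sum([chest_pain_on_exertion, abnormal_ecg, prior_cardiac_event])
    return "high risk" if score >= 2 else "low risk"

print(checklist_risk(True, True, False))   # -> high risk
print(checklist_risk(True, False, False))  # -> low risk
```

The point is not the particular items but the mechanism: the rule applies the same small set of cues in the same way every time, which is precisely what expert intuition tends not to do.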

Much of what we know about forecasting in a complex world, we know from the research of the psychologist Philip Tetlock. In the 1980s, Prof Tetlock began to build on the Fischhoff-Beyth research by soliciting specific and often long-term forecasts from a wide variety of forecasters — initially hundreds. The early results, described in Prof Tetlock’s book Expert Political Judgement, were not encouraging. Yet his idea of evaluating large numbers of forecasters over an extended period of time has blossomed, and some successful forecasters have emerged. 

The latest step in this research is a “Hybrid Forecasting Tournament”, sponsored by the US Intelligence Advanced Research Projects Activity, designed to explore ways in which humans and machine learning systems can co-operate to produce better forecasts. We await the results. If the computers do produce some insight, it may be because they can tap into data that we could hardly have imagined using before. Satellite imaging can now track the growth of crops or the stockpiling of commodities such as oil. Computers can guess at human sentiment by analysing web searches for terms such as “job seekers allowance”, mentions of “recession” in news stories, and positive emotions in tweets. 

And there are stranger correlations, too. A study by economists Kasey Buckles, Daniel Hungerman and Steven Lugauer showed that a few quarters before an economic downturn in the US, the rate of conceptions also falls. Conceptions themselves may be deducible by computers tracking sales of pregnancy tests and folic acid. 

Back in 1991, a psychologist named Harold Zullow published research suggesting that the emotional content of songs in the Billboard Hot 100 chart could predict recessions. Hits containing “pessimistic rumination” (“I heard it through the grapevine / Not much longer would you be mine”) tended to predict an economic downturn. 

His successor is a young economist named Hisam Sabouni, who reckons that a computer-aided analysis of Spotify streaming gives him an edge in forecasting stock market movements and consumer sentiment. Will any of this prove useful for forecasting significant economic and political events? Perhaps. But for now, here is an easy way to use a computer to help you forecast: open up a spreadsheet, note down what you believe today, and regularly revisit and reflect. The simplest forecasting tip of all is to keep score.
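Harford's closing advice – keep score – is easy to make concrete with the Brier score, the mean squared gap between each stated probability and what actually happened (1 if the event occurred, 0 if not). A small sketch, using made-up forecasts purely for illustration:

```python
# Grading probability forecasts with the Brier score. Lower is better;
# always saying 50% scores 0.25. The forecasts below are invented examples.

def brier_score(forecasts):
    """forecasts: list of (probability, outcome) pairs, outcome in {0, 1}."""
    return sum((p - outcome) ** 2 for p, outcome in forecasts) / len(forecasts)

my_record = [
    (0.8, 1),   # said 80% likely, it happened
    (0.6, 0),   # said 60% likely, it did not
    (0.1, 0),   # said 10% likely, it did not
]
print(round(brier_score(my_record), 3))  # 0.137
```

Kept in a spreadsheet or a script like this, the record is immune to the "I knew it would happen" effect, because the numbers were written down before the outcomes were known.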

Friday 18 May 2018

Struggling with revision? Here's how to prepare for exams more efficiently

Abby Young-Powell in The Guardian


If you’re one to put hours into revising for an exam only to be disappointed with the results, then you may need to rethink your revision methods. You could be wasting time on inefficient techniques, says Bradley Busch, a registered psychologist and director of InnerDrive. “You get people putting in lots of effort, but not in a directed way,” he says. Here are some of the common ways students unwittingly waste study time, and what experts recommend you do instead. 

Re-reading and highlighting notes

Re-reading and highlighting notes may feel like work, but it often won’t achieve much. The same goes for spending hours drawing up a revision timetable. Instead, psychologists recommend a technique called retrieval practice. This is anything that makes your brain work to come up with an answer. It can include doing quizzes, multiple choice tests, and past papers. “To really learn something, you’ve got to transfer information from working memory into long term memory, where you can store and later retrieve it,” says David Cox, a neuroscientist and journalist. “Committing something to long term memory isn’t easy, so it shouldn’t feel easy.”

Last-minute cramming

Beware of the planning fallacy, which is our tendency to underestimate how much time we really need to do something. It leads to sitting outside the exam hall with two hours to spare, desperately cramming. This is not an effective way to learn. “The information you gain quickly, you can lose quickly too,” says Busch.

The opposite of cramming is spacing, which is the practice of spacing out your revision over time, doing little and often. So one hour a day for seven days is better than cramming seven hours into one day, for example. It’s also good to incorporate interleaving into your revision. This is a fancy way of saying you should mix up your subjects during a revision session. “It forces you to think about the problem and the strategy you come up with,” says Busch.
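As a rough sketch of what spacing and interleaving look like in practice – the subjects, the week length and the number of daily slots below are arbitrary choices, not a prescription from the researchers quoted above – a revision plan simply rotates short sessions through the subjects instead of blocking each one into a single day:

```python
# A spaced, interleaved revision plan: short daily sessions that rotate
# through subjects rather than one long block per subject. Subjects and
# the number of daily slots are arbitrary choices for this sketch.
from itertools import cycle

subjects = ["maths", "history", "chemistry"]
days = 7
slots_per_day = 2          # e.g. two 30-minute sessions a day

rotation = cycle(subjects)
schedule = {
    f"day {d + 1}": [next(rotation) for _ in range(slots_per_day)]
    for d in range(days)
}

for day, topics in schedule.items():
    print(day, "->", ", ".join(topics))
```

Each subject gets several short, separated visits over the week (the spacing), and no single session dwells on only one subject for long (the interleaving).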

Making a study playlist

Sifting through the recommended study playlists on Spotify, trying to work out which songs will help you to concentrate, is usually a waste of time. But while listening to music can help you relax, and some students may have “trained” themselves to concentrate with it on, it’s still better to study in silence, Cox says. “You’re never going to be as productive having music on in the background, because it’s preventing your brain from acting at maximum capacity.”

Checking your phone

We may check our phones as often as once every 12 minutes. Obviously, this is a major distraction. That’s not all: research has shown that just having your phone in sight when you revise is enough to negatively affect your concentration, even if you don’t use it. And it’s a common trap to fall into. “I usually have my phone on silent mode, but to be honest, if it’s there I always check it,” says Chiara Fiorillo, who studies at City, University of London. Ideally it’s best to banish your phone to another room altogether.

Friday 1 February 2013

Are Footballers cleverer than PhD students? Think again



Ability is dictated by what we need to succeed. A chimp would fare better than me in a jungle – that doesn't make it smarter
A recent study has shown that footballers can perform better than PhD students on certain cognitive tasks. This is being interpreted in the mainstream media as evidence that footballers are smarter than PhD students. While this is something of a considerable extrapolation, it is a perfect example of how our views and ideas about what counts as "intelligence" are a lot more flexible than most would think.
Scientifically, there is no real consensus on how intelligence should be defined. IQ tests may seem like an obvious way to assess intelligence, but in psychology their use remains controversial. How can a test accurately measure something when there is no certainty as to what is being measured? When you've got some researchers demonstrating that intelligence is dependent on working memory capacity, and others arguing over whether it's supported by singular or multiple processes, you need to be reasonably intelligent just to keep up with the varying theories about what the word even means.
Intelligence is also strongly influenced by culture. What's considered smart in one culture could well be considered foolish in another. We are all guilty of this bias to some extent. In the UK, a detailed knowledge of science is considered intelligent by many, whereas a detailed knowledge of football usually isn't. But there's nothing to say someone's knowledge of football isn't just as complex and diverse as someone's knowledge of science, or more so. Football, though, is everywhere: you don't need a degree to know about it, and children play it all the time – so an in-depth knowledge of it is, perhaps unfairly, not considered an achievement.
Of course, knowing a lot of detailed information about something is only part of intelligence. It's also important to consider how this information is used. This division is referred to by some as crystallised and fluid intelligence, or information you retain and your ability to use it, respectively. Think of it like a computer: you've got your hard drive (data storage) and your processor (data usage); you need both to create a truly useful device.
Regularly exercising a particular set of mental abilities is reflected in changes to the structure of the brain, as the brain adapts and dedicates more resources to a constantly recurring demand. Therefore, it shouldn't be surprising that professional footballers would be better at certain mental abilities than non-footballers.
Whatever you think of the sport, a professional football match is undoubtedly a challenging context to be in. With so many variables to consider in a constantly changing scenario, it would be hard enough to keep on top of without thousands of people screaming at you for various reasons. Footballers have to be able to do this if they wish to get to the top of their field, so of course they'd perform better in tests that assess rapid thinking, attention and any other ability that isn't so crucial for other disciplines.
Footballers are stereotyped as being a bit thick, based on their unrefined behaviour and lack of social/cultural awareness. But these things haven't exactly held them back, so why would they have learned otherwise? Our abilities and skills are largely dictated by what we need to do in order to succeed. A chimpanzee would be far better equipped than I to survive in the jungle and would undoubtedly perform better than me in tests that assessed this. Still, I wouldn't let one fill in my tax return.
Perhaps intelligence is the wrong term to use; perhaps it would be fairer to say footballers and PhD students have differing mental abilities. But which of these abilities is considered "intelligent" seems to be a lot more subjective than most people realise.

Tuesday 30 October 2012

'You Are Not So Smart: Why Your Memory is Mostly Fiction....



So you remember your wedding day like it was yesterday. You can spot when something is of high quality. You keep yourself well-informed about current affairs but would be open to debate and discussion. You love your phone because it's the best, right? Are you sure? David McRaney from Hattiesburg, Mississippi, is here to tell you that you don't know yourself as well as you think. The journalist and self-described psychology nerd's new book, You Are Not So Smart, consists of 48 short chapters on the assorted ways that we mislead ourselves every day. "The central theme is that you are the unreliable narrator in the story of your life. And this is because you're unaware of how unaware you are," says McRaney. "It's fun to go through legitimate scientific research and pull out all of the examples that show how everyone, no matter how smart or educated or experienced, is radically self-deluded in predictable and quantifiable ways." Based on the blog of the same name, You Are Not So Smart is not so much a self-help book as a self-hurt book. Here McRaney gives some key examples.

Expectation

The Misconception: Wine is a complicated elixir, full of subtle flavours only an expert can truly distinguish, and experienced tasters are impervious to deception.
The Truth: Wine experts and consumers can be fooled by altering their expectations.
An experiment in 2001 at the University of Bordeaux had wine experts taste a red and a white wine to determine which was the best. They dutifully explained what they liked about each wine, but what they didn't realise was that scientists had dyed the same white wine red and told them it was red wine. The tasters described the sorts of berries and tannins they could detect in the red wine as if it really was red. Another test had them judge a cheap bottle of wine and an expensive one. They rated the expensive wine much more highly than the cheap one, with much more flattering descriptions. It was actually the same wine. This is not to say wine-tasting is pointless; it's to show that expectation can radically change experience. Yes, these people were experts, but that doesn't mean they can't be influenced by the same things as the rest of us, whether it be presentation or advertising or price. This drives home the idea that reality is a construction of the brain. You don't passively receive the outside world; you actively construct your experience moment by moment.

The Texas Sharpshooter Fallacy

The Misconception: We take randomness into account when determining cause and effect.
The Truth: We tend to ignore random chance when the results seem meaningful or when we want a random event to have a meaningful cause.
Imagine a cowboy shooting at the side of a barn over and over again. The side of the barn fills up with holes. If you walk over and paint a bullseye around a cluster of holes, it will look like you have made quite a lot of correct shots. It's a metaphor for the way the human mind naturally works when trying to make sense out of chaos. The brain is very invested in taking chaos and turning it into order. For example, in America it's very popular to discuss how similar the Lincoln and Kennedy assassinations were. Elected 100 years apart, Lincoln was killed in Ford's Theatre; Kennedy was in a Lincoln automobile made by Ford. They were both killed on a Friday, sitting next to their wives, by men with three names. And so on and so on. It's not spooky. People take hold of the hits but ignore the misses. They are pulled into the things that line up, and are similar or coincidental, but they ignore everything else that's not. The similarities are merely bullseyes drawn around the many random facts.
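The role of pure chance here is easy to simulate. The sketch below scatters "shots" uniformly at random over a gridded wall and then reports the most crowded cell; the grid size and shot count are arbitrary, and the cluster it finds means nothing at all – it is simply where chance happened to pile up.

```python
# Simulating the Texas sharpshooter fallacy: scatter shots uniformly at
# random across a wall divided into a grid, then report the busiest cell.
# The cluster is real, but meaningless - randomness alone produced it.
import random
from collections import Counter

random.seed(1)
GRID = 10          # 10 x 10 cells
SHOTS = 200

cells = Counter()
for _ in range(SHOTS):
    x, y = random.random(), random.random()
    cells[(int(x * GRID), int(y * GRID))] += 1

expected = SHOTS / GRID ** 2
busiest, hits = cells.most_common(1)[0]
print(f"expected hits per cell: {expected:.1f}")
print(f"busiest cell {busiest} took {hits} hits - a bullseye drawn after the fact")
```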

Confirmation Bias

The Misconception: Your opinions are the result of years of rational, objective analysis.
The Truth: Your opinions are the result of years of paying attention to information that confirmed what you believed, while ignoring information that challenged your preconceived notions.
Any cognitive bias is a tendency to think in one way and not another whenever your mind is on auto-pilot; whenever you're going with the flow. Confirmation bias is a tendency to pay attention to evidence that confirms pre-existing beliefs, notions and conclusions about life, and to completely ignore other information. This happens so automatically that we don't even notice. Say you have a flatmate, and you are arguing over who does most of the housework, and both people believe that they do most of the work. What is really happening is that both people are noticing when they do the work and not noticing when they don't. The way it plays into most of our lives is through the media that we choose to put into our brains: the television, news, magazines and books. We tend to only pick out things that line up with our pre-existing beliefs and rarely choose anything that challenges those beliefs. This is related to the backfire effect, which is a cognitive bias where, if we're presented with contradictory evidence, we tend to reject it and support our initial belief even more firmly. When people watch a news programme or pundit, they aren't looking for information so much as confirmation of what they already believe is going on.

Brand Loyalty

The Misconception: We prefer the things we own over the things we don't because we made rational choices when we bought them.
The Truth: We prefer the things we own because we rationalise our past choices to protect our sense of self.
Why do people argue over Apple vs Android? Or one car company versus another? After all, these are just corporations. Why would you defend a brand as if you are their PR representative? We believe that we prefer the things we own because we made these deep rational evaluations of them before we bought them, but most of the rationalisation takes place after you own the thing. It's the choosing of one thing over another that leads to narratives about why you did it, which usually tie in to your self-image.
There are at least a dozen psychological effects that play into brand loyalty, the most potent of which is the endowment effect: you feel like the things you own are superior to the things you don't. When you buy a product you tend to connect the product to your self-image, then once it's connected to your self-image you will defend it as if you're defending your own ego or belief structure.

The Misinformation Effect

The Misconception: Memories are played back like recordings.
The Truth: Memories are constructed anew each time from whatever information is currently available, which makes them highly permeable to influences from the present.
You might think your memory is a little fuzzy but not that it's completely inaccurate. People believe that memory is like a video or files stored in some sort of computer. But it's not like that at all. Memories are actually constructed anew each time that you remember something.
Each time you take an old activation sequence in your brain and re-construct it – like building a toy airplane out of Lego and then smashing the Lego, putting it back into the box, and building it again. Each time you build it, it's going to be a little bit different based on the context and experience you have had since the last time you created it.
Oddly enough, the least remembered memory is the most accurate. Each time you bring it into your life you edit it a little more. In 1974 Elizabeth Loftus had people watch a film of two cars having a collision and divided them into groups. She asked each group essentially the same question, but varied one word: how fast were the cars going when they contacted, hit, bumped, collided or smashed? The more violent the wording, the higher they estimated the speed. The way in which questions were worded altered the memories subjects reported.
They weren't looking back to the memory of the film they watched, they were building a new experience based on current information. Memory is actually very malleable and it's dangerous to think that memory is a perfect recording of a past event.

'You Are Not So Smart: Why Your Memory is Mostly Fiction, Why You Have Too Many Friends on Facebook and 46 Other Ways You're Deluding Yourself' by David McRaney (Oneworld, £8.99)

Friday 18 May 2012

VISUALISATION

Wayne Rooney reveals visualisation forms important part of preparation

• Manchester United striker: 'I visualise scoring wonder goals'
• Says Finland forward Jari Litmanen was an inspiration
 
Wayne Rooney has revealed how, ever since he was a very young player, he has visualised game patterns and goalscoring situations to enhance his performance.

The Manchester United and England striker told ESPN: "Part of my preparation is I go and ask the kit man what colour we're wearing – if it's red top, white shorts, white socks or black socks. Then I lie in bed the night before the game and visualise myself scoring goals or doing well. You're trying to put yourself in that moment and trying to prepare yourself, to have a 'memory' before the game. I don't know if you'd call it visualising or dreaming, but I've always done it, my whole life.

"When I was younger, I used to visualise myself scoring wonder goals, stuff like that. From 30 yards out, dribbling through teams. You used to visualise yourself doing all that, and when you're playing professionally, you realise it's important for your preparation."

Asked about his abilities as a developing player with regard to his peers Rooney added: "You're a bit more advanced than the kids your age, so there are times on the pitch where you can see different things, but they can't obviously see it. So then you get annoyed – they can't calculate.

"It's like when you play snooker, you're always thinking three or four shots down the line. With football, it's like that. You've got to think three or four passes where the ball is going to come to down the line. And the very best footballers, they're able to see that before – much quicker than a lot of other footballers."

Jari Litmanen, the former Ajax and Liverpool No10, provided one source of inspiration for Rooney. "I enjoyed how he moved and got into space," he said. "And he was patient. If you looked at him, he always never looked like he was rushed doing anything. He always used to take his time. Then, when the opportunity came, he found the space to get the ball in the net.

"The more you do it, the more it works. You need to know where everyone is on the pitch. You need to see everything."

Thursday 27 October 2011

Is modern science Biblical or Greek?


By Spengler

The "founders of modern science", writes David Curzon in Jewish Ideas Daily [1] of October 18, "were all believers in the truths of the opening chapter in the Hebrew Bible. The belief implicit in Genesis, that nature was created by a law-giving God and so must be governed by "laws of nature," played a necessary role in the emergence of modern science in 17th-century Europe. Equally necessary was the belief that human beings are made in the image of God and, as a consequence, can understand these "laws of nature."

Curzon argues that the modern idea of "laws of nature" stems from the Bible rather than classical Greece, for "ancient Greeks certainly believed that nature was intelligible and that its regularities could be made explicit. But Greek gods such as Zeus were not understood to have created the processes of nature; therefore, they could not have given the laws governing these processes."

Is this just a matter of semantics? Is there a difference between the "Greek" concept of intelligibility, and what Curzon calls the biblical concept of laws of nature? After all, the achievements of Greek science remain a monument to the human spirit. The Greek geometer Eratosthenes in the third century BCE calculated the tilt of the earth's axis, the circumference of the earth, and (possibly) the earth's distance from the sun. Archimedes used converging infinite series to calculate the area of conic sections, approximating the calculus that Newton and Leibniz discovered in the 17th century.
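One concrete instance of those converging series is Archimedes' quadrature of the parabola: each round of inscribed triangles adds a quarter of the area of the previous round, so the segment's area is 1 + 1/4 + 1/16 + … = 4/3 of the first triangle. A small numerical sketch of the partial sums (the unit triangle and the number of stages shown are arbitrary choices):

```python
# Archimedes' quadrature of the parabola as a converging series: each
# stage of inscribed triangles adds 1/4 of the previous stage's area,
# so the partial sums 1 + 1/4 + 1/16 + ... approach 4/3.
from fractions import Fraction

total = Fraction(0)
term = Fraction(1)          # area of the first inscribed triangle, taken as 1
for stage in range(8):
    total += term
    print(f"after stage {stage}: {total} = {float(total):.6f}")
    term /= 4

print("limit: 4/3 =", float(Fraction(4, 3)))
```

Archimedes could show that the sum never exceeds 4/3 and can be pushed as close to it as desired, but he stopped short of treating the completed infinite sum as an object in its own right – which is the leap discussed next.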

An enormous leap of mind, though, separates Archimedes' approximations from the new mathematics of the 17th century, which opened a path to achievements undreamed of by the Greeks. Something changed in the way that the moderns thought about nature. But does the rubric "laws of nature" explain that change? Curzon is on to something, but the biblical roots of modern science go much deeper.

Before turning to the scientific issues as such, it is helpful to think about the differences in the way Greeks and Hebrews saw the world. The literary theorist Erich Auerbach famously contrasted Greek and Hebrew modes of thought [2] by comparing two stories: the binding of Isaac in Genesis 22, and the story of Odysseus' scar told in flashback (Odyssey, Book 19).

Homer's hero has returned incognito to his home on the island of Ithaca, fearful that prospective usurpers will murder him. An elderly serving woman washes his feet and sees a scar he had received on a boar hunt two decades earlier, before leaving for the Trojan War, and recognizes him. Homer then provides a detailed account of the boar hunt before returning to his narrative.

Homer seeks to bring all to the surface, Auerbach explained in his classic essay. "The separate elements of a phenomenon are most clearly placed in relation to one another; a large number of conjunctions, adverbs, particles, and other syntactical tools, all clearly circumscribed and delicately differentiated in meaning, delimit persons, things, and portions of incidents in respect to one another, and at the same time bring them together in a continuous and ever flexible connection; like the separate phenomena themselves, their relationships - their temporal, local, causal, final, consecutive, comparative, concessive, antithetical, and conditional limitations - are brought to light in perfect fullness; so that a continuous rhythmic procession of phenomena passes by, and never is there a form left fragmentary or half-illuminated, never a lacuna, never a gap, never a glimpse of unplumbed depths."

Auerbach adds, "And this procession of phenomena takes place in the foreground - that is, in a local and temporal present which is absolute. One might think that the many interpolations, the frequent moving back and forth, would create a sort of perspective in time and place; but the Homeric style never gives any such impression."

Stark and spare, by contrast, is the story of God's summons to Abraham to sacrifice his beloved son Isaac. Where Homer tells us everything, the Bible tells us very little. God speaks to Abraham, and Abraham says, "Here I am." Auerbach observes, "Where are the two speakers? We are not told. The reader, however, knows that they are not normally to be found together in one place on earth, that one of them, God, in order to speak to Abraham, must come from somewhere, must enter the earthly realm from some unknown heights or depths. Whence does he come, whence does he call to Abraham? We are not told."

Abraham and Isaac travel together. Auerbach writes, "Thus the journey is like a silent progress through the indeterminate and the contingent, a holding of the breath, a process which has no present, which is inserted, like a blank duration, between what has passed and what lies ahead, and which yet is measured: three days!" Auerbach concludes:
On the one hand, externalized, uniformly illuminated phenomena, at a definite time and in a definite place, connected together without lacunae in a perpetual foreground; thoughts and feeling completely expressed; events taking place in leisurely fashion and with very little of suspense. On the other hand, the externalization of only so much of the phenomena as is necessary for the purpose of the narrative, all else left in obscurity; the decisive points of the narrative alone are emphasized, what lies between is nonexistent; time and place are undefined and call for interpretation; thoughts and feeling remain unexpressed, are only suggested by the silence and the fragmentary speeches; the whole, permeated with the most unrelieved suspense and directed toward a single goal (and to that extent far more of a unity), remains mysterious and "fraught with background".
Literary analysis may seem an unlikely starting-point for a discussion of science. But the Hebrew Bible's embodiment of what Auerbach called "the indeterminate and the contingent" has everything to do with the spirit of modern science. This emerges most vividly in the difference between the Greek and Hebrew understanding of time, the medium through which we consider infinity and eternity.

What separates Archimedes' approximation from Leibniz' calculus? The answer lies in the concept of infinity itself. Infinity was a stumbling-block for the Greeks, for the concept was alien to what Auerbach called their "perpetual foreground." Aristotle taught that whatever was in the mind was first in the senses. But by definition infinity is impossible to perceive. In the very large, we can never finish counting it; in the very small (for example infinitely diminishing quantities), we cannot perceive it. Infinity and eternity are inseparable concepts, for we think of infinity as a count that never ends.

For the Greeks, time is merely the demarcation of events. Plato understands time as an effect of celestial mechanics in Timaeus, while Aristotle in the Physics thinks of time as nothing more than the faucet-drip of events. That is Homer's time, in Auerbach's account. Biblical time is an enigma. That is implicit in Genesis, as Auerbach notes, but explicit in the Book of Ecclesiastes. Greek time is an "absolute temporal present."

In Hebrew time, it is the moment itself that remains imperceptible. Here is Ecclesiastes 3:15 in the Koren translation (by the 19th-century rabbi Michael Friedländer): "That which is, already has been; and that which is to be has already been; and only God can find the fleeting moment." As I wrote in another context, [3] Rabbi Friedländer's translation probably drew upon the celebrated wager that Faust offered the Devil in Goethe's drama. Faust would lose his soul if he attempted to hold on to the passing moment, that is, to try to grasp what only God can find. The impulse to grab the moment and hold onto it is idolatrous; it is an attempt to cheat eternity, to make ourselves into gods.

A red thread connects the biblical notion of time to modern science, and it is spun by St Augustine of Hippo, the 4th-century Church father and polymath. His reflection on time as relative rather than absolute appears in Book 11 of his Confessions. And his speculation on the nature of number in time takes us eventually to the modern conceptual world of Leibniz and the calculus. Aristotle's description of time as a sequence of moments, in Augustine's view, leads to absurdities.

To consider durations in time, we must measure what is past, for the moment as such has no duration. Events that have passed no longer exist, which means that measuring past time is an attempt to measure something that is not there at all. Augustine argues instead that we measure the memory of past events rather than the past itself: "It is in you, my mind, that I measure times," he writes. Our perception of past events thus depends on memory, and our thoughts about future events depend on expectation. Memory and expectation are linked by "consideration." For "the mind expects, it considers, it remembers; so that which it expects, through that which it considers, passes into that which it remembers."

Time is not independent of the intellect in Augustine's reading. Expectation and memory, Augustine adds, determine our perception of distant past and future: "It is not then future time that is long, for as yet it is not: But a long future, is 'a long expectation of the future,' nor is it time past, which now is not, that is long; but a long past is 'a long memory of the past.'" This is the insight that allows Augustine to link perception of time to the remembrance of revelation and the expectation of redemption.

A glimpse of what Augustine's theory of time implies for mathematics appears in his later book, Six Books on Music. I argued in a 2009 essay for First Things: [4]
In De Musica, Augustine seeks to portray "consideration" as a form of musical number, that is, numeri judiciales, "numbers of judgment." These "numbers of judgment" bridge eternity and mortal time; they are eternal in character and lie outside of rhythm itself, but act as an ordering principle for all other rhythms. They stand at the head of a hierarchy of numbers that begins with "sounding rhythms" - the sounds as such - which are in turn inferior to "memorized rhythms."

Only the "numbers of judgment" are immortal, for the others pass away instantly as they sound, or fade gradually from memory over time. They are, moreover, a gift from God, for "from where should we believe that the soul is given what is eternal and unchangeable, if not from the one, eternal, and unchangeable God?" For that reason the "numbers of judgment," by which the lower-order rhythms are ordered, do not exist in time but order time itself and are superior in beauty; without them there could be no perception of time. Memory and expectation are linked by the "numbers of judgment," which themselves stand outside of time, are eternal, and come from God.
That is an intimation of a higher order of number. Because it is buried in a treatise on musical time, Augustine's idea about "numbers of judgment" has elicited scant scholarly interest. But it is clear that his "numbers of judgment" are consistent with his much-discussed theory of "divine illumination." He wrote in Confessions, "The mind needs to be enlightened by light from outside itself, so that it can participate in truth, because it is not itself the nature of truth. You will light my lamp, Lord."

Descartes' "innate ideas" and Kant's "synthetic reason" descend from Augustine, although Kant recast the concept in terms of hard-wiring of the brain rather than divine assistance. The founder of neo-Kantian philosophy, Hermann Cohen (1842-1918) built his career out of the insight that the fact that infinitesimals in the calculus add up to a definite sum proves the existence of something like synthetic reason. That is why Kant triumphed in philosophy and the Aristotelians were reduced to a grumpy band of exiled irredentists.

Augustine's idea finds its way into modern science through Cardinal Nicholas of Cusa (1401-1464). Theologian and mathematician, Cusa noticed that musicians were tuning their instruments to ratios that corresponded to irrational numbers. The "natural" intervals of music tuning clashed with the new counterpoint of the Renaissance, so the musicians adjusted (or "tempered") the intervals to fit their requirements.
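
A hedged illustration of the clash (stated in terms of the twelve-tone equal temperament that later became standard, not the particular temperaments of Cusa's own day): the ''natural'' fifth is the rational ratio 3:2, while the tempered fifth splits the octave evenly, so that

\[
\text{pure fifth} = \frac{3}{2} = 1.5, \qquad \text{tempered fifth} = 2^{7/12} \approx 1.4983,
\]

and 2^{7/12}, like every power 2^{k/12} with 0 < k < 12, is irrational - no ratio of whole numbers reproduces it exactly.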

The Greeks abhorred the notion of irrational number because they abhorred infinity. Aristotle understood that infinity lurked in the irrational numbers, for we can come infinitely close to an irrational number through an endless succession of approximations but never quite get there. And the idea of an ''actual infinity'' offended the Greek notion of intelligibility. To medieval mathematicians, the irrationals were surds, or ''deaf'' numbers, that is, numbers that could not be heard in audible harmonic ratios. The association of rational numbers with musical tones was embedded so firmly in medieval thinking that the existence of an irrational harmonic number was unthinkable.

The practice of musicians, Cusa argued, overthrew Aristotle's objections. The human mind, he maintained, could not perceive such numbers through reason (ratio), i.e. the measuring and categorizing faculty of the mind, but only through the intellect (intellectus), which depended on participation (participatio) in the Mind of God.

Cusa's use of Augustinian terminology to describe the irrationals - numbers ''too simple for our mind to understand'' - heralded a problem that took four centuries to solve (and, according to the few remaining "Aristotelian realists," remains unsolved to this day).

Not until the 19th century did mathematicians arrive at a rigorous definition of irrational number, as the limit of an infinite converging sequence of rational numbers. That is simple, but our mind cannot understand it directly. Sense-perception fails us; instead, we require an intellectual leap to the seemingly paradoxical concept of a convergent infinite sequence of rational numbers whose limit is an irrational number.
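
A minimal sketch of that definition, using the stock example of \(\sqrt{2}\): each truncation of its decimal expansion is a rational number, the successive terms crowd arbitrarily close together, and the irrational number is defined as the limit they jointly determine, though no single term ever reaches it:

\[
1,\;\frac{14}{10},\;\frac{141}{100},\;\frac{1414}{1000},\;\frac{14142}{10000},\;\ldots \;\longrightarrow\; \sqrt{2}.
\]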

The irrational numbers thus lead us out of the mathematics of sense-perception, the world of Euclid and Aristotle, into the higher mathematics foreshadowed by Augustine (see my article, ''Nicholas of Cusa's Contribution to Music Theory,'' in Rivista Internazionale di Musica Sacra, Vol. 10, July-December 1989).

Once irrational numbers had forced their way into Western thinking, the agenda changed. Professor Peter Pesic [5] recently published an excellent account of the impact of irrational numbers in musical tuning on mathematics and philosophy. [6]

Another two centuries passed before Leibniz averred, ''I am so in favor of the actual infinite that instead of admitting that nature abhors it, as is commonly said, I hold that nature makes frequent use of it everywhere, in order to show more effectively the perfections of its author.'' Theological concerns, one might add, also motivated Leibniz' work, as I sought to show in ''The God of the Mathematicians'' (First Things, August-September 2010).

Unlike Archimedes, who still thought in terms of approximations using rational numbers, Leibniz believed that he had discovered a new kind of calculation that embodied the infinite. Leibniz' infinitesimals (as I reported in ''The God of the Mathematicians'') lead us eventually to Georg Cantor's discovery of different orders of infinity and the transfinite numbers that designate them; Cantor cited Cusa as well as Leibniz as his antecedents, explaining: ''Transfinite integers themselves are, in a certain sense, new irrationalities. Indeed, in my opinion, the method for the definition of finite irrational numbers is quite analogous, I can say, is the same one as my method for introducing transfinite integers. It can be certainly said: transfinite integers stand and fall together with finite irrational numbers.''
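
To unpack Cantor's analogy in modern notation (my gloss, not his wording): just as an irrational number is introduced as the limit of an unending sequence of rationals, the first transfinite number \(\omega\) is introduced as the limit of the unending sequence of finite integers:

\[
1,\,2,\,3,\,4,\,\ldots \;\longrightarrow\; \omega, \qquad \text{just as} \qquad 1,\;\tfrac{14}{10},\;\tfrac{141}{100},\;\ldots \;\longrightarrow\; \sqrt{2}.
\]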

Gilles Deleuze (in The Fold: Leibniz and the Baroque) reports that Leibniz ''took up in detail'' Cusa's idea of ''the most simple'' number: ''The question of harmonic unity becomes that of the 'most simple' number, as Nicolas of Cusa states, for whom the number is irrational. But, although Leibniz also happens to relate the irrational to the existent, or to consider the irrational as a number of the existent, he feels he can discover an infinite series of rationals enveloped or hidden in the incommensurable.'' Leibniz thus stands between Cusa in the fifteenth century and the flowering of the mathematics of infinite series in the nineteenth century. That is a triumph of the biblical viewpoint in modern science.
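
Leibniz' own series for \(\pi\) - not cited in the passages above, but the textbook case - shows exactly what ''an infinite series of rationals enveloped or hidden in the incommensurable'' looks like: every term is a rational number, while the sum is not:

\[
\frac{\pi}{4} \;=\; 1 - \frac{1}{3} + \frac{1}{5} - \frac{1}{7} + \frac{1}{9} - \cdots
\]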

We can thus draw a red line from the Hebrew Bible (most clearly from Ecclesiastes) to Augustine, and through Nicholas of Cusa to G. W. Leibniz and the higher mathematics and physics of the modern world. The Hebrew Bible remains a force in modern science, despite the best efforts of rationalists and materialists to send it into exile.

Kurt Goedel, perhaps the greatest mathematician of the 20th century, approached all his work with the conviction that no adequate account of nature was possible without the presence of God. Inspired by Leibniz, Goedel destroyed all hope of a mechanistic ontology through his two Incompleteness Theorems, and through his work on the Continuum Hypothesis, whose undecidability he and, later, Paul Cohen established, as I reported in a recent First Things essay. [7]

There is always a temptation to offer simple homilies in honor of the Bible, for example, "intelligent design" theory, which in my view tells us nothing of real importance. An atheist like Spinoza also would contend that God designed the world, because in his philosophy God is the same thing as nature. Design contains no information about the unique and personal God of the Bible.

Curzon's discussion of the laws of nature is by no means wrong, but it would be wrong to leave the matter there. "The fear of God is the beginning of wisdom." As Ecclesiastes said, "I have observed the task which God has given the sons of man to be concerned with: He made everything beautiful in its time; He also put an enigma [sometimes "eternity"] into their minds so that man cannot comprehend what God has done from beginning to end" (Ecclesiastes 3:11, Artscroll translation). Eternity is in our minds, but the whole of creation is hidden from us. Stephen Hawking has gone so far as to conjecture that something like Goedel's Incompleteness Theorem might apply to physics as well as to mathematics.

What divides Hebrews from Greeks, above all, is a sense of wonder at the infinitude of creation and human limitation. The Odyssey is intended to be heard and enjoyed; Genesis 22 is to be searched and searched again for layers of meaning that are withheld from the surface. The Greek gods were like men, only stronger, better-looking and longer-lived, immortal but not eternal, and the Greeks emulated them by seeking to become masters of a nature infested by gods. The Hebrews sought to be a junior partner in the unending work of creation. With due honor to the great achievements of the Greeks, modernity began at Mount Sinai.