
Tuesday 2 May 2023

Political lobbyists are pretending to be NGOs & fooling tax dept.

 Jaitirth Rao in The Print


There has been quite a bit of noise about the current dispensation being against what is referred to as “civil society”. One expects this kind of diatribe from illiberal Lefties. But such is the stranglehold of these ideas and ideologies that this slanted view has now started gaining wider traction. The principal objection seems to be that the Foreign Contribution Regulation Act 2010 is being weaponised against some NGOs. This and related issues are worth examining in some detail.

When the Congress-led UPA 2 introduced draconian provisions in the FCRA law in 2010, I had gone on record opposing it. My article on that issue is available in the public domain. I mention this because I want it to be clear that I am not the usual adversary — the “fascist” supporter of the FCRA.

The FCRA is supposed to regulate foreign contributions. It has a provision that if foreign funds are received by an NGO, then the latter is required to use them for its own charitable purposes. The funds are not to be diverted to other NGOs or charity organisations. Based on the advice of some dubious and clever chartered accountants, some NGOs, instead of making contributions to other non-profits — which they are now prohibited from doing — have come up with an "innovative" solution. They are "paying" other NGOs for "services". These services are usually in the grey and ambiguous domain of "consultancy". Now, clearly, the NGOs are trying to "indirectly" achieve what the law prohibits them from doing "directly".

None of these NGOs are babes in the woods. They are acquainted with common law cases. There are hundreds of cases in the US, a country close to the purse strings of these NGOs, saying that it is impermissible to do indirectly what is not permitted directly. How can it be that if the Indian State invokes a common law principle so clearly enunciated in the US, it suddenly becomes a fascist enemy of decent NGOs? As it turns out, virtually all the regulatory action against foreign-funded NGOs has been for this reason. 

Don’t tread where MNCs failed

As someone who has dealt with tax authorities in nine different countries over the last 49 years, let me assure the clever chartered accountants advising these NGOs that corporations and banks have been experimenting with these devices and playing with these loopholes for decades and have rarely, if ever, succeeded. The amateurish attempts by these NGOs to fool the tax department are going to get them nowhere. Where large multinational corporations (MNCs) have failed, NGOs should not tread.

Several ill-advised NGOs have gone one step further. They have tried to pretend that contributions received from their foreign donors have not been donations but payments for the elusive consultancy services rendered by their Indian arms to those foreign donors. Such obviously foolish attempts are bound to get them into trouble. There is no point in complaining after the fact.

Foreign-funded NGOs are welcome in our country if they wish to perform “charitable” acts like helping the visually challenged, the terminally ill, or the differently abled. As a country, we have been reasonably kind in supporting causes like leprosy alleviation or livelihood creation, even if the ultimate aim behind these good deeds has been religious proselytisation. In this regard, we have gone against the dictums of MK Gandhi who vociferously opposed “do-good” missionaries. But when foreign-funded NGOs start getting involved in political lobbying in India, we have a problem.

Some of us are old enough to remember that the Central Intelligence Agency (CIA) subsidiary, the NGO known as the Congress for Cultural Freedom, funded Indian magazines like Quest in the ’50s and ’60s. Some of us have also read the testimony of Soviet Union archivist Vasili Mitrokhin who regularly made sure that more copies of Russian translations of Hindi poets were printed and “sold” than their Hindi originals. This too happened in the ’50s, ’60s, and ’70s. Again, some of us remember that the head of the Ford Foundation in Delhi could get on to Jawaharlal Nehru’s calendar easily and that some of our tragicomic policy initiatives came from this august institution. Foreign-funded NGOs trying to tell us what taxation policies we should follow are really pushing their luck. And that is exactly what several of them have done before and are doing right now. Fortunately, one of them is now under a regulatory scanner. The Indian State, as is usually the case, has been dilatory. But better late than never.

The anti-State menace

Foreign-funded NGOs and foreign media have been against the Indian State and any strong dispensation for more than 70 years now. They prefer pusillanimous clientelist governments in India. They pilloried Panditji for his soft stance with the Soviets during the 1956 Hungarian revolution. They are now upset that we are not as anti-Russia as they would like. They have also made a devil’s bargain with blatantly Islamist organisations such as the US-based Council on American-Islamic Relations (CAIR).

This is why they prefer to refer to Indian Muslim gangsters as politicians. They talk of trigger-happy police officers in India. There are, of course, no such officers in the US. They prefer to characterise the Citizenship (Amendment) Act 2019 as obnoxious and anti-Muslim. I beg to differ. The Act is in favour of persecuted religious minorities in India’s neighbouring countries. These NGOs and the media do not bleed for Sikh shopkeepers, Hindu girls, and Parsis in our neighbourhood. They support the quixotic “farmers’” agitation in India when everybody knows that it was a “middle-man” affair. And they are silent about Canada’s blatant persecution of its truckers.

Let us now revert to our own domestic uncivil society. Under the previous dispensation, a bunch of impractical Lefties got together. They had never run factories or created jobs but managed to ingratiate themselves with the powers that were and became members of the pompous National Advisory Council (NAC). Their "advice" usually resulted in the active sabotage of the intelligent policies that Manmohan Singh was trying to implement. One feels sorry for Singh, who had to constantly look over his shoulder to avoid being bitten by this overweening Dracula. The combined NGO menace got so bad that the hapless former PM, in an interview to the journal Science, blamed American NGOs for sabotaging the India-US nuclear deal, which had the support of the elected governments of both countries.

The simple fact is that the so-called civil society NGOs, who had support from the NAC and who could defy Singh quite easily, are now defanged and stand without protection. All that they can do is write strong pieces in the English press in India and appeal to their patrons in foreign papers to give them some oxygen. There is an old English saying: “They say, let them say…”

Call them by their right name

It is interesting to note that for the illiberal Left, references to "civil society" almost invariably mean references to NGOs, many with explicit political agendas. Are Sangeetha Sabhas, Bhajan Mandalis, regional associations (like Kannada Sangha in Mumbai, Maratha Mandali in Chennai, Odiya Sahitya Sabha in Bengaluru, Durga Puja Association in Pune), and traditional charities (like the Red Cross, Saint Judes, National Association for the Blind) not part of civil society? If any of them run afoul of tax authorities, will there be any media coverage? The French traveller Alexis de Tocqueville makes reference to voluntary organisations as being central to the American democratic experience. To this day, more than three-quarters of the fire brigades in American small towns and suburbs are manned by volunteers. Churches and synagogues organise charitable activities. Rotary, Lions, and Giants clubs are part of civil society, as also, oddly enough, is the Masonic Lodge.

All of these institutions derived their funding from members of their immediate physical communities. This is the civil society that de Tocqueville praised. He would be shocked if told that quasi-political lobbying groups who obtain money from foreign countries in order to influence American politics were to be referred to as members of the voluntary, citizen-supported civil society that he held up as an exemplar of grassroots democracy.

We need to get our vocabulary right and refer to political lobbyists by their correct name. Our ancients told us that getting the right “nama-rupa” or “word and form” will automatically make our arguments solid. When we revert to that tradition, it will be clear that genuine members of civil society are not complaining. Political lobbyists are indulging in grievance-mongering, which I hope and pray we quietly ignore.

Sunday 5 December 2021

The Long Shadow of Deobandism in South Asia

The new Taliban government in Afghanistan represents the realization of the 155-year-old Deobandi movement’s objective of establishing a regime led by Sunni clergy by Kamran Bokhari in newlinesmag.com


Illustration by Joanna Andreasson for New Lines


I might have been 11 when I first heard the word “Deobandi.” My family had just returned to Islamabad after eight years in New York, where my father served as a mid-ranking official at Pakistan’s mission to the United Nations. A year had passed since military ruler, Gen. Zia-ul-Haq, began subjecting Pakistan to his despotic Islamization agenda, and I was being exposed to a lot more than my brain could process. My dad despised Zia for two separate reasons. The first was obviously political. My father was a democrat and a staunch supporter of ousted Prime Minister Zulfikar Ali Bhutto, who had been executed after Zia’s 1977 coup. The second was religious. Our ancestors were from the Barelvi sect, which constituted the vast majority of Pakistanis at the time and has been the historical rival of the Deobandis, whom Zia had begun to capacitate to gain legitimacy for his regime. For the most part a secular individual, my father had always been passionate about our supposed ancestral lineage to medieval Sufi saints. Deobandis are Hanafi Sunni Muslims like Barelvis, but for him they represented local variants of the extremist brand known as Wahhabism, which originated in the Arabian Peninsula. And with Zia empowering their mullahs, mosques and madrassas, he thought it was his duty to protect his heritage, and I was given a crash course on the sectarian landscape.

While most Islamists in the Arab/Muslim world are more activists than religious scholars, in South Asia the largest Islamist groups are led by traditional clerics and their students. And the Deobandi sect has been in the forefront of South Asian Islamism, with the Taliban as its most recent manifestation. The Deobandis’ influence, reach and relevance in a vast and volatile region like South Asia is immense, yet they are little understood in the West. Western scholarship and commentary tend to be more focused on the movement’s counterparts in the Arab world, namely the Muslim Brotherhood and Wahhabi Salafism.

Deobandism was propelled by ulema lamenting the loss of Muslim sovereignty in India. Different dynastic Muslim regimes had ruled over various regions in the subcontinent since the late 10th and early 11th centuries. The ulema had been part of the South Asian Muslim political elite, but their public role was always subject to a tug of war with the rulers and evolved over time.

They had a strong presence in the royal court from the time of the first Muslim sultanistic dynasty in the subcontinent: the Turkic Ghaznavids (977-1170), who broke off from the Persian Samanids (who themselves had declared their independence from the Abbasid Caliphate in Baghdad). It was during this era that the role of ulema began to change in that a great many of them from Central Asia invested in proselytization and spiritual self-discipline. This spiritual approach gained ground and distinguished itself from the legalistic approach of the ulema. The former took on a social and grassroots role while the latter continued to focus on directly influencing the sultan and, through his sultanate, the realm at large. Behind both movements were ulema who, to varying degrees, subscribed to Sufism. The difference was between those who swung heavily toward scriptural scholarship and those who were open to unorthodox ideas and practices in keeping with what they perceived as the need to accommodate local customs and exigencies. This divide would remain contained and the ulema would enjoy an elite status, which continued through the era of the Ghaurids (1170-1215) — an Afghan dynasty.

Essentially the ulema provided legitimacy for the rulers and in exchange received largesse and influence in matters of religion. It was under the Sultanate of Delhi (1206-1525) that the ulema were appointed to several official state positions, largely within the judiciary. In addition, a state law enforcement organ called hisbah was created for ensuring that society conformed to shariah, which is the origin for the modern-day agencies in some Muslim governments assigned the task of “promoting virtue and preventing vice.” It was an arrangement that allowed the ruler to keep the ulema in check and incapable of intruding into matters of statecraft.

After the Delhi Sultanate collapsed in 1526, it was replaced by another Turkic dynasty, the Mughals, under whom the ulema were marginalized. In his award-winning 2012 book, "The Millennial Sovereign: Sacred Kingship and Sainthood in Islam," Azfar Moin, who heads the University of Texas at Austin's Religious Studies Department, explains that during the reigns of Akbar (r. 1556-1605), his son Jehangir (r. 1605-27) and his grandson Shah Jehan (r. 1628-58), the ulema would remain in the political wilderness.

It was Akbar's great-grandson, Aurangzeb (r. 1658-1707), who not only restored the ulema to their pre-Akbar status but also radically altered the empire's structure by theocratizing it. His Islamization agenda was a watershed moment, for it created the conditions in which the ulema would eventually gain unprecedented ground. What enabled the advance was the fact that Aurangzeb was the last effective emperor, leading to not just the collapse of the Mughal empire but also the ascendance of British colonial rule. These two sequential developments would essentially shape the conditions in which Deobandism, and later on, radical Islamism, would emerge, as Princeton scholar Muhammad Qasim Zaman argues in his seminal 2007 book "The Ulama in Contemporary Islam: Custodians of Change."

Over the course of the next two centuries, an ulema tendency that stressed the study of original Islamic sources and deemphasized the role of the rational sciences gained strength. Started by Shah Abdur Rahim, a prominent religious scholar in Aurangzeb’s royal court, this multigenerational movement was carried forward by his progeny, which included Shah Waliullah Delhawi, Shah Abdul Aziz and Muhammad Ishaq. This line of scholars represented the late Mughal era puritanical movement.

Delhawi, who was its most influential theoretician, was a contemporary of the founder of Wahhabism in the Arabian Peninsula, Muhammad ibn Abd al-Wahhab. The two even studied at the same time in Medina under some of the same teachers who exposed them to the ideas of the early 14th century iconoclastic Levantine scholar Ibn Taymiyyah. Salafism and Deobandism, the two most fundamentalist Muslim movements of the modern era, simultaneously emerged in the Middle East and South Asia, respectively. According to the conventional wisdom, the extremist views of Wahhabism spread from the Middle East to South Asia. In reality, however, Delhawi and Wahhabism’s founder drank from the same fountain in Medina — under an Indian teacher by the name of Muhammad Hayyat al-Sindhi and his student Abu Tahir Muhammad Ibn Ibrahim al-Kurani. A major legacy of Delhawi is Deobandism, which arose as Wahhabism’s equivalent in South Asia in the late 19th century. Similar circumstances led to the near simultaneous rise of the Muslim Brotherhood in the Middle East and Jamaat-i-Islami in South Asia in the early 20th century. These connections go to show how the two regions often influence each other in more significant ways than usually acknowledged.

For this clerical movement shaped by Delhawi, Muslim political decay in India was a function of religious decline, the result of the contamination of thought and practice with local polytheism and alien philosophies. Insisting that the ulema be the vanguard of a Muslim political restoration, these scholars established a tradition of issuing fatwas to provide common people with sharia guidance for everyday issues. Until then, such religious rulings had been largely the purview of the official ulema who held positions in the state. This group was responsible for turning the practice into a nongovernmental undertaking at a time when the state had become almost nonexistent. By the time Ishaq died in the mid-19th century, he had cultivated a group of followers including Mamluk Ali and Imdadullah Muhajir Makki, who were mentors of the two founders of Deobandism, Muhammad Qasim Nanautavi and Rashid Ahmad Gangohi.

Renowned American scholar of South Asian Islam Barbara Metcalf, in her 1982 book "Islamic Revival in British India: Deoband, 1860-1900," explains how the emergence of Deobandism was rooted in both ideological and practical concerns. It began when Nanautavi and Gangohi established the Dar-ul-Uloom seminary in the town of Deoband, some 117 miles north of Delhi, in 1866 — eight years after participating in a failed rebellion against the British conquest of India.

These two founders of the movement had already tried forming an Islamic statelet in a village called Thana Bhawan, north of Delhi, from where they sought to wage jihad against the British, only to be swiftly defeated. William Jackson explains in great detail, in his 2013 Syracuse University dissertation, the story of how the two formed a local emirate — a micro-version of the one achieved by the Taliban. Their mentor Makki became emir-ul-momineen, the Leader of the Faithful, and the two served as his senior aides — Nanautavi as his military leader and Gangohi as his judge. The tiny emirate was crushed by the British within a few months. Imdadullah fled to Mecca, Gangohi was arrested and Nanautavi fled to Deoband, where he sought refuge with relatives.

Realizing there was no way to beat the British militarily, Nanautavi sought to adopt the empire’s educational model and established a school attached to a mosque. His decision would be instrumental in shaping the course of history, ultimately helping to lay the groundwork for Indian independence, the creation of Pakistan and the rise of modern jihadist groups including the Afghan and Pakistani Taliban.

From Nanautavi's point of view, European Christians were now masters of the land long ruled by Indian Muslims. He thus envisioned the seminary as an institution that would produce a Muslim vanguard capable of restoring the role of the ulema in South Asian politics and even raising it to unprecedented levels. His priority was religious revival and, after Gangohi was released from prison, the madrassa at Deoband became the nucleus for a large network of similar schools around the country.

After Nanautavi died in 1880, Mahmud Hassan, the first student to enroll in Dar-ul-Uloom, led the Deobandi movement. Hassan transformed the movement from one focused on a local concern into one with national and international ambitions. Students from Russia, China, Central Asia, Persia, Turkey, the Levant and the Arabian Peninsula came to study at the seminary under his leadership. By the end of the First World War, more than a thousand graduates had fanned out across India. Their main task was to expunge ideas and practices that had crept into Indian Muslim communities through centuries of interactions with the Hindu majority.

This quickly antagonized the predominant Muslim tendency that was rooted in Sufi mysticism and South Asian Islamic traditions. This movement started to organize in response to the Deobandis, in another Indian town called Bareilly, and was led by Ahmed Raza Khan (1856-1921). The Barelvis, as the rival movement came to be known, viewed the Deobandis as a greater threat to their religion and country than British colonial rule. This rivalry continues to define religious and political dynamics across South Asia to this day.

Although the Deobandis viewed India as Dar-ul-Harb (Dominion of War), they initially did not try to mount another armed insurrection. Instead, they opted for a mainstream approach to politics that called for Hindu-Muslim unity. The Barelvis, meanwhile, took up controversial positions that unintentionally helped the Deobandis gain support. In particular, a fatwa by the Barelvi leader, Ahmed Raza Khan, in which he ruled that the Ottoman Empire was not the true caliphate, angered many Indian Muslims and drove them closer to the pan-Islamic, anti-British vision of the Deobandis. In fact, given the success of the Deobandi movement on a sectarian level, it never really viewed the Barelvis as a serious challenge.

While the Deobandi and Barelvi movements were taking shape, so was a modernist Muslim movement led by Sir Syed Ahmed Khan (1817-98). A religious scholar turned modern intellectual, Sir Syed hailed from a privileged family during the late Mughal era and worked as a civil servant during British rule. From his point of view, Muslim decline was a direct result of a fossilized view of religion and a lack of modern scientific knowledge. Sir Syed would go on to be the leader of Islamic modernism in South Asia through the founding of the Aligarh University. The university produced the Muslim elite, which would, almost half a century after Sir Syed's death, found Pakistan.

Sir Syed's prognosis of the malaise affecting the Muslims of India was unique and clearly different from that of the long line of religious scholars who saw the problem as a function of the faithful having drifted away from Islam's original teachings. The loss of sovereignty to the British, combined with the rise of men like him who advocated a cooperative approach toward the British and an embrace of European modernity, would lead the founders of Deoband to adopt their own pragmatic approach, but one that laid heavy emphasis on religious education. The Deobandis viewed Sir Syed's Islamic modernism as their principal competitor. In other words, the Aligarh movement, which also developed around a university — but one that emphasized Western secular education — represented a major challenge, not just politically but also religiously, in that it offered an alternative paradigm.

Barely half a century after its founding, the Deobandi movement had established seminaries across India, from present-day Bangladesh in the east to Afghanistan in the west. Such was its influence that in 1914, Khan Abdul Ghaffar Khan, the leader of the secular Pashtun Khudai Khidmatgars movement, visited the Dar-ul-Uloom. Ghaffar Khan, who would later earn the moniker “The Frontier Gandhi,” met the Deobandi leader Mahmud Hassan to discuss the idea of establishing a base in the Pashtun areas of northwest India, from which they could launch an independence rebellion against the British. Harking back to the armed struggle of their forerunners, the Deobandis, once again, tried their hand at jihad — this time on a transnational scale.

With the help of Afghanistan, Ottoman Turkey, Germany and Russia, Hassan sought to foment this insurrection, believing that Britain would be too focused on fighting the First World War on the battlefields of Europe to be able to deal with an uprising in India. The plan was ambitious but foolhardy. Hassan wanted to headquarter the insurgent force in Hejaz, in modern-day Saudi Arabia, with regional commands in Istanbul, Tehran and Kabul. He traveled to Hejaz, where he met with the Ottoman war minister, Anwar Pasha, and the Hejaz governor, Ghalib Pasha. The Ottomans strongly supported an Indian rebellion as a response to the British-backed Arab revolt against them. The plan failed in great part because the Afghan monarch, Emir Habibullah Khan, would not allow an all-out war against the British to be waged from his country's soil. Hassan, the Deoband leader, was ensconced in Mecca when he was arrested by the Hashemite ruler of the Hejaz, Sharif Hussein bin Ali, and handed over to the British. He was imprisoned on the island of Malta.

During the four years that Hassan was jailed, several key developments took place back home in India. The most important was the launch of the 1919 Khilafat (Caliphate) Movement by a number of Muslim notables influenced by Deobandism. As prominent historian of South Asian Islam, Gail Minault, argues in her 1982 book “Religious Symbolism and Political Mobilization in India,” the Khilafat Movement, which lobbied Turkey’s new republican regime to preserve the caliphate, was actually a means of mobilizing India’s Muslims in a nationalist struggle against the British. This would explain why the movement received the support of Mahatma Gandhi in exchange for backing his Non-Cooperation Movement against the British. At around the same time, several Deobandi ulema created Jamiat Ulema-i-Hind (JUH), which would become the formal political wing of the movement – engaging in a secular nationalist struggle.

When Hassan was released from prison and returned to India, Gandhi traveled to Bombay to receive him. Hassan went on to issue a fatwa in support of the Khilafat and Non-Cooperation movements, which was endorsed by hundreds of ulema. Under his leadership, the Deobandis also supported Gandhi’s candidacy for the presidency of the Indian National Congress. The move was in keeping with their point of view that the Hindu majority was not a threat to Islam and the real enemies were the British.

As the Deobandi movement pushed for Hindu-Muslim unity, it underwent another leadership change. Ill from tuberculosis, Hassan died in November 1920, six months after his release from prison. He was succeeded by his longtime deputy Hussain Ahmed Madani, who engaged in a major campaign calling for joint Hindu-Muslim action against the British. With Madani at the helm, the Deobandis argued that movements organized along communal lines played into the hands of the colonial rulers and advanced the idea of “composite nationalism.” A united front was needed to end the British Empire’s dominance. This view ran counter to the atmosphere of the times and, following the collapse of the Deobandis’ transnational efforts, the movement’s nationalist program also floundered.

The All-India Muslim League (AIML), headed by the future founder of Pakistan, Muhammad Ali Jinnah, was growing in strength and steering Indian Muslims toward separatism. At the same time, the Deobandis’ JUH and Gandhi’s Indian National Congress intensified their demand for Indian self-government. The situation came to a head with massive nationwide unrest in 1928. To defuse the situation, the British asked Indian leaders to put forth a constitutional framework of their own. In response, the Indian National Congress produced the Nehru Report, a major turning point for the Deobandis. The report by their erstwhile allies ignored the JUH demand for a political structure that would insulate Muslim social and religious life from central government interference. This led dissenting members of the JUH and among the wider Deobandi community to join AIML’s call for Muslim separatism.

While prominent Deobandi scholar Ashraf Ali Thanvi would initiate the break, it was his student, Shabbir Ahmed Usmani, who led the split. Usmani would spearhead a reshaping of the Deobandi religious sect and play a critical role in charting the geopolitical divide that still defines South Asia today. In 1939, Thanvi issued a fatwa decreeing that Muslims were obligated to support Jinnah’s separatist AIML. He then resigned from the Deoband seminary and spent the four remaining years of his life supporting the creation of Pakistan.

Thanvi and Usmani realized that if the Deobandis did not act, the Barelvis — already allied with the AIML — could outmaneuver them. Better organized and one step ahead of their archrivals, the Deobandis were able to position themselves as the major religious allies of the AIML. It is important to note, however, that many Deobandis remained loyal to Madani’s more inclusive approach. They viewed his stance as in keeping with the Prophet Muhammad’s Compact of Medina, which had ensured the cooperation of various non-Muslim tribes. In contrast, Usmani and the renegade Deobandis had long been deeply uncomfortable with the idea of Hindu-Muslim unity, which conflicted with their religious puritanism. When Usmani established Jamiat Ulema Islam (JUI) in 1945 as a competitor to Madani’s JUH, the deep schism within the Deoband movement had reached a point of no return. Usmani’s insurrection came at the perfect time for Jinnah, a secular Muslim politician with an Ismaili Shia background. Jinnah had long sought to weaken JUH’s opposition to his Muslim separatist project; the support of Usmani lent religious credibility to his cause: creating the state of Pakistan.

After partition in 1947, the spiritual home of the Deobandi movement remained in India, but Pakistan was now its political center. When they founded JUI, Usmani and his followers already knew that it was way too late in the game for their group to be the vanguard leading the struggle for Pakistan. The AIML had long assumed that mantle, but it was not too late for the JUI to lead the way to Islamizing the new secular Muslim state. In fact, Jinnah’s move to leverage the Islamic faith to mobilize mass demand for a secular Muslim homeland had left the character of this new state deeply ambiguous. Such uncertainty provided the ideal circumstances for JUI to position itself at the center of efforts to craft a constitution for Pakistan. In the new country’s first Parliament, the Constituent Assembly, JUI spearheaded the push for an “Islamic political system.”

The death of secularist Jinnah in September 1948 created a leadership vacuum, which helped JUI’s cause. As a member of the assembly, the JUI leader Usmani played a lead role in drafting the Objectives Resolution that placed Islam at the center of the constitutional process. The resolution stated that “sovereignty over the entire universe belongs to God Almighty alone and the authority which He has delegated to the State of Pakistan.” It went on to say that “the principles of democracy, freedom, equality, tolerance and social justice” must be followed “as enunciated by Islam.” Adopted in 1949, the Objectives Resolution marked a huge victory for JUI and other Islamists.

By mid-1952, JUI appeared to be on its way to achieving its objectives. Within months, however, the situation soured. Along with other Islamist groups, it launched a violent nationwide protest movement against the minority Ahmadiyya sect, believing that the move would enhance its political position. In response, the government imposed martial law. A subsequent government inquiry held JUI and the other religious forces responsible for the violence and even questioned the entire premise of the party’s demand for Pakistan to be turned into an Islamic state. Nevertheless, in March 1956, the country’s first constitution came into effect, formally enshrining Pakistan as an Islamic republic. Two and a half years later, however, the military seized power under Gen. Ayub Khan, who was determined to reverse the influence of the Deobandis and the growing broader religious sector. Khan would go on to decree a new constitution that laid the foundations of a secular modern state — one in which the Deobandis did not even achieve their minimalist goal of an advisory role.

The Deobandi movement went through another period of decline and transition during President Khan's reign. It was in the late 1960s, under Mufti Mahmud, a religious scholar-turned-politician from the Pashtun region of Dera Ismail Khan near the Afghan border, that JUI experienced a revival. After Khan allowed political parties to operate again in 1962, Mahmud became JUI's deputy leader. In truth, though, he was now the real mover and shaker of the Deobandi party, steering it toward alliances of convenience with secular parties. Pakistani historian Sayyid A.S. Pirzada, in his 2000 book "The Politics of the Jamiat Ulema-i-Islam Pakistan 1971-1977," goes into detail on how Mahmud transformed JUI from a religious movement seeking to influence politics into a full-fledged political party participating in electoral politics.

By the time Khan was forced out of office by popular unrest in 1969, socio-economic issues had replaced religion as the driving force shaping Pakistani politics. Another general, Yahya Khan, took over as president and again imposed martial law, abrogating the entire political system his predecessor had put together over an 11-year period. Yahya held general elections in 1970, marking the country’s first free and fair vote. Both secular and left-leaning, the country’s two major parties, the Awami League and Pakistan Peoples Party (PPP), came in first and second place in the vote for Parliament with 167 and 86 seats respectively, while JUI won only seven seats.

By now the west-east crisis that had been brewing since the earliest days of Pakistan’s independence was reaching a critical point. The Awami League won all of its seats in East Pakistan while the PPP won all of its seats in the west of the country. The military establishment, meanwhile, refused to transfer power to the Awami League. This caused full-scale public agitation in the east, which quickly turned into a brutal civil war that led to the creation of Bangladesh from what had been East Pakistan.

The war, which killed hundreds of thousands, had two major implications that would help the Deobandis regain much of the political space they had lost since the early 1950s. First, it seriously weakened the military's role in politics and allowed for the return of civilian rule. Second, it helped JUI and other religious parties to argue that only Islam could bind together different ethnic groups into a singular national fabric.

Within days of the defeat in the December 1971 war, Gen Yahya’s military government came to an end and PPP chief Zulfikar Ali Bhutto became president. In March 1972, JUI chief Mahmud became chief minister of North-Western Frontier Province (NWFP), leading a provincial coalition government with the left-wing Pashtun ethno-nationalist National Awami Party (NAP). The JUI also was a junior partner with NAP in Baluchistan’s provincial government. JUI’s stint in provincial power, however, was cut short when President Bhutto in 1973 dismissed the NAP-JUI cabinet in Baluchistan, accusing it of failing to control an ethno-nationalist insurgency in the province. In protest, the Mahmud-led government in NWFP resigned as well. The Deobandi party then turned its focus to ensuring that the constitution Bhutto’s PPP was crafting would be as much in keeping with its Islamist ideology as was possible.

Well aware that the masses overwhelmingly voted on the basis of bread-and-butter issues as opposed to religion, JUI sought to prevent the ruling and other socialist parties from producing a charter that would seriously limit its share of power. JUI and the country’s broader religious right were able to capitalize on the fact that Bhutto was seeking national consensus for a constitution, which would strengthen a civilian political order led by his ruling PPP. He was thus ready for a quid pro quo with the JUI and other Islamists — conceding on a number of their demands to Islamize the charter in order to establish a parliamentary form of government.

Consequently, Pakistan's current constitution, which went into effect in August 1973, declared Islam the state religion, made the Objectives Resolution the charter's preamble, established a Council of Islamic Ideology to ensure all laws were in keeping with the Quran and the Sunnah, and laid down criteria for who is a Muslim, among a host of other provisions. The following year, the Deobandis and the broader religious right won another major victory in the form of the second amendment, which declared Ahmadis non-Muslims.

By 1974, the government of PPP founder Zulfikar Ali Bhutto began to appropriate religion into its own politics. Over the next three years, nine parties with JUI in a lead role formed a coalition of Islamist, centrist and leftist factions in the form of the Pakistan National Alliance (PNA) to jointly contest the 1977 elections. The PNA campaign was trying to leverage the demand of the religious right to implement Nizam-i-Mustafa (System of Muhammad).

In an election marred by irregularities, the PPP won 155 seats while the opposition alliance took only 36. After three months of unrest in the wake of the results, Bhutto invited his opponents to negotiate; JUI chief Mahmud led the opposition in the talks. In an effort not to compromise politically, Bhutto sought to appease the Islamists culturally and moved to ban the sale and consumption of liquor, shut all bars, prohibit betting and replace Sunday with the Muslim holy day of Friday as the weekly sabbath. The negotiations were cut short when army chief Gen. Zia mounted a coup, ousting Bhutto and appropriating the Deobandi agenda of Islamization — all designed to roll back the civilianization of the state and restore the military's role in politics.

Zia’s moves to Islamize society top-down naturally resonated significantly with the religious right. From their point of view, Zia was the very opposite of the country’s first military dictator, Ayub Khan, who had been an existential threat to the entire ulema sector. The Deobandis, however, were caught between their opposition to a military dictatorship and the need to somehow benefit from Zia’s religious agenda. Although he was known for being a religious conservative, Zia was first and foremost a military officer. While the entire raison d’être of the Deobandi JUI was to establish an “Islamic” state, the Zia regime weaponized both the religion of Islam and the ideology of Islamism to gain support for what was essentially a military-dominated political order.

The JUI saw itself as heir to a thousand-year tradition of ulema trying to ensure that Muslim sovereigns in South Asia were ruling in accordance with their faith. Albeit late in the game, it was also a key player in creating Pakistan, and more importantly, worked to ensure that the country’s constitution was Islamic. But now Zia, who had assumed the presidency, had engaged in a hostile takeover of not just the state but the entire Deobandi business model. This explains why Mahmud opposed Zia’s putsch and kept demanding that he stick to his initial pledge of holding elections, which the general kept postponing. Zia’s primary objective was to reverse Bhutto’s efforts to establish civilian supremacy over the military.

By the time Zia banned political parties in October 1979, JUI was struggling to deal with a new autocratic political order that was stealing its thunder. It was also in a state of unprecedented decline. An internal rift had emerged within the party between those opposing Zia’s military regime and those seduced by his Islamization moves. A year later, Mahmud died of a heart attack. Mahmud’s son, a cleric-politician named Fazlur Rehman, was accepted as the new JUI chief by many of the leaders and members of the Deobandi party. But others opposed the hereditary transition. This led to a formal split in the party between Jamiat Ulema-i-Islam — Fazlur Rehman (JUI-F) and Jamiat Ulema-i-Islam Sami-ul-Haq (JUI-S), named after Sami-ul-Haq, a cleric whose madrassah Dar-ul-Uloom Haqqania would soon play a lead role in the rise of militant Deobandism. JUI-F continued to oppose Zia’s martial law regime while the splinter Deobandi faction, JUI-S, became a major supporter of the military government.

The same year that Zia was Islamizing his military regime, three major events shook the Muslim world: the Islamist-led revolution in Iran, the siege of Mecca by a group of messianic Salafists and the Soviet invasion of Afghanistan. These three developments would prove to be a watershed for the Deobandi movement. Deobandis formed a major component of the Afghan Islamist insurgent alliance fighting the Soviet-backed communist government. Many of the leaders of the Afghan insurgent factions like Mawlawi Yunus Khalis, Mohammed Nabi Mohammedi and Jalaluddin Haqqani were Deobandis. From the early 1980s onward, the two JUI factions were involved in dual projects: supporting the creation of an Islamic state in Afghanistan through armed insurrection and the Islamization of Pakistan (though they were divided over how to pursue the latter).

At the same time, Saudi Arabia supported the Afghan insurgency and began to step up its promotion of Wahhabism in Pakistan, partly as a response to the Mecca siege. The Deobandis benefited financially and ideologically from Riyadh’s support, leading to the emergence of new groups.

Already wary of how Iran’s clerical regime was exporting its brand of revolutionary Islamism, many Deobandis were influenced by the anti-Shia sectarianism embedded within their own discourse and now energized by proliferating Wahhabism. The Zia regime also had an interest in containing Iranian-inspired revolutionary ideas and supported anti-Shia Deobandi militant factions. In 1985, a group by the name of Sipah-e-Sahabah Pakistan was founded as a militant offshoot of JUI. It would later give way to Lashkar-e-Jhangvi, named after Haq Nawaz Jhangvi, a firebrand anti-Shia Deobandi cleric. Lashkar-e-Jhangvi remains notorious for horrific attacks targeting Pakistan’s Shia minority.

After a century as a religious-political movement, Deobandism grew increasingly militant in the 1980s. The anti-communist insurgency in Afghanistan and sectarian militancy in Pakistan were the two primary drivers increasingly steering many Deobandis toward armed insurrection. While the end of the Zia regime (with the dictator's death in a plane crash in the summer of 1988) brought back civilian rule to the country, Deobandism was hurtling along an ever more violent trajectory.

By the early 1990s the Pakistani military had retreated to influencing politics from behind the scenes and no longer pursued a domestic Islamization program. The die, however, had been cast. The extremist forces that Zia had unleashed were now on autopilot, and his civilian and military successors were unable to rein in their growth. Deobandi seminaries continued to proliferate in the country, especially in the Pashtun-dominated areas of the northwest.

In 1993, another militant Deobandi faction demanding the imposition of sharia law emerged in the country's northwest by the name of Tehrik-i-Nifaz-i-Shariat-i-Muhammadi (Movement to Implement the Shariah of Muhammad) — or TNSM — led by Sufi Muhammad, a mullah who had studied at the Panjpir seminary, which was unique in that its Deobandism was heavily Salafized.

The decade-long war in Afghanistan against the Soviets had significantly affected the Pakistani military and the country's premier spy service, the Inter-Services Intelligence (ISI) directorate, which was managing the Afghan, Pakistani and other Arab/Muslim foreign fighters. Many ISI officers had gone native with the militant Deobandi and Salafist ideologies of the proxies that they were managing. By the dawn of the 1990s, two unexpected geopolitical developments would accelerate the course of Deobandism toward militancy. First was the December 1991 implosion of the Soviet Union, which a few months later triggered the collapse of the Afghan communist regime. That in turn led to the 1992-96 intra-Islamist war in Afghanistan, which gave rise to the Taliban movement and its first emirate regime. Second was a popular Muslim separatist uprising that began in Indian-administered Kashmir in 1989.

In Afghanistan, Pakistan supported the Taliban, a movement founded by militant Deobandi clerics and students. The military also deployed Islamist insurgent groups in Indian-administered Kashmir, many of which were ideologically Deobandi. They included Harkat-ul-Mujahideen, Harakat-ul-Ansar and Jaish-e-Mohammed. Toward the late 1990s, when the Taliban were in power in Kabul and hosting al Qaeda, these groups constituted a singular transnational ideological battle space stretching from Afghanistan through India. This was most evident after militants hijacked an Indian Airlines flight from Nepal and landed it in Taliban-controlled Kandahar. There, the hijackers, enabled by the Pakistani-backed Taliban regime, negotiated with the Indian government for the release of Jaish-e-Mohammed founder Masood Azhar and two of his associates who had been imprisoned for terrorist activities in Kashmir.

After 9/11, the Pakistani security establishment lost control of its militant Deobandi nexus, which gravitated heavily toward al Qaeda, which had itself relocated to Pakistan. The U.S. toppling of the Taliban regime forced Islamabad into a situation in which it was trying to balance support for both Washington and the Afghan Taliban. Meanwhile, just days before the U.S. began its military operations against the Taliban in October 2001, Jaish-e-Mohammed operatives attacked the state legislature in Indian-administered Kashmir. This was followed by an even more brazen attack on the Indian Parliament in New Delhi on Dec. 13. The Pakistanis were now under pressure from both the Americans and the Indians. As a result, Islamabad clamped down on the Kashmiri militant outfits. The decision of Pakistan's then military ruler Gen. Pervez Musharraf to first side with the U.S. against the Taliban and then undertake an unprecedented normalization process with India led to Islamabad losing control over the Deobandi militant landscape. In fact, many of these groups would turn against the Pakistani state itself. There were several assassination attempts on Musharraf, including two back-to-back attacks carried out by rogue military officers within two weeks in December 2003. The radicalized Deobandis whom Pakistan cultivated as instruments of foreign policy in the '80s and '90s inverted the vector of jihad to target the very state that nurtured them.

Meanwhile, the country's main Deobandi political group, JUI-F, remained a force. In the 2002 elections, it led an alliance of six Islamist parties called the Mutahiddah Majlis-i-Amal (United Action Council or MMA) that won 60 seats in Parliament — in great part due to the electoral engineering of the country's fourth military regime. It also secured the most seats in the provincial legislature of the old Deobandi stronghold of NWFP, forming a majority government there and a coalition government with the pro-Musharraf ruling party in Baluchistan. The Deobandi-led MMA governments in both western provinces enabled the rise of Talibanization in the Pashtun regions along the border with Afghanistan. By the time the MMA government in the northwest completed its five-year term in late 2007, some 13 separate Pakistani Taliban factions had come together to form an insurgent alliance known as the Tehrik-i-Taliban Pakistan (TTP). The Deobandi-led government turned a blind eye to rising Talibanization, partly because it did not want to be seen as siding with the U.S. against fellow Islamists and partly because it feared being targeted by the jihadists. The latter fear was not unfounded given the TTP's repeated attempts to assassinate JUI leaders, including chief Fazlur Rehman and its Baluchistan supremo Muhammad Khan Sherani.

Militant Deobandism in the form of insurgents controlling territory and engaging in terrorist attacks all over the country would dominate the better part of the next decade. Taliban rebels seized control of large swaths of territory close to the Afghan border. The biggest example of this was the Taliban faction led by Mullah Fazlullah (the son-in-law of the TNSM founder), which took over NWFP’s large Swat district (as well as many parts of adjacent districts).

The Taliban had significant support even in the country's capital, as illustrated by the 2007 siege of the Red Mosque (Islamabad's oldest and most prominent house of worship). A group of militants led by the mosque's Deobandi imam and his brother had for nearly 18 months been challenging the writ of the state in the capital by engaging in violent protests, attacks on government property, kidnapping, arson and armed clashes with law enforcement agencies. An eight-day standoff came to an end when army special forces stormed the mosque-seminary complex, leading to a 96-hour gun battle with well-armed and trained militants during which at least 150 people (including many women and children) were killed.

The TTP greatly leveraged popular anger over the military operation against the mosque. It unleashed a barrage of suicide bombings targeting high-security military installations including an air weapons complex, a naval station, three regional headquarters of the ISI, special forces headquarters, the army's general headquarters and the military's main industrial complex, as well as many civilian targets, which resulted in tens of thousands of deaths. It took nearly a decade of massive counterinsurgency and counterterrorism operations to claw back provincial and tribal territories that had fallen under TTP control. By the late 2010s, Pakistan's security forces had forced Taliban rebels to relocate across the border into Afghanistan, where the U.S., after 15 years of unsuccessfully trying to weaken the Afghan Taliban movement, was in talks with it.

Washington had hoped that its 2020 peace agreement with the Afghan Taliban would lead to a political process that could limit the jihadist movement's influence after the U.S. departure. The dramatic collapse of the Afghan government in a little over a week in early August of this year, however, has left the Afghan Taliban as the only group capable of imposing its will on the country. The return of the Afghan Taliban to power in Afghanistan has a strong potential to energize like-minded forces in Pakistan, especially with the Islamic State having a significant cross-border presence and trying to assume the jihadist mantle from the Taliban. Further east, rising right-wing Hindu extremism in India, empowered by the current government of Prime Minister Narendra Modi, risks radicalizing Deobandism in its country of birth in response to the targeting of the country's 200 million-strong Muslim minority.

The Deobandis began in India as a movement seeking to reestablish a Muslim religio-political order in South Asia — one led by ulema. After 80 years of cultivating a religious intellectual vanguard and aligning with the majority Hindu community in secular nationalist politics to achieve independence from British colonial rule, a major chunk of Deoband embraced Muslim separatism. Once that goal was realized in the form of the independent nation-state of Pakistan, the locus of Deobandism shifted to Islamizing the new Muslim polity. For the next three decades, the Deobandis tried to turn a state that was intended to be secular into an Islamic republic through constitutional and electoral processes. The ascent of an Islamist-leaning military regime coupled with regional geopolitics at the tail end of the Cold War fragmented and militarized the Deobandi phenomenon whose locus yet again shifted westward. After the 1980s, the movement increasingly reverted to its British-era jihadist roots through terrorism and insurgency. That process has culminated in the Taliban’s empowerment in Afghanistan and now threatens to destabilize the entire South Asian region.

Today, Afghanistan represents the center of gravity of South Asia’s most prominent form of Islamism. The movement that has long sought to establish an “Islamic” state led by ulema subscribing to a medieval understanding of religion has established the polity that its ideological forefathers had set out to achieve over a century and a half ago.

As the Taliban consolidate their hold over Kabul with dangerous implications for the entire South Asian region, my mind wanders back 40 years to when my father — driven by his own sectarian persuasions — first made me aware of Deobandism. I am amazed at just how rapidly this phenomenon has grown before my eyes. Suffering from dementia for almost a decade and a half, my father has been oblivious of this proliferation. I actually don’t remember the last time either he or I broached this topic with the other. Perhaps it is for the best that he is unaware of the extent to which those whom he opposed all his life have gained ground. I know it would pain him to learn that Deobandism has even contaminated his own Barelvi sect, as is evident from the rise of the Tehrik-i-Labbaik Pakistan, which is now the latest and perhaps most potent Islamist extremist specter to haunt the country of his and my birth.

Tuesday 22 December 2020

Modi is reaching out. AMU has a chance to take Muslims away from path of confrontation

Despite its characteristic boast, Aligarh could not chart a path for modernity and progress of Indian Muslims after Independence. PM Modi's centenary address is an opportunity, opines NAJMUL HODA in The Print

 

Prime Minister Narendra Modi is going to address the centenary celebrations of the Aligarh Muslim University on 22 December. The event is online. If it wasn’t for Covid, he would likely be on campus. This is the first time since 1964 that a prime minister of India is going to address AMU. Fifty-six years is a long time, and except for Jawaharlal Nehru and Lal Bahadur Shastri, no other prime minister thought of visiting the university, even though Aligarh is only 120 km from the national capital, and AMU is a fully funded central university whose Visitor is the President of India.


The proposed address by PM Modi is a well-thought gesture full of symbolism. As he often emphasises, Modi is the prime minister of 130 crore Indians, which includes about 20 crore Muslims. He belongs to an ideological stream whose understanding of history is much different from how Muslims would look at India’s past and their role in it. Therefore, their idea of the present and the vision for the future remain equally contested. 

Past without a closure

The Aligarh Muslim University has been a part of that contentious history, and must face up to its history in the spirit of truth and reconciliation. Sir Syed Ahmad Khan, the founder of Muhammadan Anglo-Oriental (MAO) College, the institution which became AMU in 1920, is alleged to have propounded the two-nation theory. It’s another matter that if his successors had not totally buried his social and political ideas, alongside his religious thought, there was much in his speeches and writings to place him among the founders of composite nationalism.

It was the Muhammadan Educational Conference, the vehicle of the Aligarh Movement, which doubled up as the founding session of the Muslim League at Dhaka in 1906. MAO College hosted the League till 1912 before its headquarters were shifted to Lucknow. The politics of Muslim separatism was institutionalised in Aligarh, which, by the 1940s, had become, in Jinnah's words, "the arsenal of Muslim India". Later, poet Jaun Elia would quip that Pakistan was a prank played by the juveniles of Aligarh ("Pakistan — ye sab Aligarh ke laundoń ki shararat thi").

That this practical joke, by its sheer thoughtless adventurism, turned out to be a monumental tragedy, which sundered the country into two and the Muslim community into three, is yet to be confronted by Aligarh. The inability to confront its past, and the ruse of feigning amnesia in this regard, has also led to the collateral forgetting of nationalist and progressive streams, which though not dominant, were nonetheless quite robust strands. 

That this could happen despite the fact that AMU is endowed with a Centre of Advanced Study in History is even more surprising. History has to be written, no matter what one's methodology, analytical tools, philosophical inclination and ideological orientation may be. Developing an Akbar-Aurangzeb centric school of history may be a noble endeavour, and nobler may be the zeal to argue how secular were the Muslim rulers — Aurangzeb being the most secular of them all — but expatiating on the secular nationalism of people like Zakir Hussain and Mohammad Habib, in the face of frenzied communalism, would be much better if one didn't fight shy of calling the mainstream Muslim communalism on the campus by its name.

This could not happen because, post-Partition, Aligarh was reassured and rehabilitated, but not reformed. It reflects the high-mindedness of independent India’s leadership that they protected and preserved AMU even as most of its faculty and students deserted it for the greener pastures of their conquest, Pakistan. An un-critiqued and un-reformed Aligarh would continue to inhabit the same narrative as before. Thus, despite its characteristic boast, Aligarh could not chart a path for the modernity and progress of Indian Muslims, or for their integration and mainstreaming in national life. The emotional chasm between the two communities kept widening despite the increasing commingling of people. And so, instead of giving intellectual leadership to the Muslim community, to which AMU considered itself traditionally entitled, and for which it is statutorily mandated by its founding Act, Aligarh chose the regressive path.

On such politically momentous issues as Shah Bano and Triple Talaq, despite having the material wherewithal to come up with its own progressive position, Aligarh, in its intellectual sterility, toed the line of the ulema and the reactionary Muslim Personal Law Board, the very people against whose thought the university was founded. It became complicit in the cultural regression and political alienation of the Muslim community, and could not intervene when a second separatist movement got underway in the name of identity. On the question of the Babri Masjid, the Aligarh academia adopted the Leftist line of limited technical correctness, oblivious to the fact that the issue had far deeper implications for Muslims than the Leftist arguments could address.

A chance for reconciliation

Now that Prime Minister Modi, staunch in his own ideological position, is going to address AMU in a grand gesture of reaching out, its symbolism should not be lost on anyone. More so, as this moment comes not very long after the physical assaults and vicious propaganda against the university by Right-wing groups that swear allegiance to the Modi government. Stigmatising AMU over Jinnah’s portrait, which hung, among many others, in the student union building because he was an honorary member of the union, was malicious, even as the arguments against removing it were too untenable to be sustained, too nuanced to be understood and too disingenuous to be given any credence.

Be that as it may, and notwithstanding every imaginable criticism of AMU, the fact cannot be ignored that this campus has the highest concentration of modern, educated Muslims anywhere in the world. So, even if it cannot realistically boast of being the intellectual vanguard of Indian Muslims, in sheer quantitative terms and in view of its historical legacy it has an unsurpassable symbolic value for the Muslim community.

Now that the prime minister is reaching out, should Aligarh, on behalf of the Muslims, not grasp the extended hand of friendship and reconciliation? It is for Aligarh to decide whether to wean Muslims away from the path of confrontation, which, if not shunned, is bound to bring an unimaginable catastrophe, and to put them on the path of conciliation, as Sir Syed did with the British.

An alumnus of AMU, Mukhtar Masood, mentions an anecdote in his book Awaz-e Dost, wherein some time after Independence, while addressing the governor of Uttar Pradesh, Sarojini Naidu, in the Union Hall, a student said, “Ya to hum aapke bade dushman hain ya chhote bhai hain (we are either your big enemy or little brother).” The reality is simpler than this. Muslims are neither. They are equal as Indians and citizens. Let this opportunity not be missed.

Friday 21 April 2017

Online political advertising is a black box and democracy should be worried

Jasper Jackson in The Guardian


As your mind wearily contemplates being exposed to yet another political campaign, are your dreams haunted by battle buses, billboards and TV debates? Or is it Facebook, YouTube, Twitter and Google?

On the evidence of last year’s EU referendum, much of the campaigning, and much of the money spent on political advertising, will be online. And it will happen in a way that will be largely hidden from scrutiny by either the public or regulators.

During the referendum, Vote Leave spent £2.7m with one small Canadian digital marketing firm that specialises in political campaigns – Aggregate IQ. The sum was well over a third of Vote Leave’s total budget.

Two other campaign groups – both of which received large donations from the Leave campaign – gave Aggregate IQ a further £765,000, taking the total pumped through the company to almost £3.5m. Vote Leave director Dominic Cummings is quoted on the company’s website saying: “We couldn’t have done it without them.”

Yet the invoices for the money paid to Aggregate IQ, which were handed to the Electoral Commission, list vague, jargon-filled specifications with little indication of how the ads were delivered. They may tell us Aggregate IQ was running a “targeted video app installed and display media campaign”, but they give no clue about where those ads appeared or who saw them. Did most of the money go on Facebook or YouTube? Did they spend more money on reaching under-45s in Hull or pensioners in Canterbury? There’s no way of knowing, not least because the Electoral Commission doesn’t ask for the information.

Meanwhile Cambridge Analytica, the digital targeting experts part-owned by US billionaire Robert Mercer, were credited with super-charging the Leave.EU campaign, even getting a mention in a book about the campaign by its chief funder Arron Banks. Yet according to filings with the Electoral Commission there was no paid relationship with the firm at all. The Electoral Commission is currently investigating, as is the Information Commissioner’s Office over the company’s use of data.

These two companies promise to sway the electorate using high-tech targeting of voters, yet not only does the Electoral Commission have little idea of how the money is being spent, but many of the different messages those campaigns show to chosen sets of targets are hidden from the rest of us.

An ad in a newspaper or magazine, a billboard or tube poster, can be seen by anybody who happens to come across it. Such ads are targeted in a blunt way, by location, readership and so on, but who they are appealing to, the messages used and the money spent are clear for all to see.

But online, ads are directed at far more specific target groups, and shown only to them. Suspect someone is a bit racist? Show them pictures of dark-skinned migrants lining up at a border. Know someone regularly visits Spain? Emphasise how much longer it will take to go through airport security.

Just as importantly, you can make sure that you don’t show the wrong ads to the wrong people. The racist dog whistle doesn’t get pushed at people likely to be from, or comfortable with, ethnic minorities. The lengthy customs checks don’t get shown to those with an all-consuming fear of terror attacks.

Of course, people will see ads that aren’t aimed at them online – the targeting is far from perfect – but the digital world allows paid-for political campaigning to split into numerous conversations that rarely overlap.
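
To make that segmentation logic concrete, here is a minimal, purely illustrative sketch (in Python) of how a campaign tool might route different messages to different inferred audiences while withholding them from everyone else. It is not based on any real ad platform or campaign's systems; the profile fields, rules and ad copy are invented for the example.

```python
# Illustrative only: a toy model of segmented ad delivery.
# The profile attributes, rules and ad copy are hypothetical,
# not drawn from any real campaign or advertising platform.

from dataclasses import dataclass
from typing import Optional


@dataclass
class VoterProfile:
    # Attributes a targeting platform might infer about a user (assumed for illustration).
    worried_about_immigration: bool = False
    frequent_traveller: bool = False
    minority_background: bool = False


def choose_ad(profile: VoterProfile) -> Optional[str]:
    """Return the message variant this profile would be shown, or None if no ad is served."""
    # Negative targeting: suppress the immigration message for audiences it might repel.
    if profile.worried_about_immigration and not profile.minority_background:
        return "VARIANT_A: border-queue imagery"
    if profile.frequent_traveller:
        return "VARIANT_B: longer airport security checks"
    return None  # this voter never sees either message, and never knows it exists


# Two profiles, two disjoint 'conversations': neither audience sees the other's ad,
# and nothing obliges the advertiser to publish either variant for public scrutiny.
print(choose_ad(VoterProfile(worried_about_immigration=True)))
print(choose_ad(VoterProfile(frequent_traveller=True)))
```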

This combination of digital marketing firms that are required to reveal little about what they do, and digital ads that are different for each segment of the population, makes political advertising online opaque in a way traditional ads were not.

And the approach seems to work. A more sophisticated digital strategy is regularly cited by Cummings and other Leave campaigners as an example of how they outsmarted Remain. If you were planning how to win June’s election, you’d be mad not to pay close attention to how they did it, and do your best to replicate it. And that means that, as we approach yet another nationwide vote, it will be harder than ever to see what impact money, and the political advertising it pays for, is having on the result.

Thursday 20 October 2016

The cult of the expert – and how it collapsed

Led by a class of omnipotent central bankers, experts have gained extraordinary political power. Will a populist backlash shatter their technocratic dream?

Sebastian Mallaby in The Guardian

On Tuesday 16 September 2008, early in the afternoon, a self-effacing professor with a neatly clipped beard sat with the president in the Roosevelt Room of the White House. Flanked by a square-shouldered banker who had recently run Goldman Sachs, the professor was there to tell the elected leader of the world’s most powerful country how to rescue its economy. Following the bankruptcy of one of the nation’s storied investment banks, a global insurance company was now on the brink, but drawing on a lifetime of scholarly research, the professor had resolved to commit $85bn of public funds to stabilising it.

The sum involved was extraordinary: $85bn was more than the US Congress spent annually on transportation, and nearly three times as much as it spent on fighting Aids, a particular priority of the president’s. But the professor encountered no resistance. “Sometimes you have to make the tough decisions,” the president reflected. “If you think this has to be done, you have my blessing.”

Later that same afternoon, Federal Reserve chairman Ben Bernanke, the bearded hero of this tale, showed up on Capitol Hill, at the other end of Pennsylvania Avenue. At the White House, he had at least been on familiar ground: he had spent eight months working there. But now Bernanke appeared in the Senate majority leader’s conference room, where he and his ex-Wall Street comrade, Treasury secretary Hank Paulson, would meet the senior leaders of both chambers of Congress. A quiet, balding, unassuming technocrat confronted the lions of the legislative branch, armed with nothing but his expertise in monetary plumbing.

Bernanke repeated his plan to commit $85bn of public money to the takeover of an insurance company.

“Do you have 85bn?” one sceptical lawmaker demanded.

“I have 800bn,” Bernanke replied evenly – a central bank could conjure as much money as it deemed necessary.

But did the Federal Reserve have the legal right to take this sort of action unilaterally, another lawmaker inquired?

Yes, Bernanke answered: as Fed chairman, he wielded the largest chequebook in the world – and the only counter-signatures required would come from other Fed experts, who were no more elected or accountable than he was. Somehow America’s famous apparatus of democratic checks and balances did not apply to the monetary priesthood. Their authority derived from technocratic virtuosity.

When the history is written of the revolt against experts, September 2008 will be seen as a milestone. The $85bn rescue of the American International Group (AIG) dramatised the power of monetary gurus in all its anti-democratic majesty. The president and Congress could decide to borrow money, or raise it from taxpayers; the Fed could simply create it. And once the AIG rescue had legitimised the broadest possible use of this privilege, the Fed exploited it unflinchingly. Over the course of 2009, it injected a trillion dollars into the economy – a sum equivalent to nearly 30% of the federal budget – via its newly improvised policy of “quantitative easing”. Time magazine anointed Bernanke its person of the year. “The decisions he has made, and those he has yet to make, will shape the path of our prosperity, the direction of our politics and our relationship to the world,” the magazine declared admiringly.

The Fed’s swashbuckling example galvanized central bankers in all the big economies. Soon Europe saw the rise of its own path-shaping monetary chieftain, when Mario Draghi, president of the European Central Bank, defused panic in the eurozone in July 2012 with two magical sentences. “Within our mandate, the ECB is ready to do whatever it takes to preserve the euro,” he vowed, adding, with a twist of Clint Eastwood menace, “And believe me, it will be enough.” For months, Europe’s elected leaders had waffled ineffectually, inviting hedge-fund speculators to test the cohesion of the eurozone. But now Draghi was announcing that he was badder than the baddest hedge-fund goon. Whatever it takes. Believe me.

In the summer of 2013, when Hollywood rolled out its latest Superman film, cartoonists quickly seized upon a gag that would soon become obvious. Caricatures depicted central-bank chieftains decked out in Superman outfits. One showed Bernanke ripping off his banker’s shirt and tie, exposing that thrilling S emblazoned on his vest. Another showed the bearded hero hurtling through space, red cape fluttering, right arm stretched forward, a powerful fist punching at the void in front of him. “Superman and Federal Reserve chairman Ben Bernanke are both mild-mannered,” a financial columnist deadpanned. “They are both calm, even in the face of global disasters. They are both sometimes said to be from other planets.”

At some point towards the middle of the decade, shortly before the cult of the expert smashed into the populist backlash, the shocking power of central banks came to feel normal. Nobody blinked an eye when Haruhiko Kuroda, the head of Japan’s central bank, created money at a rate that made his western counterparts seem timid. Nobody thought it strange when Britain’s government, perhaps emulating the style of the national football team, conducted a worldwide talent search for the new Bank of England chief. Nobody was surprised when the winner of that contest, the telegenic Canadian Mark Carney, quickly appeared in newspaper cartoons in his own superman outfit. And nobody missed a beat when India’s breathless journalists described Raghuram Rajan, the new head of the Reserve Bank of India, as a “rock star”, or when he was pictured as James Bond in the country’s biggest business newspaper. “Clearly I am not a superman,” Rajan modestly responded.



If Bernanke’s laconic “I have 800bn” moment signalled a new era of central-banking power, Rajan’s “I am not a superman” wisecrack marked its apotheosis. And it was a high watermark for a wider phenomenon as well, for the cult of the central banker was only the most pronounced example of a broader cult that had taken shape over the previous quarter of a century: the cult of the expert. Even before Bernanke rescued the global economy, technocrats of all stripes – business leaders, scientists, foreign and domestic policy wonks – were enthralled by the notion that politicians might defer to the authority of experts armed with facts and rational analysis. Those moments when Bernanke faced down Congress, or when Draghi succeeded where bickering politicians had failed, made it seem possible that this technocratic vision, with its apolitical ideal of government, might actually be realised.

The key to the power of the central bankers – and the envy of all the other experts – lay precisely in their ability to escape political interference. Democratically elected leaders had given them a mission – to vanquish inflation – and then let them get on with it. To public-health experts, climate scientists and other members of the knowledge elite, this was the model of how things should be done. Experts had built Microsoft. Experts were sequencing the genome. Experts were laying fibre-optic cable beneath the great oceans. No senator would have his child’s surgery performed by an amateur. So why would he not entrust experts with the economy?

In 1997, the economist Alan Blinder published an essay in Foreign Affairs, the house journal of the American foreign policy establishment. His title posed a curious question: “Is government too political?”

Four years earlier, Blinder had left Princeton University, his academic home for two decades, to do battle in the public square as a member of President Bill Clinton’s Council of Economic Advisers. The way Blinder saw things, this was a responsibility more than a pleasure: experts had a duty to engage in public debates – otherwise, “the quacks would continue to dominate the pond”, as he had once written. Earnest, idealistic, but with a self-deprecating wit, Blinder was out to save the world from returning to that dark period in the Reagan era when supply-side ideologues ruled the roost and “nonsense was worshipped as gospel”. After two years at the White House and another two as vice chairman of the Fed, Blinder wrote the essay as a reflection on his years of service.

His argument reflected the contrast between his two jobs in Washington. At the White House, he had advised a brainy president on budget policy and much else, but turning policy wisdom into law had often proved impossible. Even when experts from both parties agreed what should be done, vested interests in Congress conspired to frustrate enlightened progress. At the Fed, by contrast, experts were gloriously empowered. They could debate the minutiae of the economy among themselves, then manoeuvre the growth rate this way or that, without deferring to anyone.

To Blinder, it was self-evident that the Fed model was superior – not only for the experts, but also in the eyes of the public. The voters did not want their members of Congress micromanaging technical affairs – polls showed declining trust in politicians, and it was only a small stretch to suggest that citizens wanted their political leaders to delegate as much as possible to experts. “Americans increasingly believe that their elected officials are playing games rather than solving problems,” Blinder wrote. “Political debate has too much ‘spin’ and too little straight talk.” In sum, too much meddling by elected politicians was a turn-off for the voters who elected them. It was a paradoxical contention.

Disaffection with the political mainstream in the America of the 1990s had created a yearning for white-hatted outsiders as potential presidential candidates: the billionaire businessman Ross Perot, who ran in 1992 and 1996; the anti-politician, Steve Forbes, whose signature proposal was to radically simplify America’s byzantine tax code. But rather than replace politicians with populist outsiders, whose grasp of public policy was suspect, Blinder advanced an alternative idea: the central-bank model of expert empowerment should be extended to other spheres of governance.

Blinder’s proposal was most clearly illustrated by tax policy. Experts from both political parties agreed that the tax system should be stripped of perverse incentives and loopholes. There was no compelling reason, for example, to encourage companies to finance themselves with debt rather than equity, yet the tax code allowed companies to make interest payments to their creditors tax-free, whereas dividend payments to shareholders were taxed twice over. The nation would be better off if Congress left the experts to fix such glitches rather than allowing politics to frustrate progress. Likewise, environmental targets, which balanced economic growth on the one hand and planetary preservation on the other, were surely best left to the scholars who understood how best to reconcile these duelling imperatives. Politicians who spent more of their time dialing for dollars than thinking carefully about policy were not up to these tasks. Better to hand them off to the technicians in white coats who knew what they were doing.



The call to empower experts, and to keep politics to a minimum, failed to trigger a clear shift in how Washington did business. But it did crystallise the assumptions of the late 1990s and early 2000s – a time when sharp criticisms of gridlock and lobbying were broadly accepted, and technocratic work-arounds to political paralysis were frequently proposed, even if seldom adopted. President Barack Obama’s (unsuccessful) attempt to remove the task of tackling long-term budget challenges from Congress by handing them off to the bipartisan Simpson-Bowles commission was emblematic of this same mood. Equally, elected leaders at least paid lip service to the authority of experts in the government’s various regulatory agencies – the Food and Drug Administration, the Securities and Exchange Commission, and so on. If they nonetheless overruled them for political reasons, it was in the dead of night and with a guilty conscience.

And so, by the turn of the 21st century, a new elite consensus had emerged: democracy had to be managed. The will of the people had its place, but that place had to be defined, and not in an expansive fashion. After all, Bill Clinton and Tony Blair, the two most successful political leaders of the time, had proclaimed their allegiance to a “third way”, which proposed that the grand ideological disputes of the cold war had come to an end. If the clashes of abstractions – communism, socialism, capitalism and so on – were finished, all that remained were practical questions, which were less subjects of political choice and more objects of expert analysis. Indeed, at some tacit, unarticulated level, a dark question lurked in educated minds. If all the isms were wasms, if history was over, what good were politicians?

 

Federal Reserve chairman Ben Bernanke testifies before Congress in October 2011. Photograph: Jim Lo Scalzo/EPA

For Blinder and many of his contemporaries, the ultimate embodiment of empowered gurudom was Alan Greenspan, the lugubrious figure with a meandering syntax who presided over the Federal Reserve for almost two decades. Greenspan was a technocrat’s technocrat, a walking, talking cauldron of statistics and factoids, and even though his ideological roots were in the libertarian right, his happy collaboration with Democratic experts in the Clinton administration fitted the end-of-history template perfectly. At Greenspan’s retirement in 2006, Blinder and a co-author summed up his extraordinary standing. They proclaimed him “a living legend”. On Wall Street, “financial markets now view Chairman Greenspan’s infallibility more or less as the Chinese once viewed Chairman Mao’s”.

Greenspan was raised during the Great Depression, and for much of his career, such adulation would have been inconceivable – for him or any central banker. Through most of the 20th century, the men who acted as bankers to the bankers were deliberately low-key. They spurned public attention and doubted their own influence. They fully expected that politicians would bully them into trying to stimulate the economy, even at the risk of inflation. In 1964, in a successful effort to get the central bank to cut interest rates, Lyndon Johnson summoned the Fed chairman William McChesney Martin to his Texas ranch and pushed him around the living room, yelling in his face, “Boys are dying in Vietnam, and Bill Martin doesn’t care!” In democracies, evidently, technocratic power had limits.

Through the 1970s and into the 1980s, central-bank experts continued to be tormented. Richard Nixon and his henchmen once smeared Arthur Burns, the Fed chairman, by planting a fictitious story in the press, insinuating that Burns was simultaneously demanding a huge pay rise for himself and a pay freeze for other Americans. Following in this tradition, the Reagan administration frequently denounced the Fed chief, Paul Volcker, and packed the Fed’s board with pro-Reagan loyalists, who ganged up against their chairman.



When Greenspan replaced Volcker in 1987, the same pattern continued at first. The George HW Bush administration tried everything it could to force Greenspan to cut interest rates, to the point that a White House official put it about that the unmarried, 65-year-old Fed chairman reminded him of Norman Bates, the mother-fixated loner in Hitchcock’s Psycho.

And yet, starting with the advent of the Clinton administration, Greenspan effected a magical shift in the prestige of monetary experts. For the last 13 years of his tenure, running from 1993 to 2006, he attained the legendary status that Blinder recognised and celebrated. There were Alan Greenspan postcards, Alan Greenspan cartoons, Alan Greenspan T-shirts, even an Alan Greenspan doll. “How many central bankers does it take to screw in a lightbulb?” asked a joke of the time. “One,” the answer went: “Greenspan holds the bulb and the world revolves around him.” Through quiet force of intellect, Greenspan seemed to control the American economy with the finesse of a master conductor. He was the “Maestro”, one biographer suggested. The New Yorker’s John Cassidy wrote that Greenspan’s oracular pronouncements became “as familiar and as comforting to ordinary Americans as Prozac and The Simpsons, both of which debuted in 1987, the same year President Reagan appointed him to office”.

Greenspan’s sway in Washington stretched far beyond the Fed’s core responsibility, which was to set interest rates. When the Clinton administration wanted to know how much deficit reduction was necessary, it asked Greenspan for a number, at which point that number assumed a talismanic importance, for no other reason than that Greenspan had endorsed it. When Congress wanted to understand how far deficit reduction would bring bond yields down, it demanded an answer from Greenspan, and his answer duly became a key plank of the case for moving towards budget balance. The Clinton adviser Dick Morris summed up economic policy in this period: “You figure out what Greenspan wants, and then you get it to him.”

Greenspan loomed equally large in the US government’s management of a series of emerging market meltdowns in the 1990s. Formally, the responsibility for responding to foreign crises fell mainly to the Treasury, but the Clinton team relied on Greenspan – for ideas and for political backing. With the Republicans controlling Congress, a Democratic president needed a Republican economist to vouch for his plans – to the press, Congress, and even the conservative talk radio host Rush Limbaugh. “Officials at the notoriously reticent Federal Reserve say they have seldom seen anything like it,” the New York Times reported in January 1995, remarking on the Fed chairman’s metamorphosis from monetary technocrat into rescue salesman. In 1999, anticipating the moment when it anointed Ben Bernanke its man of the year, Time put Greenspan on its cover, with smaller images of the Treasury secretary and deputy Treasury secretary flanking him. Greenspan and his sidemen were “economist heroes”, Time lectured its readers. They had “outgrown ideology”.

By the last years of his tenure, Greenspan’s reputation had risen so high that even fellow experts were afraid of him. When he held forth at the regular gatherings of central bank chiefs in Basel, the distinguished figures at the table, titans in their own fields, took notes with the eagerness of undergraduates. So great was Greenspan’s status that he started to seem irreplaceable. As vice-president Al Gore prepared his run for the White House, he pronounced himself Greenspan’s “biggest fan” and rated the chairman’s performance as “outstanding A-plus-plus”. Not to be outdone, the Republican senator John McCain wished the chairman could stay at his post into the afterlife. “I would do like we did in the movie Weekend at Bernie’s,” McCain joked during a Republican presidential primary debate. “I’d prop him up and put a pair of dark glasses on him and keep him as long as we could.”

How did Greenspan achieve this legendary status, creating the template for expert empowerment on which a generation of technocrats sought to build a new philosophy of anti-politics? The question is not merely of historical interest. With experts now in retreat, in the United States, Britain and elsewhere, the story of their rise may hold lessons for the future.

Part of the answer lies in the circumstances that Greenspan inherited. In the United States and elsewhere, central bankers were given space to determine interest rates without political meddling because the existing model had failed. The bullying of central banks by Johnson and Nixon produced the disastrous inflation of the 1970s, with the result that later politicians wanted to be saved from themselves – they stopped harassing central banks, understanding that doing so damaged economic performance and therefore their own reputations. Paul Volcker was a partial beneficiary of this switch: even though some Reagan officials attacked him, others recognised that he must be given the space to drive down inflation. Following Volcker’s tenure, a series of countries, starting with New Zealand, granted formal independence to their central banks. Britain crossed this Rubicon in 1997. In the United States, the Fed’s independence has never been formal. But the climate of opinion on monetary issues offered a measure of protection.

Healthy economic growth was another factor underpinning Greenspan’s exalted status. Globalisation, coupled with the surge of productivity that followed the personal computer revolution, made the 1990s a boom time. The pro-market policies that Greenspan and his fellow experts had long advocated seemed to be delivering the goods, not only in terms of growth but also in falling inequality, lower rates of crime, and lower unemployment for disadvantaged minorities. The legitimacy of experts relies on their presumed ability to deliver progress. In Greenspan’s heyday, experts over-delivered.

Yet these fortunate circumstances are not the whole story. Greenspan amassed more influence and reputation than anyone else because there was something special about him. He was not the sort of expert who wanted to confine politics to its box. To the contrary, he embraced politics, and loved the game. He understood power, and was not afraid to wield it.



Greenspan is regarded as the ultimate geek: obsessed with obscure numbers, convoluted in his speech, awkward in social settings. Yet he was far more worldly than his technocratic manner suggested. He entered public life when he worked for Nixon’s 1968 campaign – not just as an economic adviser, but as a polling analyst. In Nixon’s war room, he allied himself with the future populist presidential candidate Patrick Buchanan, and his memos to Nixon were peppered with ideas on campaign spin and messaging. In 1971, when Nixon went after the Fed chairman, Arthur Burns, Greenspan was recruited to coax Burns into supporting the president. In the mid-1970s, when Greenspan worked in the Gerald Ford administration, he once sneaked into the White House on a weekend to help rewrite a presidential speech, burying an earlier draft penned by a bureaucratic opponent. At the Republican convention in 1980, Greenspan tried to manoeuvre Ford on to Ronald Reagan’s ticket – an outlandish project to get an ex-president to serve as vice president.

Greenspan’s genius was to combine high-calibre expert analysis with raw political methods. He had more muscle than a mere expert and more influence than a mere politician. The combination was especially potent because the first could be a cover for the second: his political influence depended on the perception that he was an expert, and therefore above the fray, and therefore not really political. Unlike politician-politicians, Greenspan’s advice had the ring of objectivity: he was the man who knew the details of the federal budget, the outlook for Wall Street, the political tides as they revealed themselves through polling data. The more complex the problems confronting the president, the more indispensable Greenspan’s expertise became. “He has the best bedside manner I’ve ever seen,” a jealous Ford administration colleague recalled, remarking on Greenspan’s hypnotic effect on his boss. “Extraordinary. That was his favourite word. He’d go in to see Ford and say, ‘Mr President, this is an extraordinarily complex problem.’ And Ford’s eyes would get big and round and start to go around in circles.”

By the time Greenspan became Fed chairman, he was a master of the dark arts of Washington. He went to extraordinary lengths to cultivate allies, fighting through his natural shyness to attend A-list parties, playing tennis with potentially troublesome financial lobbyists, maintaining his contacts on Wall Street, building up his capital by giving valuable counsel to anyone who mattered. Drawing on the advantage of his dual persona, Greenspan offered economic advice to politicians and political advice to economists. When Laura Tyson, an exuberant Berkeley economist, was appointed to chair Bill Clinton’s Council of Economic Advisers, she was flattered to find that the Fed chairman had tips on her speaking style. Too many hand gestures and facial expressions could undermine her credibility, Greenspan observed. The CEA chairwoman should simply present facts, with as little visual commentary as possible.

Greenspan’s critics frequently complained that he was undermining the independence of the Fed by cosying up to politicians. But the critics were 180 degrees wrong: only by building political capital could Greenspan protect the Fed’s prerogatives. Clinton had no natural love for Greenspan: he would sometimes entertain his advisers with a cruel imitation of him – a cheerless old man droning on about inflation. But after a landmark 1993 budget deal and a 1995 bailout of Mexico, Clinton became a firm supporter of the Fed. Greenspan had proved that he had clout. Clinton wanted to be on the right side of him.

The contrast with Greenspan’s predecessor, the rumpled, egg-headed Paul Volcker, is revealing. Volcker lacked Greenspan’s political skills, which is why the Reagan administration succeeded in packing his board with governors who were ready to outvote him. When Greenspan faced a similar prospect, he had the muscle to fight back: in at least one instance, he let his allies in the Senate know that they should block the president’s candidate. Volcker also lacked Greenspan’s facility in dealing with the press – he refused to court public approval and sometimes pretended not to notice a journalist who had been shown into his office to interview him. Greenspan inhabited the opposite extreme: he courted journalists assiduously, opening presents each Christmas at the home of the Wall Street Journal’s Washington bureau chief, Al Hunt, flattering reporters with private interviews even as he berated other Fed governors for leaking to them. It was only fitting that, halfway through his tenure, Greenspan married a journalist whose source he had once been.

The upshot was that Greenspan maximised a form of power that is invaluable to experts. Because journalists admired him, it was dangerous for politicians to pick a fight with the Fed: in any public dispute, the newspaper columnists and talking heads would take Greenspan’s side of the argument. As a result, the long tradition of Fed-bashing ceased almost completely. Every Washington insider understood that Greenspan was too powerful to touch. People who got on the wrong side of him would find their career prospects dim. They would see their intellectual shortcomings exposed. They would find themselves diminished.


 
Mark Carney, the governor of the Bank of England, in 2015. Photograph: Jonathan Brady/AFP/Getty Images

Of course, the triumph of the expert was bound to be fragile. In democracies, the will of the people can be sidelined only for so long, and 2016 has brought the whirlwind. The Brexit referendum featured Michael Gove’s infamous assertion that “the British people have had enough of experts”. Since the vote, Mark Carney, the Bank of England governor once pictured as superman, has been accused by the government of running dubious monetary experiments that exacerbate inequality – an attack picked up by William Hague, who this week threatened the central bank with the loss of its independence unless it raised interest rates. In the United States, Donald Trump has ripped into intellectuals of all stripes, charging Fed chair Janet Yellen with maintaining a dangerously loose monetary policy in order to help Obama’s poll ratings.




Both Gove and Trump sensed, correctly, that experts were primed for a fall. The inflationary catastrophe sparked by 1970s populism has faded from the public memory, and no longer serves as a cautionary tale. Economies have recovered disappointingly from the 2008 crash – a crash, incidentally, for which Greenspan must share the blame, since he presided over the inflation of the subprime mortgage bubble. What little growth there has been has also passed most people by, since the spoils have been so unequally distributed. If the experts’ legitimacy depends on delivering results, it is hardly surprising that they are on the defensive.

And yet the history of the rise of the experts should remind us of three things. First, the pendulum will swing back, just as it did after the 1970s. The saving grace of anti-expert populists is that they do discredit themselves, simply because policies originating from the gut tend to be lousy. If Donald Trump were to be elected, he would almost certainly cure voters of populism for decades, though the price in the meantime could be frightening. In Britain, which is sliding towards a wreck of a divorce with its most important trading partners, the delusions and confusions of the Brexit camp will probably exact an economic price that will be remembered for a generation.

Second, Alan Blinder had a point: democratic politics is prone to errors and gridlock, and there is much to be said for empowering technocrats. The right balance between democratic accountability and expert input is not impossible to strike: the model of an independent central bank does provide a template. Popularly elected politicians have a mandate to determine the priorities and ambitions of the state, which in turn determine the goals for expert bodies – whether these are central banks, environmental agencies, or the armed forces. But then it behooves the politicians to step back. Democracy is strengthened, not weakened, when it harnesses experts.

Third, however, if the experts want to hasten their comeback, they must study the example of Greenspan’s politicking. It is no use thinking that, in a democracy, facts and analysis are enough to win the day. As the advertising entrepreneur John Kearon has argued, the public has to feel you are correct; the truth has to be sold as well as told; you have to capture the high ground with a brand that is more emotionally compelling than that of your opponents. In this process, as Greenspan’s career demonstrates, the media must be wooed. Enemies must be undermined. And, if you succeed, your face might just appear on a T-shirt.

Two decades ago, in his final and posthumous book, the American cultural critic Christopher Lasch went after contemporary experts. “Elites, who define the issues, have lost touch with the people,” he wrote. “There has always been a privileged class, even in America, but it has never been so dangerously isolated from its surroundings.” These criticisms presciently anticipated the rise of Davos Man – the rootless cosmopolitan elite, unburdened by any sense of obligation to a place of origin, its arrogance enhanced by the conviction that its privilege reflects brains and accomplishment, not luck and inheritance. To survive these inevitable resentments, elites will have to understand that they are not beyond politics – and they will have to demonstrate the skill to earn the public trust, and preserve it by deserving it. Given the alternative, we had better hope that they are up to it.