
Friday, 16 December 2016

How Google's search algorithm spreads false information with a rightwing bias

Olivia Solon and Sam Levin in The Guardian


Google’s search algorithm appears to be systematically promoting information that is either false or slanted with an extreme rightwing bias on subjects as varied as climate change and homosexuality.


Following a recent investigation by the Observer, which found that Google’s search engine prominently suggests neo-Nazi websites and antisemitic writing, the Guardian has uncovered a dozen additional examples of biased search results.

Google’s search algorithm and its autocomplete function prioritize websites that, for example, declare that climate change is a hoax, being gay is a sin, and the Sandy Hook mass shooting never happened.







The increased scrutiny on the algorithms of Google – which removed antisemitic and sexist autocomplete phrases after the recent Observer investigation – comes at a time of tense debate surrounding the role of fake news in building support for conservative political leaders, particularly US President-elect Donald Trump.

Facebook has faced significant backlash for its role in enabling widespread dissemination of misinformation, and data scientists and communication experts have argued that rightwing groups have found creative ways to manipulate social media trends and search algorithms.








The Guardian’s latest findings further suggest that Google’s searches are contributing to the problem.

In the past, when a journalist or academic has exposed one of these algorithmic hiccups, humans at Google have quietly made manual adjustments in a process that is neither transparent nor accountable.

At the same time, politically motivated third parties including the ‘alt-right’, a far-right movement in the US, use a variety of techniques to trick the algorithm and push propaganda and misinformation higher up Google’s search rankings.

These insidious manipulations – both by Google and by third parties trying to game the system – impact how users of the search engine perceive the world, even influencing the way they vote. This has led some researchers to study Google’s role in the presidential election in the same way that they have scrutinized Facebook.






Robert Epstein from the American Institute for Behavioral Research and Technology has spent four years trying to reverse engineer Google’s search algorithms. He believes, based on systematic research, that Google has the power to rig elections through something he calls the search engine manipulation effect (SEME).

Epstein conducted five experiments in two countries and found that biased rankings in search results can shift the opinions of undecided voters. If Google tweaks its algorithm to show more positive search results for a candidate, the searcher may form a more positive opinion of that candidate.

In September 2016, Epstein released findings, published through Russian news agency Sputnik News, that indicated Google had suppressed negative autocomplete search results relating to Hillary Clinton.

“We know that if there’s a negative autocomplete suggestion in the list, it will draw somewhere between five and 15 times as many clicks as a neutral suggestion,” Epstein said. “If you omit negatives for one perspective, one hotel chain or one candidate, you have a heck of a lot of people who are going to see only positive things for whatever the perspective you are supporting.”
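To put Epstein’s figure in rough arithmetic terms, the sketch below applies the quoted 5–15x multiplier to a hypothetical four-item autocomplete list. It is illustrative only: the list size and the equal weighting of neutral suggestions are assumptions, not measured values.

```python
# Back-of-the-envelope illustration of Epstein's quoted 5-15x claim, not Google data.
# Assumes a four-item autocomplete list in which each neutral suggestion has
# weight 1 and the single negative suggestion has the stated multiplier.

def negative_click_share(multiplier: float, n_suggestions: int = 4) -> float:
    """Share of clicks captured by the one negative suggestion."""
    neutral_weight = n_suggestions - 1  # the other, neutral suggestions
    return multiplier / (multiplier + neutral_weight)

for m in (5, 10, 15):
    share = negative_click_share(m)
    print(f"{m}x multiplier -> negative suggestion draws {share:.0%} of clicks")

# Omitting that one suggestion sends all clicks to the remaining neutral or
# positive suggestions, which is the skew Epstein describes.
```

Under these assumptions a single negative suggestion captures roughly 60–85% of clicks, so removing it redirects essentially all attention to the flattering alternatives.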






Even changing the order in which certain search terms appear in the autocompleted list can make a huge impact, with the first result drawing the most clicks, he said.

At the time, Google said the autocomplete algorithm was designed to omit disparaging or offensive terms associated with individuals’ names but that it wasn’t an “exact science”.

Then there’s the secret recipe of factors that feed into the algorithm Google uses to determine a web page’s importance – embedded with the biases of the humans who programmed it. These factors include how many and which other websites link to a page, how much traffic it receives, and how often a page is updated. People who are very active politically are typically the most partisan, which means that extremist views peddled actively on blogs and fringe media sites get elevated in the search ranking.
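As an illustration of how such signals compound, here is a toy scorer combining the three factors just mentioned. It is a hypothetical sketch, not Google’s actual algorithm: the weights, field names, figures and log scaling are invented for this example. The point is only that, under almost any weighting of this kind, a heavily linked, heavily trafficked and frequently updated page, such as a very active partisan blog, floats towards the top.

```python
import math
from dataclasses import dataclass

# Hypothetical sketch, NOT Google's algorithm: a toy score built from the
# signals mentioned above (inbound links, traffic, update frequency).
# All weights, fields and numbers are invented for illustration.

@dataclass
class Page:
    url: str
    inbound_links: int      # how many other sites link here
    monthly_visits: int     # rough traffic estimate
    updates_per_month: int  # how often the page changes

def toy_rank_score(p: Page) -> float:
    # Log-scale the raw counts so no single signal dominates outright.
    return (0.5 * math.log1p(p.inbound_links)
            + 0.3 * math.log1p(p.monthly_visits)
            + 0.2 * math.log1p(p.updates_per_month))

pages = [
    Page("established-reference.example", 900, 50_000, 2),
    Page("active-partisan-blog.example", 1_200, 80_000, 120),
]
for p in sorted(pages, key=toy_rank_score, reverse=True):
    print(f"{toy_rank_score(p):.2f}  {p.url}")
```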

“These platforms are structured in such a way that they are allowing and enabling – consciously or unconsciously – more extreme views to dominate,” said Martin Moore from King’s College London’s Centre for the Study of Media, Communication and Power.

Appearing on the first page of Google search results can give websites with questionable editorial principles undue authority and traffic.

“These two manipulations can work together to have an enormous impact on people without their knowledge that they are being manipulated, and our research shows that very clearly,” Epstein said. “Virtually no one is aware of bias in search suggestions or rankings.”

This is compounded by Google’s personalization of search results, which means different users see different results based on their interests. “This gives companies like Google even more power to influence people’s opinions, attitudes, beliefs and behaviors,” he said.

Epstein wants Google to be more transparent about how and when it manually manipulates the algorithm.


“They are constantly making these adjustments. It’s absurd for them to say everything is automated,” he said. Manual removals from autocomplete include “are jews evil” and “are women evil”. Google has also altered its results so when someone searches for ways to kill themselves they are shown a suicide helpline.

Shortly after Epstein released his research indicating the suppression of negative autocomplete search results relating to Clinton, which he attributes to close ties between the Clinton campaign and Google, the search engine appeared to pull back from such censorship, he said. This, he argued, allowed a flood of pro-Trump, anti-Clinton content (including fake news), some of it created in retaliation, to bubble to the top.

“If I had to do it over again I would not have released those data. There is some indication that they had an impact that was detrimental to Hillary Clinton, which was never my intention.”

Rhea Drysdale, the CEO of digital marketing company Outspoken Media, did not see evidence of pro-Clinton editing by Google. However, she did note networks of partisan websites – disproportionately rightwing – using much better search engine optimization techniques to ensure their worldview ranked highly.

Meanwhile, tech-savvy rightwing groups organized online and developed creative ways to control and manipulate social media conversations through mass actions, said Shane Burley, a journalist and researcher who has studied the alt-right.





“What happens is they can essentially jam hashtags so densely using multiple accounts, they end up making it trending,” he said. “That’s a great way for them to dictate how something is going to be covered, what’s going to be discussed. That’s helped them reframe the discussion of immigration.”

Burley noted that “cuckservative” – meaning conservatives who have sold out – is a good example of a term that the alt-right has managed to popularize in an effective way. Similarly, if you search for “feminism is...” in Google, it autocompletes to “feminism is cancer”, a popular rallying cry for Trump supporters.

“It has this effect of making certain words kind of like magic words in search algorithms.”

The same groups – including members of the popular alt-right Reddit forum The_Donald – used the same techniques that reputation management firms and marketers use to push companies up Google’s search results, ensuring pro-Trump imagery and articles ranked highly.

“Extremists have been trying to play Google’s algorithm for years, with varying degrees of success,” said Brittan Heller, director of technology and society at the Anti-Defamation League. “The key has traditionally been connected to influencing the algorithm with a high volume of biased search terms.”

The problem has become particularly challenging for Google in a post-truth era, where white supremacist websites may have the same indicator of “trustworthiness” in the eyes of Google as other websites high in the page rank.

“What does Google do when the lies aren’t the outliers any more?” Heller said.

“Previously there was the assumption that everything on the internet had a glimmer of truth about it. With the phenomenon of fake news and media hacking, that may be changing.”

A Google spokeswoman said in a statement: “We’ve received a lot of questions about autocomplete, and we want to help people understand how it works: Autocomplete predictions are algorithmically generated based on users’ search activity and interests. Users search for such a wide range of material on the web – 15% of searches we see every day are new. Because of this, terms that appear in Autocomplete may be unexpected or unpleasant. We do our best to prevent offensive terms, like porn and hate speech, from appearing, but we don’t always get it right. Autocomplete isn’t an exact science and we’re always working to improve our algorithms.”

Islam is not a religion of peace

Taslima Nasreen

Wednesday, 14 December 2016

Is it the Populists who threaten Democracy - or is it the Technocrats?

Mark Blyth - Brown University

Is James Anderson an Alan Sked of English cricket?

Girish Menon



You might wonder what the relationship is between James Anderson the cricketer and Dr. Alan Sked, the original founder of the UK Independence Party (UKIP). Prima facie, not a lot; one is a cricketer with not much connection to academia and the other is a tenured historian at the London School of Economics. But look closer and you can find both of them living in the past.

I attended Dr. Sked’s history lectures many moons ago. He was a fine orator and I remember him fondly after so many years. His pet theme was the greatness of the British Empire and the downward spiral of the UK since World War II, especially with the increasing integration of erstwhile enemies into the European Union. At one of our social do’s we had the following conversation:

‘Alan, the UK needs a clock that rotates backwards’
‘Why?’ he asked
‘Because you seem to be forever living in the past’
‘Girish, do you know who you are talking to? I will be marking your papers in the summer’
‘Alan, I am not from colonial India, I am from a more confident India’…

I had been out of touch with Dr. Sked until his proposal to start a UKIP of the left – a proposal that did not see the light of day, at least not in the form Dr. Sked envisaged. Today’s early-morning reverie, however, linked Dr. Sked with James Anderson, a great English bowler. Anderson, whose career appears to be fast fading, criticised the Indian captain Virat Kohli on the day he scored 235 runs. Kohli’s tally of over 600 runs in four Test matches has left Anderson unimpressed. He suggested that Kohli is not so much an improved batsman as a batsman playing in conditions that do not exploit his "technical deficiencies".

"I'm not sure he's changed," Anderson said. "I just think any technical deficiencies he's got aren't in play out here. The wickets just take that out of the equation.
"We had success against him in England, but the pace of the pitches over here just take any flaws he has out of the equation. There's not that pace in the wicket to get the nicks, like we did against him in England with a bit more movement. Pitches like this suit him down to the ground.”
"When that's not there, he's very much suited to playing in these conditions. He's a very good player of spin and if you're not bang on the money and don't take your chances, he'll punish you. We tried to stay patient against him, but he just waits and waits and waits. He just played really well."

Anderson, like Dr. Sked, loves to invoke the past when he does not wish to deal with the current reality. Virat Kohli may indeed fail on his next trip to England in 2018 on England’s doctored pitches. But Anderson could be a little less churlish, live in the present and share some of the Yuletide spirit.

Can technology replace teachers? You asked Google – here’s the answer

Harpreet Purewal in The Guardian


Anxiety about losing your job to technology is both rational and growing. Andy Haldane, the chief economist at the Bank of England, recently estimated that 15m jobs in the UK were threatened by automation. Technology is reaching such levels of sophistication that it is capable not only of manual tasks but cognitive ones too, putting a wide range of jobs at risk. The areas most vulnerable include driving and administrative work. But according to a report from Oxford University that looked at over 700 areas of work, teaching at all levels across the educational spectrum is a safe bet.

Yet the apparent safety of teaching as a profession doesn’t quite square with the boom in online courses. From the comfort of my sofa I can watch lectures from prestigious universities around the world, join the hundreds of millions of people who have enrolled on a Khan Academy course, enrol in a Mooc – a massive online open course – or upskill and change my career with a course from Lynda and many other education providers.

A lot of these courses are free, but those with accreditation attached tend to charge. The appeal for educational institutions is simple: you can pay a teacher once to deliver a lecture to an unlimited number of students without having to pay for all the overheads it takes to run a building. Students are offered flexibility and can learn at a time and location that suits them. However, drop-out rates for these courses are extremely high, and they present no real threat to education as we know it. It seems students still prefer a real classroom.


‘The act of teaching isn’t just imparting what’s in your head to a captive audience.’ Photograph: Alamy Stock Photo

So why not replace teachers in the classroom with technology? To understand why teachers’ careers are safe, we need to ask two questions: what do teachers do all day, and where does technology fall short?
A quick survey of teacher friends answers the first question: teachers provide pastoral care, direct the Christmas play, recognise and assist vulnerable pupils, cover break-time duty, mentor new teachers, collate data about pupils’ attendance and behaviour, mark homework, rig lights and dress sets for school performances, order resources such as textbooks and classroom equipment, write newsletters, take school trips, assess pupil attainment, meet parents, spot potential terrorists (ahem) in accordance with the government’s Prevent guidelines, lead assemblies, make endless photocopies, and appraise other members of staff. This list is incomplete and already sounds like a lot for a piece of technology to cover. But if you’re looking for an easy and long-term job, this isn’t it: almost a third of teachers quit within five years.

It’s likely that some of the administrative tasks that teachers do will be conducted by technology in the future, just as in other sectors, but what about the actual teaching? The act of teaching isn’t just imparting what’s in your head to a captive audience. Teaching is a performance; it’s reading the room and working it. This is where technology really falls short. Empathy is a key area of difficulty for technology and automation. Are the kids at the back of the classroom bored because you’re talking about something they find too difficult, because they know it already, or because you’re not presenting the information in a meaningful way? Human beings are able to pick up on a multitude of contextual clues to determine and respond to the emotional states of others. Technology can’t detect emotional states, let alone adapt its behaviour accordingly.



‘The best teachers will use technology in the classroom as part of an expanding toolkit.’ Photograph: Paul Miller/AAP

Another area of difficulty for technology that is key to teaching is quick thinking. Any number of things can and do go wrong on a school day: a guest speaker cancels, the whiteboard freezes, buses are delayed or – the ultimate horror – the photocopier breaks. Human beings are able to think on their feet and reformulate their plans to adapt to new circumstances. Machines aren’t able to do this. Thinking on the spot is a key skill of teachers, and many cite the variety of the job as a reason for entering the profession in the first place.

We know what technology can’t do for students and teachers, but there are some reasons to be optimistic about the role of technology in education. Teachers in the UK often complain about the administration workload interfering with the actual work of teaching. Technology could aid data-gathering significantly, freeing up teachers’ time and allowing them to focus on more important aspects of their work. And internationally, technology has the potential to reach those who don’t have access to a classroom. In 2015 the British Council used Skype to deliver teacher training in Libya; and as far back as 1999 Sugata Mitra created “Hole in the Wall” schools by placing computers in slums in Delhi.

The best teachers will use technology in the classroom as part of an expanding toolkit, and hopefully they’ll see the benefits of smarter technology in the form of reduced clerical work. Classrooms will continue to change shape, but it’s safe to assume that there will be a human teacher at the front of them for a long time yet.