
Friday 4 January 2013

How algorithms secretly shape the way we behave


Algorithms, the key ingredients of all significant computer programs, have probably influenced your Christmas shopping and may one day determine how you vote
Program or be programmed? Schoolchildren learn to code. Photograph: Alamy
 
Keynes's observation (in his General Theory) that "practical men who believe themselves to be quite exempt from any intellectual influences, are usually the slaves of some defunct economist" needs updating. Replace "economist" with "algorithm". And delete "defunct", because the algorithms that now shape much of our behaviour are anything but defunct. They have probably already influenced your Christmas shopping, for example. They have certainly determined how your pension fund is doing, and whether your application for a mortgage has been successful. And one day they may effectively determine how you vote.

On the face of it, algorithms – "step-by-step procedures for calculations" – seem unlikely candidates for the role of tyrant. Their power comes from the fact that they are the key ingredients of all significant computer programs and the logic embedded in them determines what those programs do. In that sense algorithms are the secret sauce of a computerised world.
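
To make the term concrete, here is a minimal sketch (my illustration, not anything from the column) of one of the oldest recorded algorithms, Euclid's procedure for finding the greatest common divisor of two numbers: a fixed sequence of steps that a computer follows mechanically.

```python
def gcd(a: int, b: int) -> int:
    """Euclid's algorithm: a step-by-step procedure for calculation.

    Repeatedly replace the pair (a, b) with (b, a mod b) until the
    remainder is zero; the last non-zero value is the greatest
    common divisor.
    """
    while b != 0:
        a, b = b, a % b
    return a

print(gcd(1071, 462))  # -> 21
```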

And they are secret. Every so often, the veil is lifted when there's a scandal. Last August, for example, a "rogue algorithm" in the computers of a New York stockbroking firm, Knight Capital, embarked on 45 minutes of automated trading that eventually lost its owners $440m before it was stopped.

But, mostly, algorithms do their work quietly in the background. I've just logged on to Amazon to check out a new book on the subject – Automate This: How Algorithms Came to Rule Our World by Christopher Steiner. At the foot of the page Amazon tells me that two other books are "frequently bought together" with Steiner's volume: Nate Silver's The Signal and the Noise and Nassim Nicholas Taleb's Antifragile. This conjunction of interests is the product of an algorithm: no human effort was involved in deciding that someone who is interested in Steiner's book might also be interested in the writings of Silver and Taleb.
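
Amazon has never published the details of its recommender, but the basic idea behind a "frequently bought together" feature can be sketched in a few lines: count how often pairs of items appear in the same order, then suggest the highest-scoring partners. The Python toy below is purely illustrative – the orders, titles and function name are my own, not Amazon's.

```python
from collections import Counter
from itertools import combinations

# Hypothetical purchase history: each order is a set of book titles.
orders = [
    {"Automate This", "The Signal and the Noise"},
    {"Automate This", "Antifragile"},
    {"Automate This", "The Signal and the Noise", "Antifragile"},
    {"The Signal and the Noise", "Antifragile"},
]

# Count how often each pair of items appears in the same order.
pair_counts = Counter()
for order in orders:
    for a, b in combinations(sorted(order), 2):
        pair_counts[(a, b)] += 1

def frequently_bought_with(item: str, top_n: int = 2):
    """Rank other items by how often they co-occur with `item`."""
    scores = Counter()
    for (a, b), count in pair_counts.items():
        if a == item:
            scores[b] += count
        elif b == item:
            scores[a] += count
    return scores.most_common(top_n)

print(frequently_bought_with("Automate This"))
# e.g. [('The Signal and the Noise', 2), ('Antifragile', 2)]
```

No human decides that Steiner's readers should see Silver and Taleb; the pairing simply falls out of the co-purchase counts.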

But book recommendations are relatively small beer – though I suspect they will have influenced a lot of online shopping at this time of year, as people desperately seek ideas for presents. The most powerful algorithm in the world is PageRank – the one that Google uses to determine the rankings of results from web searches – for the simple reason that, if your site doesn't appear in the first page of results, then effectively it doesn't exist. Not surprisingly, there is a perpetual arms race (euphemistically called search engine optimisation) between Google and people attempting to game PageRank. Periodically, Google tweaks the algorithm and unleashes a wave of nasty surprises across the web as people find that their hitherto modestly successful online niche businesses have suddenly – and unaccountably – disappeared.
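
Google's production algorithm is a closely guarded secret, constantly tweaked, but the original PageRank idea was published by Brin and Page: a page is important if important pages link to it, and the scores can be computed by repeatedly redistributing rank along links. A minimal sketch, assuming a hypothetical four-page web and the standard 0.85 damping factor:

```python
# Hypothetical tiny web: page -> pages it links to.
links = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
    "d": ["c"],
}

def pagerank(links, damping=0.85, iterations=50):
    """Basic power-iteration PageRank over a link graph.

    Each round, every page keeps a small baseline of rank and
    passes the rest, in equal shares, to the pages it links to.
    """
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / n for p in pages}
        for page, outlinks in links.items():
            share = damping * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

for page, score in sorted(pagerank(links).items(), key=lambda kv: -kv[1]):
    print(page, round(score, 3))
```

Even in this toy graph, page "c" outranks the rest because well-linked pages point to it – which is exactly the property the optimisers try to game.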

PageRank thus gives Google awesome power. And, ever since Lord Acton's time, we have known what power does to people – and institutions. So the power of PageRank poses serious regulatory issues for governments. On the one hand, the algorithm is a closely guarded commercial secret – for obvious reasons: if it weren't, the search engine optimisers would have a field day and all search results would be suspect. On the other hand, because it's secret, we can't be sure that Google isn't skewing results to favour its own commercial interests, as some people allege.

Besides, there's more to power than commercial clout. Many years ago, the sociologist Steven Lukes pointed out that power comes in three varieties: the ability to stop people doing what they want to do; the ability to compel them to do things they don't want to do; and the ability to shape the way they think. This last is the power that the mass media have, which is why the Leveson inquiry was so important.

But, in a way, algorithms also have that power. Take, for example, the one that drives Google News. This was recently subjected to an illuminating analysis by Nick Diakopoulos from the Nieman Journalism Lab. Google claims that its selection of noteworthy news stories is "generated entirely by computer algorithms without human editors. No humans were harmed or even used in the creation of this page."

The implication is that the selection process is somehow more "objective" than a human-mediated one. Diakopoulos takes this cosy assumption apart by examining the way the algorithm works. There's nothing sinister about it, but his analysis underlines the importance of understanding the software that decides what we read. The choice facing citizens in a networked world is therefore stark: program or be programmed.
