We spent some time in class on Monday discussing risk in its various forms – business risk, reputational risk, personal risk, natural disaster, and so on. One thing we didn’t discuss was the possibility of algorithms taking over the world, but according to some commentators – Kevin Slavin, for instance – it is a pressing concern.
Algorithms are the mathematics of computing, and they silently shape many aspects of our lives, from finance to architecture. Slavin’s point is that through their prevalence we are forced to accommodate ourselves to their often rather daft, if computationally staggering, ways of thinking and acting. We start to attune our behaviours to their limitations (think of the flattened machine-speak voice we adopt when talking to voice-recognition software), and to shape our physical environments around their exigencies.
70% of trades on the New York Stock Exchange, for example, are now driven by algorithms, not by humans. This is a world in which microseconds count for millions of dollars. To gain advantages of infinitesimal fractions of seconds, many financial companies have started to buy up and hollow out skyscrapers near the main Manhattan Internet super-hub (the Carrier Hotel Building) and fill them with their own servers, making vast capital investments and engaging in absurd engineering projects in order not to be out of the loop.
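The logic of those real-estate deals comes down to simple physics. As a rough sketch of my own (not a figure from Slavin's talk): a signal in optical fibre travels at about two-thirds the speed of light, roughly 200,000 km/s, so every kilometre of cable between a firm's servers and the exchange adds about five microseconds of one-way delay.

```python
# Back-of-the-envelope latency arithmetic (illustrative sketch, not data
# from the talk): signals in optical fibre travel at roughly 2/3 of the
# speed of light, about 200,000 km per second.

SPEED_IN_FIBRE_KM_PER_S = 200_000  # approximate signal speed in fibre

def one_way_latency_us(distance_km: float) -> float:
    """Propagation delay in microseconds over a fibre run of distance_km."""
    return distance_km / SPEED_IN_FIBRE_KM_PER_S * 1_000_000

# A firm 10 km from the exchange concedes about 50 microseconds each way
# to a rival colocated 100 m away (about 0.5 microseconds) -- an eternity
# on algorithmic timescales.
print(one_way_latency_us(10))   # distant firm: 50.0
print(one_way_latency_us(0.1))  # colocated firm: 0.5
```

At those scales, moving a server building a few blocks closer to the hub really can be worth the cost of hollowing out a skyscraper.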
A company in Boston called Nanex has started to track the silent operations of these algorithms in the market data; when they find one, they pin a visual realisation of it to the wall and give it a name (in the same way that viruses or hurricanes are given names) – thus we have the Twilight, the Carnival, the Knife, the Boston Shuffler, and so on.
Of course there seems to be a deeply embedded fear in the human psyche that we are a hunted species; we are always looking for the beast stalking us beyond the ring of firelight. But it is unquestionably interesting that algorithms are now more than merely tools – they are independent of hands-on control, if not of their own internal logic. One of the implications is that we have to adapt ourselves and our societies to them, try to understand what they want and which of our actions will please them, without really understanding what they are or what they are after. They are, in all respects, like little gods.
Anyway, here is a short talk on the subject given by Kevin Slavin at TED.