Computers and Risk

When Kevin Rodgers states that traders get better with age, he is wholly convincing. His book "Why Aren't They Shouting?" (Random House Business Books 2016) takes you through his career as a banker specialising in foreign exchange and admits you, a privileged observer, to the store of experience he has accumulated over thirty years. There is everything here. In his time at Bankers Trust and then Deutsche Bank he has seen markets boom and break, scandals, triumphs and disasters. All are discussed with extraordinary clarity, against the background of relentlessly developing technology.

Rodgers is a shrewd observer and draws interesting conclusions from what he has seen. Although he comments vividly on the LIBOR rigging and FOREX violations, which have made so many headlines in the popular press, these stories are peripheral to his book. The real focus is on how the use of increasingly sophisticated computer technology in the hands of clever and generally honest people led inevitably to the 2008 crash.

At the centre of the crisis was computerised risk analysis which destabilised the market in two ways. First, it led to the introduction of far more complex products, which could for the first time be analysed and assessed in the context of a bank's risk profile. Second, the new algorithms for calculating a bank's overall risk began to dominate corporate thinking so that when those algorithms turned out to be flawed, not in their mathematics but because they could not spot wholly new developments, banks found themselves with unexpected levels of exposure.

Once regulators and bank boards began to believe that computerised risk analysis was fail-safe, the writing was on the wall. Because that analysis was the yardstick against which exposures were measured, its inability to take unforeseen events into account produced inevitable disaster.

This is a fascinating book and I would certainly recommend it, but the lessons are not just about banking: they apply wherever increasing reliance is placed on computers to keep us safe from the dangers posed by technical development.

When Newton sat under his apple tree at Trinity College Cambridge he must have thought that his theory of classical mechanics was the end of the story. It was consistent. It was elegant. It fitted all the evidence available at the time. And yet he was wrong. As we now know, Newtonian mechanics is an approximation and does not provide an adequate explanation of the laws which apply when things move at speeds approaching that of light. For that you need Einstein's theory of relativity, and in due course that theory will prove to be a simplification of yet more complex rules which will be needed to explain evidence of which we are currently unaware.

That is the way science works. New evidence emerges which requires previous conclusions to be adjusted - in the case of relativity it came from experiments with light. Once the adjustment is made we all sit down with the new theory and imagine it to be complete - until more evidence comes along, of course.

Kevin Rodgers' book makes this point in relation to finance. You can stress test systems against events in the past. You can go further, and test them against your worst imaginings for the future. You cannot test them against the unknown unknowns. That is where the danger lies, and if you rely on automated testing alone, they will get you in the end.
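
To see why in miniature, here is a toy sketch in Python (my own made-up numbers and function names, nothing taken from the book or from any real risk system) of a risk engine that signs off a position against every past and imagined scenario, and is then undone by one it never tested:

# Toy illustration only: a "risk engine" that approves a position if no
# known or imagined scenario breaches the loss limit.
HISTORICAL_SHOCKS = [-0.02, -0.05, -0.08, -0.11]  # worst past market moves
IMAGINED_SHOCKS = [-0.15, -0.20]                  # worst imaginings for the future
LOSS_LIMIT = 0.25                                 # board-approved loss tolerance

def portfolio_loss(shock, leverage):
    """Loss as a fraction of capital for a given market shock."""
    return -shock * leverage

def risk_engine_approves(leverage):
    """Pass if no tested scenario breaches the limit."""
    scenarios = HISTORICAL_SHOCKS + IMAGINED_SHOCKS
    return all(portfolio_loss(s, leverage) <= LOSS_LIMIT for s in scenarios)

leverage = 1.2
print(risk_engine_approves(leverage))      # True: every tested scenario passes

# The unknown unknown: a move outside the data and outside anyone's imagination.
print(portfolio_loss(-0.40, leverage))     # 0.48, far beyond the approved limit

The point is not the numbers but the shape of the failure: the approval is only as good as the list of scenarios fed into it.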

If you have strong nerves you can think about what this means for a whole lot of areas outside banking: the security systems for nuclear power plants, for one; systems for controlling weapons or robots, for another. These are areas where the drive towards more and more sophisticated computerisation is inevitable, not because of the laziness of those involved but because the more sophisticated the computer systems, the more sophisticated the design of the things they control can become. The Victorians relied on over-engineering to make their structures safe, but that hardly fits in with modern notions of efficiency and economy. Imagine the conversation:

Financier: "If we can reduce the casing required around the nuclear core by 10%, we save £100 million."

Engineer: "I'll just test that against the safety algorithm. Yes, that would still leave the risk below the 0.00001% required by government regulations."

Financier: "Brilliant. Shaving that hundred million off should win us the contract. It is not as if there was a real risk either."

Nemesis (off stage): "Something tells me they haven't heard how the casing would react to the presence of kryptonite."

Oops!

Yes, it is the inevitability that strikes you. So what should we do about it? Just using more and more computer power doesn't sound like the right answer. In relation to the financial services industry, Mr Rodgers has an interesting idea. "What about a special class of regulators?" he asks. A layer of risk cops: experienced bankers who prowl the building looking into anything that smells as if it is going amiss. Expensive, of course, but then we are told that a lot of the older generation want to work longer, so here is a really valuable way in which they could be used. Senior regulators could stalk the banking halls, attending meetings, chatting at coffee machines and so on. Their role would be to pick up the vibes which the computers would miss.

Extend the idea and what do you have? Senior engineers wandering around power plants? Experienced chefs walking the floors of food manufacturers? Science fiction authors keeping an eye on the robotic factories? It all sounds a bit odd, doesn't it? But perhaps it is what we need to do if we are to escape the grim consequences of leaving it all to the computers.

Reproduced from The Shaw Sheet
