I highly recommend this article by Tim Harford (The Undercover Economist) on the similarities between finance and nuclear reactors: both are complex, tightly coupled systems.
The connection between banks and nuclear reactors is not obvious to most bankers, nor banking regulators. But to the men and women who study industrial accidents such as Three Mile Island, Deepwater Horizon, Bhopal or the Challenger shuttle – engineers, psychologists and even sociologists – the connection is obvious. James Reason, a psychologist who studies human error in aviation, medicine, shipping and industry, uses the downfall of Barings Bank as a favourite case study. “I used to speak to bankers about risk and accidents and they thought I was talking about people banging their shins,” he told me. “Then they discovered what a risk is. It came with the name of Nick Leeson.”
…Nuclear power stations are both complex and tightly coupled systems. They contain a bewildering array of mechanisms designed to start a nuclear reaction, slow it down, use it to generate heat, use the heat to generate electricity, intervene if there is a problem, or warn operators of odd conditions inside the plant. At Three Mile Island, the most famous nuclear accident in US history, four or five safety systems malfunctioned within the first 13 seconds. Dozens of separate alarms were sounding in the control room. The fundamental story of the accident was that the operators were trying to stabilise a nuclear reactor whose behaviour was, at the time, utterly mysterious. That is the nature of accidents in a complex and tightly coupled system.
Charles Perrow, the sociologist who coined the term "normal accidents", believes that finance is a perfect example of a complex, tightly coupled system. In fact, he says, its complexity "exceeds the complexity of any nuclear plant I ever studied".
So what can be done?
… It might seem obvious that the way to make a complex system safer is to install some safety measures. Engineers have long known that life is not so simple. In 1638, Galileo described an early example of unintended consequences in engineering. Masons would store stone columns horizontally, lifted off the soil by two piles of stone. The columns often cracked in the middle under their own weight. The “solution” – a third pile of stone in the centre – didn’t help. The two end supports would often settle a little, and the column, balanced like a see-saw on the central pile, would then snap as the ends sagged.
Galileo had found a simple example of a profound point: a new safety measure or reinforcement often introduces unexpected ways for things to go wrong. This was true at Three Mile Island. It was also true during the horrific accident on the Piper Alpha oil and gas platform in 1988, which was aggravated by a safety device designed to prevent vast seawater pumps from starting automatically and killing the rig’s divers. The death toll was 167.
Some interesting books on the subject: Normal Accidents: Living with High-Risk Technologies by Charles Perrow, and The Logic of Failure: Recognizing and Avoiding Error in Complex Situations by Dietrich Dörner.