There is common ground in analysing financial systems and ecosystems, especially in the need to identify conditions that dispose a system to be knocked from seeming stability into another, less happy state.
‘Tipping points’, ‘thresholds and breakpoints’, ‘regime shifts’ — all are terms that describe the flip of a complex dynamical system from one state to another. For banking and other financial institutions, the Wall Street Crash of 1929 and the Great Depression epitomize such an event. These days, the increasingly complicated and globally interlinked financial markets are by no means immune to such system-wide (systemic) threats. Who knows, for instance, how the present concern over sub-prime loans will pan out?
Well before this recent crisis emerged, the US National Academies/National Research Council and the Federal Reserve Bank of New York collaborated on an initiative to “stimulate fresh thinking on systemic risk”. The main event was a high-level conference held in May 2006, which brought together experts from various backgrounds to explore parallels between systemic risk in the financial sector and in selected domains in engineering, ecology and other fields of science. The resulting report was published late last year and makes stimulating reading.
Catastrophic changes in the overall state of a system can ultimately derive from how it is organized — from feedback mechanisms within it, and from linkages that are latent and often unrecognized. The change may be initiated by some obvious external event, such as a war, but is more usually triggered by a seemingly minor happenstance or even an insubstantial rumour. Once set in motion, however, such changes can become explosive and afterwards will typically exhibit some form of hysteresis, such that recovery is much slower than the collapse. In extreme cases, the changes may be irreversible.
As the report emphasizes, the potential for such large-scale catastrophic failures is widely applicable: for global climate change, as the greenhouse blanket thickens; for ‘ecosystem services’, as species are removed; for fisheries, as stocks are overexploited; and for electrical grids or the Internet, as increasing demands are placed on both. With its eye ultimately on the banking system, the report concentrates on the possibility of finding common principles and lessons learned within this medley of interests. For instance, to what extent can mechanisms that enhance stability against inevitable minor fluctuations (in inflation, interest rates or share prices, for example) in other contexts perversely predispose towards full-scale collapse?
Two particularly illuminating questions about priorities in risk management emerge from the report. First, how much money is spent on studying systemic risk as compared with that spent on conventional risk management in individual firms? Second, how expensive is a systemic-risk event to a national or global economy (examples being the stock-market crash of 1987, and the turmoil of 1998 associated with the Russian loan default and the subsequent collapse of the hedge fund Long-Term Capital Management)? The answer to the first question is “comparatively very little”; to the second, “hugely expensive”.
An analogous situation exists within fisheries management. For the past half-century, investments in fisheries science have focused on management on a species-by-species basis (analogous to single-firm risk analysis). Especially with collapses of some major fisheries, however, this approach is giving way to the view that such models may be fundamentally incomplete, and that the wider ecosystem and environmental context (by analogy, the full banking and market system) is required for informed decision-making. It is an example of a trend in many areas of applied science towards acknowledging the need for a larger-system perspective.
But to what extent can study of ecosystems inform the design of financial networks in, for instance, their robustness against perturbation? Ecosystems are robust by virtue of their continued existence. They have survived eons of change — continental drift, climate fluctuations, movement and evolution of constituent species — and show some remarkable constancies in structure that have apparently persisted for hundreds of millions of years: witness, for example, the constancy in predator–prey ratios in different situations. Identifying structural attributes shared by these diverse systems that have survived rare systemic events, or have indeed been shaped by them, could provide clues about which characteristics of complex systems correlate with a high degree of robustness.
An example of this kind emerges from work on the network structure of communities of pollinators and the plants they pollinate. These networks are disassortative, in the sense that highly connected ‘large’ nodes tend to have their connections disproportionately with ‘small’ nodes; conversely, small nodes connect with disproportionately few large ones. Studies of these networks show that such disassortative structure tends to confer a significant degree of stability against disturbance. More generally, ecologists and others have long suggested that modularity — the degree to which the nodes of a system can be decoupled into relatively discrete components — can promote robustness. Thus, a basic principle in the management of forest fires and epidemics is that if there is strong interconnection among all elements, a perturbation will encounter nothing to stop it from spreading. But once the system is appropriately compartmentalized — by firebreaks, or vaccination of ‘superspreaders’ — disturbance or risk is more easily countered.
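The firebreak intuition can be made concrete with a toy simulation (an illustrative sketch, not drawn from the report itself: the network sizes, module structure and cascade rule are all hypothetical). A shock started at one node of a fully interconnected system reaches everything; the same shock started in a compartmentalized system stays inside its module.

```python
def spread(adjacency, start):
    """Breadth-first cascade: a shocked node shocks all of its neighbours.

    Returns the number of nodes ultimately affected.
    """
    affected = {start}
    frontier = [start]
    while frontier:
        node = frontier.pop()
        for neighbour in adjacency[node]:
            if neighbour not in affected:
                affected.add(neighbour)
                frontier.append(neighbour)
    return len(affected)

n = 12

# Fully interconnected system: every node linked to every other node.
connected = {i: [j for j in range(n) if j != i] for i in range(n)}

# Compartmentalized system: three modules of four nodes each,
# with no links crossing module boundaries (the 'firebreaks').
modular = {i: [j for j in range(n) if j != i and j // 4 == i // 4]
           for i in range(n)}

print(spread(connected, 0))  # 12 — the shock engulfs the whole system
print(spread(modular, 0))    # 4  — the shock is contained in one module
```

The deterministic cascade rule is of course a caricature of financial contagion, but it isolates the structural point: what limits the damage here is not the strength of any node but the absence of pathways between compartments.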
As the report notes, this is a complicated question, because modularity will often involve a trade-off between local and systemic risk. Moreover, the wrong compartmentalization in financial markets could preclude stabilizing feedbacks, such as mechanisms for maintaining liquidity of cash flows through the financial system, where fragmentation leading to illiquidity could actually increase systemic risk (as in the bank runs leading to the Great Depression). Redundancy of components and pathways, in which one can substitute for another, is also a key element in the robustness of complex systems, and effective redundancy is not independent of modularity.