Climate, Complexity & Randomness

After publishing my last article on the usefulness of climate models, I received extensive criticism pointing out that several definitions and formulations were somewhat blurry. Reason enough, then, to devote a little more attention to the topic of climate and complex systems.

Complex systems appear in many fields: in engineering, where they are used to model processes such as traffic flow; in information and communication sciences, which deal with the flow and interaction of information within networks; and in political and economic sciences with their respective objects of study. Climate science, of course, belongs to this group as well.
Complex systems are characterised by the fact that they consist of countless components that interact with each other. These interactions exhibit non-linearity (effects are not proportional to their causes, so B does not follow neatly from A), adaptivity (the ability to respond to change), emergence (new, higher-level properties not found in the individual components), feedback loops, and a few other features that are less important for a basic understanding[1].

Now these are a lot of buzzwords that can cause some confusion at first reading. An example will help illustrate them.
Complex systems are often characterised by a long-term, inherent robustness against minor disturbances. Even a large number of smaller problems will not destroy the entire system (there is, of course, a limit, and at some point every system collapses, but more on this later). Take the Internet. Even if individual providers have network issues, which is very unpleasant for the affected users, the Internet as a global network is not threatened and will not collapse. The infrastructure itself is secured by large nodes, so-called Internet Exchange Points, the most important of which is currently located in Frankfurt. Should the latter run into technical difficulties, the entire system is still unlikely to fail, but far more serious consequences can be expected. If several of these nodes are affected at the same time, however, we face a serious problem. The probability of such an event may be low, but it should not be underestimated[2].

It is therefore important to be aware of those dynamics within complex systems that are often referred to as “fat tails”. This is essentially the technical term behind Black Swan events: rare but highly influential occurrences with far-reaching consequences. What most students encounter in their basic statistics education is the normal distribution and the occurrence probabilities it implies for rare events. Such events are also called 5-sigma events, i.e. occurrences whose distance from the mean is five or more standard deviations. In a normally distributed world, these events would be extremely unlikely. Statisticians such as Mandelbrot[3] and Taleb[4], however, argue that in a complex world fat-tailed distributions are particularly relevant and that normal distributions often underestimate the risk of rare events.
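To get a feeling for the difference, here is a small sketch comparing the probability of a 5-sigma deviation under a normal distribution with that under a fat-tailed Pareto distribution of equal mean and variance. The tail index α = 3 is an arbitrary illustrative choice, not a claim about any real data.

```python
import math

# Normal distribution: P(X > mu + 5*sigma) via the complementary error function
normal_tail = 0.5 * math.erfc(5 / math.sqrt(2))

# Pareto distribution with scale x_m = 1 and tail index alpha = 3
# (alpha > 2, so mean and variance exist and a 5-sigma threshold is defined)
alpha, x_m = 3.0, 1.0
mean = alpha * x_m / (alpha - 1)                       # 1.5
var = alpha * x_m**2 / ((alpha - 1)**2 * (alpha - 2))  # 0.75
threshold = mean + 5 * math.sqrt(var)

# Pareto survival function: P(X > x) = (x_m / x)**alpha
pareto_tail = (x_m / threshold) ** alpha

print(f"normal 5-sigma tail: {normal_tail:.2e}")  # ~2.9e-07
print(f"pareto 5-sigma tail: {pareto_tail:.2e}")  # ~5.0e-03
print(f"ratio: {pareto_tail / normal_tail:.0f}")  # over four orders of magnitude
```

Under the normal assumption, a 5-sigma event is a once-in-an-era curiosity; under the fat-tailed alternative with the same mean and variance, it is tens of thousands of times more likely.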

However, it is not only extreme events that matter, but also the cumulative influence of smaller variables. The inherent non-linearity of complex systems can lead to processes that appear counter-intuitive at first glance. Scott E. Page illustrates this with a very nice example, the algae development of a pond:
Algae blooms are often a consequence of increased phosphate concentrations in a body of water. However, it takes a while until a clear pool becomes an algae-choked pond. The remarkable thing is that this is not a gradual change, but a very rapid tipping of the entire system. For a while, the pond can handle the increased phosphate concentration quite well, but at some point this is no longer the case and the system undergoes a fundamental change in which algae thrive as never before. The gradual increase in phosphate concentration ultimately produces a fat-tail event that throws the original system completely off course[5].
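Page's pond can be sketched numerically with a minimal eutrophication model of the kind used by lake ecologists; all parameter values here are illustrative, not fitted to any real pond. Algae biomass x receives nutrient loading a, is lost at rate b·x, and reinforces itself through the saturating recycling term x²/(1+x²). Slowly ramping up the loading produces no gradual greening, but a sudden jump:

```python
def equilibrium(a, b=0.5, x0=0.0, dt=0.1, steps=20000):
    """Integrate dx/dt = a - b*x + x**2/(1 + x**2) to its steady state."""
    x = x0
    for _ in range(steps):
        x += dt * (a - b * x + x * x / (1 + x * x))
    return x

# Slowly ramp up the nutrient loading, starting each run from the previous state
loadings = [i * 0.002 for i in range(101)]   # a from 0.00 to 0.20
states, x = [], 0.0
for a in loadings:
    x = equilibrium(a, x0=x)
    states.append(x)

# The pond absorbs the loading for a long time, then tips abruptly:
# one tiny increment in a produces a jump in x far larger than all previous ones
biggest_jump = max(y2 - y1 for y1, y2 in zip(states, states[1:]))
print(f"largest single-step change in algae level: {biggest_jump:.2f}")
```

The loading increases in identical small steps of 0.002 throughout, yet at a critical value the algae level leaps by more than an order of magnitude in a single step, exactly the non-gradual tipping Page describes.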
Now, this example is a very simple one and should only serve as an analogy, because here we can explain quite well why the change occurs. In far more complex domains this becomes much harder, if not impossible. Which brings us back to the original topic.

The earth’s climate obviously belongs to the group of complex systems. This comes with the problem that the behaviour of such systems cannot be predicted exactly, since no model can include all components in its calculations. At this point the climate skeptic feels completely vindicated, because he always knew that these climate scientists and their prognoses could not be trusted. Without noticing it, however, he falls victim to one of the oldest problems of truth-finding: the problem of induction, which I have already discussed here.
The failure of past forecasts is not a reliable indicator that it must always remain that way. In a complex environment, it is impossible to establish obvious causal relationships, but as more potential stressors are added, the risk of causing devastation can increase.

Climate models face the problem that they are inherently probabilistic: they operate on past data and are de facto unable to provide fully accurate predictions[6]. However, this is not so much an argument against using the models as against the communication strategies built on them. Media and activists often suggest a predictive certainty that simply does not exist, and that does not have to exist in order to justify demands for more sensible environmental protection.
In the past there have repeatedly been warnings of catastrophes that then failed to occur. Take, for example, the warning of a population explosion in the 1960s, prominently represented by Paul Ehrlich[7]. He predicted the extinction of all relevant marine animals around 1980. Obviously this did not happen, because countless factors developed in directions that he and other followers of this idea did not see coming. How could they? Such predictions are doomed from the beginning.

But to conclude from past climate episodes that, even if the situation worsens, humanity will come up with some new invention to prevent the worst is dangerously naive. In a complex world, reliable predictions cannot be made from past data alone; rare events simply cannot be calculated. High risk aversion, and thus protection of the environment, is the truly rational choice. People tend to forget that in complex systems one plus one does not always equal two, but often much more (keyword: non-linearity). Stressors can combine super-additively and cause enormous damage.
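To make the super-additivity concrete, here is a minimal sketch with a purely illustrative quadratic damage function; the quadratic form is my assumption for the example, not an empirical claim about real stressors.

```python
def damage(stress):
    """Illustrative convex damage function: harm grows with the square of stress."""
    return stress ** 2

separate = damage(1) + damage(1)   # two stressors acting in isolation: 1 + 1 = 2
combined = damage(1 + 1)           # the same two stressors acting together: 4

print(separate, combined)  # combined harm exceeds the sum of the parts
```

Any convex damage function behaves this way; the only point is that when damage is non-linear, stressors acting together do more harm than the sum of their isolated effects.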

Climate research itself is well aware of the difficulty of making prognoses within complex systems and articulates it accordingly. Above all, there is the groundwork by Edward Lorenz and the Lorenz system named after him, known in popular culture as the butterfly effect (ironically, this is often misread as advice to pay more attention to small details because their influence could be so great, although countless such details exist and one cannot know which of them are relevant anyway)[8].
The core insight of the Lorenz system is that it is impossible to know all the initial conditions of a physical system exactly, from which it follows that predictions of its long-term behaviour must inevitably fail, even if the system is fully deterministic and quantum effects are ignored.
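This sensitivity is easy to demonstrate numerically. The sketch below integrates Lorenz's original 1963 system (σ = 10, ρ = 28, β = 8/3) twice with a classical fourth-order Runge-Kutta scheme, starting from two states that differ by only one part in 10⁸ in a single coordinate; within a few dozen time units the trajectories no longer have anything to do with each other.

```python
def lorenz(state, sigma=10.0, rho=28.0, beta=8.0 / 3.0):
    """Right-hand side of the Lorenz 1963 equations."""
    x, y, z = state
    return (sigma * (y - x), x * (rho - z) - y, x * y - beta * z)

def rk4_step(state, dt):
    """Advance one step with the classical fourth-order Runge-Kutta scheme."""
    def shift(s, k, h):
        return tuple(si + h * ki for si, ki in zip(s, k))
    k1 = lorenz(state)
    k2 = lorenz(shift(state, k1, dt / 2))
    k3 = lorenz(shift(state, k2, dt / 2))
    k4 = lorenz(shift(state, k3, dt))
    return tuple(s + dt / 6 * (a + 2 * b + 2 * c + d)
                 for s, a, b, c, d in zip(state, k1, k2, k3, k4))

def distance(p, q):
    return sum((pi - qi) ** 2 for pi, qi in zip(p, q)) ** 0.5

# Two initial states differing by one part in 10^8 in the x coordinate
a = (1.0, 1.0, 1.0)
b = (1.0 + 1e-8, 1.0, 1.0)
dt, steps = 0.01, 4000          # integrate to t = 40
max_sep = 0.0
for _ in range(steps):
    a, b = rk4_step(a, dt), rk4_step(b, dt)
    max_sep = max(max_sep, distance(a, b))

print(f"final separation: {distance(a, b):.2f} (started at 1e-08)")
```

The initial difference is far below anything measurable in a real atmosphere, yet the two runs end up separated by a distance comparable to the size of the attractor itself. This is exactly why perfect determinism does not buy long-term predictability.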
On the basis of this impossibility, Snyder et al. draw a very meaningful conclusion along the lines of the arguments of Mandelbrot and Taleb. Precisely because making exact predictions is not a realistic option, it is all the more important to develop human systems in a way that ensures their survival even when fat-tail events occur. Above all, this means reducing the number of possible stressors[9].
From this it can be deduced that one should trust the most pessimistic of the models, knowing full well that things can turn out even worse, since every model is probably wrong to some degree. One can doubt the reliability of exact forecasts, but this is precisely the reason to position oneself extremely conservatively and to look for better measures than before. In public discourse in particular, this would have the advantage that one could acknowledge the impossibility of exact forecasts and still demand better environmental protection measures.
Of course, it is easy to say that not everything will be as bad as some models predict. After all, we have been doing quite well in the past. But this time it is different, because the worst-case scenario is not just a few hundred million deaths, but the complete uninhabitability of the earth for the human race. If not even existential threats force us to change our behaviour, then we do not deserve to survive.

 


Sources

[1] Boeing, Geoff (2016). Visual Analysis of Nonlinear Dynamical Systems: Chaos, Fractals, Self-Similarity and the Limits of Prediction. Systems, 4(4).

[2] http://www.drpeering.net/white-papers/Art-Of-Peering-The-IX-Playbook.html

[3] Mandelbrot, B. (1997). Fractals and Scaling in Finance: Discontinuity, Concentration, Risk. Springer

[4] Taleb, N. N. (2007). The Black Swan. Random House and Penguin.

[5] Scott E. Page: Understanding Complexity

[6] https://www.climate.gov/maps-data/primer/climate-models

[7] Ehrlich, Paul R. (1968). The Population Bomb. Ballantine Books.

[8] Lorenz, Edward Norton (1963). “Deterministic nonperiodic flow”. Journal of the Atmospheric Sciences.

[9] Snyder, Carolyn W.; Mastrandrea, Michael D.; Schneider, Stephen H. (2011). The Complex Dynamics of the Climate System: Constraints on our Knowledge, Policy Implications and the Necessity of Systems Thinking. In: Philosophy of Complex Systems. Handbook of the Philosophy of Science, Vol. 10, pp. 467–505.