“[S]ome systems … are very sensitive to their starting conditions, so that a tiny difference in the initial push you give them causes a big difference in where they end up, and there is feedback, so that what a system does affects its own behavior.”
— John Gribbin, Deep Simplicity
Jules Henri Poincaré (1854–1912) described what would later be called the butterfly effect in 1908:
A very small cause which escapes our notice determines a considerable effect that we cannot fail to see, and then we say the effect is due to chance. If we knew exactly the laws of nature and the situation of the universe at the initial moment, we could predict exactly the situation of that same universe at a succeeding moment. But even if it were the case that the natural laws had no longer any secret for us, we could still only know the initial situation *approximately*. If that enabled us to predict the succeeding situation with *the same approximation*, that is all we require, and we should say that the phenomenon had been predicted, that it is governed by laws. But it is not always so; it may happen that small differences in the initial conditions produce very great ones in the final phenomena. A small error in the former will produce an enormous error in the latter. Prediction becomes impossible, and we have the fortuitous phenomenon.
Dynamic adaptive systems are collections of interactions rather than extrapolations of individual interactions. They also become spring-loaded, even if we do not know which interactions will release the spring. Interactions yield outputs that feed back into the overall system, and this feedback can supply the push that produces a non-linear output no one could easily see or predict at the start. Call it a Black Swan if you like; the point is that, despite our knowledge and our systems, the response remains generally unpredictable.
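The sensitivity Poincaré describes is easy to see numerically. The sketch below (my illustration, not from the original text) uses the logistic map, a standard toy model of a chaotic feedback system: each output is fed back in as the next input, and two starting points that differ by one part in a billion soon follow unrelated paths.

```python
def logistic_trajectory(x0, r=4.0, steps=50):
    """Iterate the logistic map x -> r * x * (1 - x) from x0.

    Each step feeds the previous output back in as input, which is
    the feedback loop described above. At r = 4 the map is chaotic.
    """
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1.0 - xs[-1]))
    return xs

a = logistic_trajectory(0.3)
b = logistic_trajectory(0.3 + 1e-9)  # a "very small cause"

# The gap between the two trajectories grows roughly exponentially
# before saturating at order 1: a small error in the initial
# conditions has become an enormous error in the outcome.
for n in (0, 10, 25, 50):
    print(n, abs(a[n] - b[n]))
```

Knowing the law exactly (the formula for the map) does not help: because the starting point is only known *approximately*, the prediction still fails after a few dozen steps.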