🤔 Thinking In Systems Notes 🤔

A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time. 

Using ecology as its best example of complex systems, the book brings to light a fundamental understanding of how interconnected the world is.

Overall, this book offered a new perspective on looking at the world and understanding nonlinearity. 

I know this may sound like a fancy math word, but the world is inherently nonlinear, yet most of us have been taught since childhood that the world is conveniently linear. 

The problem is, it isn’t.

Understanding that fact will give you more appreciation for life, business and the miracle of being alive. 

👇 Here are a couple of interesting things I learned from Thinking In Systems 👇

  1. Causal Opacity – This is something I found very interesting in Nassim Taleb’s books (specifically Fooled By Randomness). Assuming that you know what caused event B is a fool’s errand. In the real world, the causes of an outcome are multivariate and often involve long delays in interconnected feedback loops. 
  2. Second Order Effects Can Be Huge – This comes down to what most people know about Chaos Theory: due to the interconnectedness of systems, small changes can lead to large systemic changes much later on. 
  3. Interventionism is a Dangerous Game – Intervening in complex systems with simple fixes (think economics, medicine or alternative therapies) often causes more harm than good.
  4. Static Growth > Dynamic Growth – There is a strong bias in society, life and economics against self organization, which often produces heterogeneity and unpredictability. Much of society is built around the principle of keeping people in place for the sake of stability and productivity. This isn’t on purpose or sinister, it’s just an emergent property of a system that values static growth over dynamic growth. Antifragility > fragility.

Below, the book is broken down in point form into relevant sections. 

But before we start…remember, always, that everything you know, and everything everyone knows, is only a model. The real world is too complex to understand. 

We can’t control systems or figure them out. But we can dance with them.  🕺🕺

Understanding Causality 🎯

  • Systems themselves cause their own inherent behaviour. They are not caused by outside forces. We just screw up the causality.  Example – The flu virus does not attack you; you set up the conditions for it to flourish within you
  • Another –  Competitors rarely cause a company to lose market share. They may be there to scoop up the advantage, but the losing company creates its losses at least in part through its own business policies.
  • Because of feedback delays in complex systems, by the time a problem becomes obvious it’s often very difficult to solve
  • Complex systems have causal opacity = most important part of the book
  • Since the scientific revolution, our strong analytical emphasis has made us think the problem with many things is “out there” rather than “in here” = overly simplifying causality. Instead of looking at individual elements, it’s important to look at the interconnections between them. It’s easier to learn about a system’s elements, which are observable, than its interconnections.
  • If A causes B, what if B also causes A?
  • Systems can have multiple feedback loops which pull them in various directions.
  • Our models are always limited as we don’t have the cognitive capacity to keep so many variables in our heads. We are running an outdated computer trying to process a TON of information. This is why preexisting beliefs and established ways of thinking are so attractive. They make the world linear.
  • Why are nonlinear systems so important? They change the relative strengths of feedback loops, so one subsystem can have unexpectedly exponential effects (see the sketch below)
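
A minimal sketch of that idea in plain Python (my own toy example, not from the book): the logistic map is a single nonlinear feedback rule, and a small change in one parameter shifts which behaviour dominates – steady state, oscillation, or chaos.

```python
# Minimal sketch: a nonlinear feedback rule (the logistic map) where a small
# change in one parameter flips the system's behaviour. Parameters are
# illustrative, not from the book.

def simulate(r, x0=0.2, steps=50):
    """Iterate x -> r * x * (1 - x): growth reinforced at low x, damped near capacity."""
    x = x0
    trajectory = [x]
    for _ in range(steps):
        x = r * x * (1 - x)
        trajectory.append(x)
    return trajectory

for r in (2.8, 3.2, 3.9):  # small changes in the feedback strength r
    tail = simulate(r)[-4:]
    print(f"r={r}: last few values -> {[round(v, 3) for v in tail]}")
# r=2.8 settles to one value, r=3.2 oscillates between two, r=3.9 looks erratic.
```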

Complex System Delays 🐢

  • We concentrate too heavily on stocks rather than flows. People often focus too much on inflows and not on outflows, and underestimate the time an inflow requires.
  • An optimized system must be considered in terms of its outflows and inflows. There will be natural losses through outflows, so there must be new inflows to maintain equilibrium.
  • Shifting Feedback Dominance – when one feedback loop has a larger influence and over time sways the stock toward its own direction
  • Feedback loops are sometimes slow to react, and changes can cause oscillations in the entire system. This is because elements of a system make predictions about a change and adjust accordingly; individual components may be adjusting up or down to calibrate themselves
  • Bigger delays in the loop cause larger oscillations (see the sketch after this list).
  • Argument Against Over-Leveraged Actions – > when one tries to fix something in a complex system, or anticipates one particular feedback loop, and uses a strongly levered response – > the results can be much larger than anticipated
  • Very large systems, with interconnected industries responding to each other through delays, entraining each other in their oscillations, and being amplified by multipliers and speculators, are the primary cause of business cycles. It’s just many, many feedback systems all tied together.
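
Here is a small illustrative sketch of the delay-and-oscillation point above (toy numbers of my own, not a model from the book): a balancing loop corrects a stock toward a target, but it reacts to the gap as it was a few steps ago, and the longer the delay, the bigger the overshoot.

```python
# Minimal sketch: a balancing loop that corrects a stock toward a target,
# but reacts to the gap as it was `delay` steps ago. Numbers are invented.

def run(delay, steps=40, target=100.0, adjust=0.5):
    stock = 50.0
    history = [stock]                            # past values, so we can look back `delay` steps
    for _ in range(steps):
        perceived = history[max(0, len(history) - 1 - delay)]
        stock += adjust * (target - perceived)   # corrective inflow based on stale information
        history.append(stock)
    return history

for delay in (0, 2, 5):
    peak = max(run(delay))
    print(f"delay={delay}: peak stock {peak:.1f} (target 100)")
# With no delay the stock glides up to the target; a delay of 2 overshoots and
# rings down; a delay of 5 oscillates out of control.
```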

Complex System Dynamics  🦠

  • Competitive Exclusion Principle – in a complex system, if the winner is given more opportunities to win, that only reinforces the same cycle. In other words, the winner keeps winning more. 
  • Reductionism – The behavior of a system cannot be known just by knowing the elements of which the system is made. Reductionist arguments don’t work in naturally complex systems.
  • Great Quote – You think that because you understand “one” you must also understand “two”, because one and one make two. But you forget that you must also understand “and”. 
  • Balancing Feedback Loop – these are goal-seeking or stability-seeking feedback loops
  • Reinforcing Feedback Loop – enhances whatever direction of change is imposed on it. Reinforcing feedback loops are self-enhancing or self-destructive; they lead to exponential growth or to ruin (both loop types are sketched after this list)
  • Self Organization – the capacity of a system to make its own structure more complex. Like self similarity it often consists of many interconnected webs of feedback loops many built on the same foundations but connected slightly differently – like a snowflake
  • Thinking in Models – > Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head—my mental models. None of these is or ever will be the real world.
  • Layers of Limits – Within each system there are limiters that stop it from growing. These can be positive or negative. The disappearance of a biological predator will cause pest populations to explode, or a lack of production will massively limit company growth. Limits exist in layers and may appear at different times depending on the stock of various subsystems. Takeaway here – limits are inherent in all sufficiently complex systems, although some remain hidden. 
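
A quick sketch of the two loop types mentioned above, with made-up rates, purely for illustration:

```python
# Minimal sketch contrasting the two loop types with invented rates:
# a reinforcing loop compounds on itself; a balancing loop seeks a goal.

def reinforcing(stock=10.0, gain=0.10, steps=30):
    """Each step the stock grows in proportion to itself -> exponential growth."""
    for _ in range(steps):
        stock += gain * stock
    return stock

def balancing(stock=10.0, goal=100.0, rate=0.25, steps=30):
    """Each step the stock closes a fraction of the gap to the goal -> goal seeking."""
    for _ in range(steps):
        stock += rate * (goal - stock)
    return stock

print(f"reinforcing loop after 30 steps: {reinforcing():.1f}")   # ~174.5, and still accelerating
print(f"balancing loop after 30 steps:   {balancing():.1f}")     # ~100, and it stays there
```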

Node/SubSystem Theory 🧮

  • Many interconnections are flows of information or feedback loops between various nodes of a system. E.g. – in a university, students may share info on which course is the easiest to get a good grade in. Information is shared via nodes on the network that in turn create second order effects. 
  • Functions and purposes are even harder to determine in a complex system than interconnections. Purposes can only be deduced from observed behaviour.
  • Often subsystems create emergent properties in the complex system as a whole – Ex – drug use can be broken down into many subsystem problems
  • Systems exist within systems. Example – University: students’ purpose – > get good grades; professors’ – > get tenure; the University’s – > make tuition fees. The results may not align with the stated purpose of the University as a whole
  • Maintaining subsystem and overall system unity is KEY. Without alignment, harmony will be hard to achieve
  • Harmony = Aligning subsystem and overall system objectives. This is why a country united under one purpose is so powerful. 
  • Sometimes a subsystem can become maladaptive (think cancer) and its goals can conflict with the total system’s goals. However, this maladaptive part always started as a subsystem with the same lower-level components
  • Side effects are elements of feedback systems that elude our understanding. They are part of a subsystem and don’t occur without cause; we just don’t understand the feedback loops involved. Nice reframe – “side effect” translates to “we don’t have the ability to understand this complex system.” 

 Antifragility vs Fragility 💪

  • Diverse systems are antifragile, limited input systems are more fragile
  • The power of systems is really in resilience – resilience arises from a rich structure of many feedback loops that can work in different ways to restore a system even after a large perturbation. AKA antifragility (see the sketch after this list)
  • Resilience doesn’t mean being static; dynamic systems are often more resilient. Since change is required to keep systems stable, people optimizing for single stock outputs sometimes sacrifice resilience for stability, undermining the entire system. Example – replacing forests with single crops, injecting cattle with growth hormones
  • You can think of resilience as a plateau upon which the system can play, performing its normal functions in safety.
  • Systems need to be designed not just for stability or productivity but for resilience. This is the value of hormesis in daily life. Sauna/Fasting/Exercise 🙂
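
A toy sketch of resilience as redundant feedback (the numbers and the “broken loop” scenario are my own, not the book’s): both systems pull a stock back toward a goal after a shock, but when the correction is spread across several independent loops, losing one still leaves the system able to recover.

```python
# Minimal sketch: resilience as redundant balancing loops. Both systems try to
# pull a stock back to 100 after a shock; the "diverse" system spreads the
# correction across three independent loops.

def recover(loop_gains, shock=-60.0, steps=25, goal=100.0):
    stock = goal + shock
    for _ in range(steps):
        correction = sum(g * (goal - stock) for g in loop_gains)
        stock += correction
    return stock

single  = [0.3]                 # one loop doing all the work
diverse = [0.1, 0.1, 0.1]       # same total strength, spread across three loops

print(f"single loop, intact:          {recover(single):.1f}")
print(f"diverse loops, intact:        {recover(diverse):.1f}")
print(f"single loop, loop broken:     {recover([]):.1f}")            # nothing restores it
print(f"diverse loops, one broken:    {recover(diverse[:-1]):.1f}")  # still heals
```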

Society/Economic/Political Feedback Loops 🔵

  • A new leader can evoke change, but if the gravity of subsystems is too strong, the effectiveness of that change is massively limited
  • Stocks are the foundation of a system. They are a measurement of some set of variables or a single variable.
  • Stocks change via filling or draining – increases or decreases in the variable in question. A stock is the present memory of the changing flows within a system
  • If the inflows and outflows are the same you achieve dynamic equilibrium (sketched after this list).
  • Cycles don’t come from presidents, although presidents can do much to ease or intensify the optimism of the upturns and the pain of the downturns
  • Each stock can have multiple feedback loops pulling on it from various directions and levels of effort. These can have very unintended consequences when one group exerts a disproportionate amount of effort or sway.  A government policy to stop something can create unexpected and unpleasant second order effects. Think banning abortion – > massive increase in illegal unsafe abortions 
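
A tiny sketch of the stock-and-flow bookkeeping behind dynamic equilibrium (illustrative numbers of my own): the stock only holds steady when inflow equals outflow, even though material keeps moving through it.

```python
# Minimal sketch of a stock as a running sum of (inflow - outflow).
# Equal flows -> dynamic equilibrium; unequal flows -> the stock drifts.

def step_stock(stock, inflow, outflow):
    return stock + inflow - outflow

stock = 200.0
for t in range(10):
    stock = step_stock(stock, inflow=15.0, outflow=15.0)   # equal flows -> level stock
print(f"after 10 steps of equal flows: {stock:.0f}")        # still 200

for t in range(10):
    stock = step_stock(stock, inflow=15.0, outflow=20.0)   # outflow exceeds inflow -> draining
print(f"after 10 more steps of net outflow: {stock:.0f}")   # down to 150
```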

Self Organization  🏗️

  • Probably the most important concept of the book.
  • Static Growth > Dynamic Growth – There is a strong bias in society, life and economics against self organization which often produces heterogeneity and unpredictability. Much of society is built around the principle of keeping people in place for the sake of stability and productivity. This isn’t on purpose or sinister, it’s just an emergent property of a system that values static growth over dynamic growth. 
  • The Battle of Self Organization  – It’s pretty damn hard to suppress self organization in complex systems and human creativity always endures. It’s a natural product of what we are. Introducing the Red Queen Hypothesis
  • Self organizing complexity can produce the most complicated structures and objects. DNA/RNA creates the human body
  • An idea works in a self organizing principle. One small idea can create thousands of dynamic representations that echo out into eternity. Example – the idea of God
  • All the complexity of the world must arise, ultimately, from simple rules (see the sketch after this list)
  • Systems are organized into a hierarchy. These are ways of organizing bigger parts made up of smaller similar parts. DNA makes up your cells, which make up your organs which make up you, which makes up group identity 
  • Human Systems Mimic Biological Systems – > Corporate systems, military systems, ecological systems, economic systems, and living organisms are arranged in hierarchies. This is a natural evolution of how information is structured in the world
  • Hierarchies are brilliant systems inventions, not only because they give a system stability and resilience, but also because they reduce the amount of information that any part of the system has to keep track of. (lots of caveats to this, don’t shoot me please)
  • Hierarchies are more strongly connected to relevant and nearby subsystems. Feedback loops to higher- or lower-level subsystems are still there but often have long delays. 
  • Hierarchies function from the lowest levels up. The original purpose of a hierarchy is always to help its originating subsystems do their jobs better. This is something, unfortunately, that both the higher and the lower levels of a greatly articulated hierarchy can easily forget.
  • Self-organizing, nonlinear, feedback systems are inherently unpredictable. They are not controllable. They are understandable only in the most general way. This is why centrally planned anything never works.
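
As a sketch of “simple rules producing complex structure” (my own example; the book doesn’t use this one), here is an elementary cellular automaton, Rule 30: each cell’s next state depends only on itself and its two neighbours, yet the pattern that emerges is anything but simple.

```python
# Minimal sketch of "simple rules -> complex structure": Rule 30, an elementary
# cellular automaton. Each cell looks only at itself and its two neighbours.

RULE = 30
WIDTH, STEPS = 41, 20

row = [0] * WIDTH
row[WIDTH // 2] = 1                      # start from a single "on" cell

for _ in range(STEPS):
    print("".join("#" if c else " " for c in row))
    # next state = the bit of RULE indexed by the 3-cell neighbourhood (left, self, right)
    row = [
        (RULE >> (row[(i - 1) % WIDTH] * 4 + row[i] * 2 + row[(i + 1) % WIDTH])) & 1
        for i in range(WIDTH)
    ]
```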

Interventionism and Central Node Theory  👨‍⚕️

  • Central Control never works. In complex systems top down control becomes detrimental as feedback and natural function between subsystems breaks. This is why self organizing Capitalism is the best system we’ve had to date. (shots fired here) 
  • BUT some element of self control is needed – To be a highly functional system, hierarchy must balance the welfare, freedoms, and responsibilities of the subsystems and total system—there must be enough central control to achieve coordination toward the large system goal, and enough autonomy to keep all subsystems flourishing, functioning, and self-organizing. Basically, you need an arbiter of truth, justice and equality. 

Adaptive Blindness 🙈

  • Important detail – > You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy. In short… zooming out instead of zooming in gives you a better perspective. 
  • Most of the world is a black box where we are only able to see some outputs and inputs. The level of complexity is too much for us and causal opacity blinds us into making linear based causality assumptions
  • Short term explanations of complex system based oscillations are attractive because they offer a false sense of understanding for knowledge that can otherwise only be attained by looking at long term patterns. Think – > fear/greed cycle
  • Reductionism is seductive, so is linear thinking. 
  • Behaviour-based models mostly triumph over event-based models. (mostly)
  • There are no boundaries in this world. Let that sink in…. All systems are interconnected. We only create these words, thoughts, perception, and social agreements to allow us to understand an otherwise complex world.
  • Creating these boundaries can create problems that don’t actually exist. There is no single, legitimate boundary to draw around a system. The demarcation of where a system ends and begins is often created apart from reality. Boundaries are idiosyncratic fictions, and the boundaries we do draw are themselves dynamic. 
  • We often get attached to boundaries that we have created regarding various systems. These are maintained by social and interpersonal structures.
  • Most of the world is run or ruined by bounded rationality – > people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system.
  • Even with the information we DO have, we don’t interpret it objectively. We live in an exaggerated present—we pay too much attention to recent experience and too little attention to the past, focusing on current events rather than long-term behavior.
  • The bounded rationality of each actor in a system may not lead to decisions that further the welfare of the system as a whole. Basically people know different things, so act differently. 
  • The economics of a role will change the actor to be part of that subsystem. This is why people become complacent or act in opposition to their values. Your decisions become part of a feedback loop that you have been placed into. You have no choice. Economics is the master of us all, behavioural and financial. 
  • The Catch 22 of Bounded Rationality – Boundaries are problem-dependent, evanescent, and messy; they are also necessary for organization and clarity. The catch 22 of bureaucratic and organizational processes
  • A better way to fix systems is to not focus on the tail effects and instead focus on bounded rationality. Where is information missing or assumptions made? Example – if population rates aren’t increasing do you ban abortions or improve housing standards/work conditions
  • We can never fully understand our world, not in the way our reductionist science has led us to expect. Our science itself, from quantum theory to the mathematics of chaos, leads us into irreducible uncertainty. For any objective other than the most trivial, we can’t optimize; we don’t even know what to optimize.

Other Concepts That I Like  ✅

  • Boiled Frog Syndrome – A frog added to cold water will happily stay there as the water slowly heats to boiling. Not sure if this is true but it’s a good example of drift to low performance (eroding goals): as perceived measures of performance go lower and lower, so does the expected standard. It’s a self-reinforcing cycle (sketched at the end of this list)
  • Escalation Trap – a self-reinforcing loop of behaviour in which each step encourages further achievement or attainment of something. Think nuclear arms, money or status
  • Negative Intervening Loops – > an intervention into a system can further weaken the very capacity it was originally built to fix. Example – medicine crowding out health habits 
  • Causal Reductionism: Things rarely happen for just 1 reason. Usually, outcomes result from many causes conspiring together. But our minds cannot process such a complex arrangement, so we tend to ascribe outcomes to single causes, reducing the web of causality to a mere thread.
  • Cumulative Error: Mistakes grow. Beliefs are built on beliefs, so one wrong thought can snowball into a delusional worldview. Likewise, as an inaccuracy is reposted on the web, more is added to it, creating fake news. In our networked age, cumulative errors are the norm.
  • Concept Creep: As a social issue such as racism or sexual harassment becomes rarer, people react by expanding their definition of it, creating the illusion that the issue is actually getting worse
  • Woozle Effect: An article makes a claim without evidence, is then cited by another, which is cited by another, and so on, until the range of citations creates the impression that the claim has evidence, when really all articles are citing the same uncorroborated source.
  • Nirvana Fallacy: When people reject a thing because it compares unfavorably to an ideal that in reality is unattainable. E.g. condemning capitalism due to the superiority of imagined socialism, condemning ruthlessness in war due to imagining humane (but unrealistic) ways to win.
  • Matthew Principle: Advantage begets advantage, leading to social, economic, and cultural oligopolies. The richer you are the easier it is to get even richer, the more recognition a scientist receives for a discovery the more recognition he’ll receive for future discoveries, etc.
  • Peter Principle: People in a hierarchy such as a business or government will be promoted until they suck at their jobs, at which point they will remain where they are. As a result, the world is filled with people who suck at their jobs.
  • Loki’s Wager: Fallacy where someone tries to defend a concept from criticism, or dismiss it as a myth, by unduly claiming it cannot be defined. E.g. “God works in mysterious ways” (god of the gaps), “race is biologically meaningless” (Lewontin’s fallacy).
  • Quantity over Quality – > our society believes that what we can measure is more important than what we can’t measure. It’s easy to measure what is measurable and forget what is important – like values, ethics and personal responsibility.
  • There’s something within the human mind that is attracted to straight lines and not curves, to whole numbers and not fractions, to uniformity and not diversity, and to certainties and not mystery.
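
To close, a toy sketch of the drift-to-low-performance / boiled frog dynamic from the list above (parameters invented for illustration): if the goal is re-anchored to recent perceived performance rather than to an absolute standard, both slide downward together.

```python
# Minimal sketch of "drift to low performance" / eroding goals, with invented
# numbers: performance erodes by a constant amount each step, and effort only
# corrects toward whatever the current goal is.

def drift(anchor_to_standard, standard=100.0, steps=40, erosion=2.0, effort=0.3):
    performance, goal = standard, standard
    for _ in range(steps):
        performance += effort * (goal - performance) - erosion   # constant erosion, corrective effort
        # the goal is a blend of the fixed standard and what we've "gotten used to"
        goal = anchor_to_standard * standard + (1 - anchor_to_standard) * performance
    return performance

print(f"goal anchored to the standard:   {drift(1.0):.1f}")
print(f"goal anchored to recent results: {drift(0.2):.1f}")
# The eroding-goal case settles noticeably lower than the fixed-standard case.
```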