Donella Meadows
Review
This book has the most captivating opening chapter I've ever read. In just 25 pages, the author skilfully explains the fundamental concepts of systems thinking using simple language and numerous examples. Moreover, she successfully ignites your enthusiasm for perceiving the world from a fresh perspective. The most talented people I work with are all intuitive systems thinkers. They have the ability to see the big picture and identify high-leverage intervention points, which makes them disproportionately effective. If that doesn’t sound like you, then it might be worth reading this book!
Key Takeaways
The 20% that gave me 80% of the value.
- A system is a set of interconnected things that produce their own pattern of behaviour.
- A system is an interconnected set of elements organised to achieve something. Systems have three key components: elements, interconnections, and function or purpose.
- A system is more than the sum of its parts; even simple systems can produce complex behaviours. For example, non-biological systems can exhibit attributes like adaptability, goal-seeking tendencies and self-preservation.
- Elements are often the most visible but least influential component of a system. It is often more informative to examine the interconnections and relationships between elements.
- Systems often rely on information exchange (flow of information or signals); this exchange shapes their behaviour. Some interconnections in systems though are actual physical flows.
- Functions or purposes are harder to see but are the strongest determinant of the system’s behaviour. Interconnections are also critically important.
- ‘Stocks and Flows’ help us understand behaviour over time. Stocks represent accumulated quantities within a system (material or information). Stocks are altered by flows, and are determined by the balance of inflows and outflows. Stocks don't change instantly; there's a delay based on the rate of the flows. Stocks provide stability within systems, absorbing sudden changes to inflows/outflows.
- Feedback loops are when a change in a stock triggers adjustments in its inflows / outflows, altering the stock itself. Feedback loops can stabilise stock levels, create growth, or cause decline.
- Stabilising Loops (or Balancing Feedback Loops): function to maintain a system within a desired range, ensuring stability and preventing significant changes.
- Runaway Loops (or Reinforcing Feedback Loops) amplify change, leading to either exponential growth or runaway collapse. They work by enhancing the existing direction of change. Example: Erosion leads to less plant growth, further increasing erosion.
- Feedback isn’t instant, so feedback loops have inherent delays. A flow can’t react instantly to another flow; it can only react to a change in a stock.
- In a stock-maintaining balancing feedback loop, it is possible for dominance to shift. When one loop becomes stronger or weaker, this shift can dramatically change the stock. For example: Birth and death rates can impact population growth patterns.
- Systems models are not meant for precise predictions. Instead, they are used to explore different possibilities and "what if" scenarios to understand how the system might behave under various conditions.
- Feedback structures dictate behaviour: Systems with similar underlying feedback structures will exhibit similar behaviours, even if they outwardly seem very different.
- Delays cause oscillations. Delayed responses within balancing feedback loops create cycles of over and under correction (like inventory swinging above and below the target).
- Understanding the nature (information vs. physical) and length of delays is essential for understanding the system's dynamics.
- Relevant delays if you’re managing inventory:
- Perception (averaging sales to gauge trends)
- Response (gradual order adjustments)
- Supply chain (time for orders to arrive)
- All growing physical systems face constraints: No physical system can grow indefinitely. Look for reinforcing loops (driving growth) and balancing loops (limiting growth).
- Nonrenewable Resources (e.g. Natural Gas Field) are subject to depletion dynamics. They are stock limited. Stock can be extracted at any rate (limited by extraction capital). Since the stock is not renewed, the faster the extraction rate the shorter the lifetime of the resource.
- Renewable Resources (e.g. fish) are flow-limited, not stock-limited: Renewable resources can provide indefinitely BUT only up to their natural regeneration rate. Overshoot is possible. Extracting beyond the renewal rate can damage the resource's ability to replenish itself, leading to decline or even collapse. There are three potential outcomes:
- Sustainable equilibrium: Harvest rate matches regeneration.
- Oscillation: Over and under harvesting around the sustainable level.
- Collapse: Resource population is damaged beyond a critical threshold, dragging down the industry.
- The resource's critical threshold for damage and the strength of the feedback loop limiting industry growth in response to resource scarcity determine the system's fate.
- Promoting resilience, self-organisation and hierarchy can improve a system’s ability to function well over the long term.
- Resilience: A system's ability to recover and persist within a changing environment. The opposite is brittleness or rigidity. Preserve a system’s inherent restorative abilities by allowing for some flexibility; imposing rigid stability can reduce your ability to respond to shocks.
- Self-Organisation: A system's capacity to generate complexity and new structures from within. Simple rules within a system can give rise to highly diverse and sophisticated structures. Don’t suppress self-organisation in favour of short-term efficiency, as it will reduce your ability to innovate. Foster freedom, some chaos, and experimentation, as these encourage self-organisation.
- Hierarchy: Systems are often organised into nested subsystems (ex: cells -> tissues -> organs). This arrangement is widespread in nature. Hierarchies are prevalent because they allow stability, efficient information flow, and subsystems to thrive while serving the larger whole. Decomposability allows us to study subsystems in isolation, BUT evolving systems will require shifting your thinking with them as they become more integrated.
Systems can be surprising because:
- We often mistake individual events (oil found, forest cut) for the whole explanation of how the world works. Much real-world analysis is confined to superficial event-level explanations (linear cause and effect), which can't help you predict or change system behaviour. It's more useful to look at how events accumulate into patterns of behaviour over time (growth, decline, oscillations, etc.). The behaviour of a system is determined by its underlying structure: stocks, flows, and feedback loops.
- We think linearly, when many systems have non-linear dynamics, where the change in effect is not directly proportional to the change in cause. Nonlinearities in feedback systems can quickly produce shifting dominance of loops and many complexities in system behaviour.
- System boundaries aren’t always clear. Boundaries are artificial; systems don't have inherent boundaries. We draw them. The right boundary depends entirely on your focus. We need boundaries for clarity, but must remember they are choices we make, not inherent truths within the system.
- There are layers of limits. Liebig's Law of the Minimum: Growth depends on identifying which resource is currently most limiting. Adding more of a plentiful resource does nothing when another key input is in short supply.
- Delays are everywhere: Every stock embodies a delay (the time between inflow and outflow), and most flows take time to complete (shipping, processing, even information flow isn't truly instantaneous). Decisions based on outdated information lead to responses that are off-target, too big/small relative to current reality. If responses are too quick, we risk amplifying short-term noise and increasing instability rather than making meaningful corrective adjustments. Dynamic issues like overshoots, oscillations, and even outright collapse often stem from delays within feedback loops. The obvious mitigation to delays is foresight. When long delays exist, the ability to anticipate changes through awareness of those delays becomes critical for taking proactive action.
- Bounded Rationality. We make reasonable decisions with the limited information we have. We can't be perfect optimisers because we lack complete knowledge and a full view of the system, especially the distant parts. To overcome these limits, we must address the underlying system itself. By bettering the information flow and overall structure, we can make decisions that serve the whole system, not just our isolated positions within it.
Traps and opportunities:
- Policy Resistance: can arise when different actors pull a system stock toward differing goals. New policies, even if effective, pull the stock further from the goals of the other actors and produce additional resistance.
- Way out: Encourage actors to channel the energy they put into resistance into a larger, more important goal everyone can pull toward; find a way for everyone’s goals to be realised.
- Trap: Tragedy of the commons. When everyone benefits from the use of a shared resource, but also shares the cost of its abuse.
- Way out: Educate users so they understand the consequences of abusing the resource, and create a feedback loop that corrects for overuse (e.g. limiting use or privatisation).
- Trap: Success to the Successful: Winners of a competition are rewarded with the means to win again, causing a reinforcing feedback loop where they win perpetually.
- Way out: Losers can play different games. Limit the rewards to the winner. Policies that level the playing field (e.g. handicaps). Policies that devise rewards for success that do not bias the next round of the competition (e.g. spending caps in F1).
- Trap: Shifting of the burden to the intervenor: Quick fixes mask deeper issues, solutions that only address the symptoms of a problem don't actually fix the root cause. These "band-aid" solutions can become addictive, as more and more is needed to cover up the same problem.
- Way out: Avoid getting into the loop, avoid symptom-relieving or signal-denying policies that don’t address the problem. Focus on long-term restructuring not short term relief.
- Trap: Rule Beating: Pervasive rule beating gives the appearance of obeying the rules or achieving the goals, but distorts the system.
- Way out: Design rules to release creativity not in the direction of beating the rules, but in the direction of achieving the purpose of the rules.
Places to intervene in a system. Ranked from least to most effective:
- Changing numbers (constants and parameters) is often the least effective way to intervene in a system.
- Changing the size of buffers (stocks relative to flows) can impact system stability.
- Stock-and-flow structures (physical systems and nodes of intersection) play a significant role in system operation.
- Delays in how a system gets information about itself can make it unstable.
- Balancing feedback loops are essential for a system to self-correct.
- Reinforcing feedback loops drive one-way growth and can lead to system collapse if unchecked.
- Information flows, including access to information, can have a powerful impact on system behaviour.
- Rules, including incentives and constraints, shape a system's scope and behaviour.
- Self-organization allows a system to add, change, or evolve its structure.
- Goals define the purpose or function of the system and can be leveraged for change.
- Paradigms, or shared beliefs, underlie a system and can be influential in driving its behaviour.
- Transcending paradigms and staying flexible can lead to enlightenment and the ability to choose one's own path.
- Systems thinking teaches us its own limitations. Self-organising, non-linear feedback systems are unpredictable and hard to control. We can only understand them at a general level; we can’t use that understanding to predict the future.
Systems wisdoms:
- To understand a system, observe its behaviour over time before intervening, focusing on data, not assumptions.
- Making mental models explicit through diagrams or equations enhances clarity and forces us to confront our assumptions.
- Information integrity is crucial for system functionality. Transparent information is a powerful agent for change.
- Language shapes our understanding and our realities, influencing the way we interact with and perceive the world.
- We have a bias for focusing on what we can measure, which can distort our understanding of what’s truly important.
- Don’t maximise parts of the system or sub-systems while ignoring the whole.
- Don’t be an unthinking intervenor, don’t destroy the systems own self-maintenance capacities. Pay attention to the value of what’s already there.
- ‘Intrinsic responsibility’ in systems design ensures decision-makers feel the consequences of their choices, fostering accountability. Systems often fail to hold individuals accountable.
- Stay humble, stay a learner. Don’t trust your intuition, but trust yourself to be able to figure it out. If you don’t know: Don’t bluff. Don’t freeze. Learn. Learn through experimentation and embrace your errors.
- Follow a system wherever it leads, you’ll have to cross the boundaries of traditional disciplines and take an interdisciplinary approach.
Deep Summary
Longer form notes, typically condensed, reworded and de-duplicated.
Introduction: The System Lens
Managers are not confronted with problems that are independent of each other, but with dynamic situations that consist of complex systems of changing problems that interact with each other. I call such situations messes.… Managers do not solve problems, they manage messes. Russell Ackoff
- Systems thinking helps us identify the root causes of problems and find new opportunities.
- A system is a set of interconnected things that produce their own pattern of behaviour.
- Complex systems are often resistant to change despite our efforts.
- Systems thinking offers a new perspective to solve complex problems. Systems thinking can complement more traditional reductionist thinking (linear reasoning, e.g. from cause to effect)
- Systems thinking is widely applicable, but a few common structures, problems and behaviours called archetypes exist.
Part 1: System Structure and Behaviour
Chapter 1: The Basics
- A system is more than the sum of its parts
- A system is an interconnected set of elements organised to achieve something.
- Systems have three key components: elements, interconnections, and function or purpose.
- Systems can be embedded within other systems.
- A system is more than the sum of its parts → complex behaviour can emerge (e.g. adaptive, dynamic, goal-seeking, self-preserving, evolutionary).
- There is an integrity or wholeness about a system.
- Non-biological systems can exhibit attributes associated with biological systems, such as adaptability, dynamic behaviour, goal-seeking tendencies, self-preservation, and occasionally evolutionary behaviour.
- Look beyond the visible elements of a system to understand what’s happening.
- Elements are often the most visible but least influential component of a system. It is often more informative to examine the interconnections and relationships between elements.
- Systems often rely on information exchange (flow of information or signals); this exchange shapes their behaviour. Some interconnections in systems though are actual physical flows.
- Functions or purposes are harder to see, the best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
- Example: the purpose of a national economy is to grow larger.
- There can be purposes within purposes. Keeping sub-purposes aligned to parent system purposes is an essential function of successful systems.
- The purpose is the strongest determinant of the system’s behaviour. Interconnections are also critically important. Changing relationships usually changes system behaviour too.
- Stocks and Flows: Understanding behaviour over time is key to Systems Thinking.
- Stocks: Represent accumulated quantities within a system (material or information). They reflect the system's history over time.
- Stocks change: They are altered by flows, serving as the system's "memory" of past change.
- Stock levels are determined by the balance of inflows and outflows. Inflows increase the stock, outflows decrease it.
- You can increase a stock by increasing inflows or decreasing outflows.
- Stocks don't change instantly; there's a delay based on the rate of the flows.
- Stocks provide stability within systems, absorbing sudden changes to inflows/outflows.
- Systems thinking focuses on stocks and the feedback processes that regulate their levels.
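The bathtub-style mechanics above can be sketched in a few lines of Python (my own toy model, not from the book; the numbers are illustrative): a single stock updated each time step by the net of its inflow and outflow.

```python
def simulate_stock(initial, inflow, outflow, steps):
    """Advance a single stock by (inflow - outflow) each time step."""
    stock = initial
    history = [stock]
    for _ in range(steps):
        stock += inflow - outflow  # the stock 'remembers' every past net flow
        history.append(stock)
    return history

# Inflow exceeds outflow, so the stock rises -- but only gradually, at the
# pace set by the flows; it can never jump instantly.
levels = simulate_stock(initial=50, inflow=5, outflow=3, steps=10)
print(levels[-1])  # 70, i.e. 50 + 10 * (5 - 3)
```

Raising the inflow or cutting the outflow both raise the stock, which is why the same target level can often be reached from either side.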
- How the System Runs Itself: Feedback
- Persistent behaviour in a stock (growing, declining, stable) often signals a control mechanism at work.
- Feedback loops: Changes in a stock trigger adjustments in its inflows/outflows, altering the stock itself.
- Feedback loops can stabilise stock levels, create growth, or cause decline.
- Feedback loops involve: a stock, decisions/rules/physical laws/ actions based on the stock's level, and a flow that changes the stock.
- Stabilising Loops (also known as Balancing Feedback Loops): These loops function to maintain a system within a desired range, ensuring stability and preventing significant changes. Balancing loops may not always be perfect, as delays, unclear information, or insufficient resources can limit their effectiveness.
- Runaway Loops / Reinforcing Feedback: These loops amplify change, leading to either exponential growth or runaway collapse. They work by enhancing the existing direction of change.
- Examples: Erosion leads to less plant growth, further increasing erosion. Population growth fuels further population growth.
- Most human decisions involve feedback loops, where we consider current information to modify actions.
- Focusing on feedback loops can shift our perspective from blaming elements to understanding how the system itself creates its own behaviour.
Chapter 2: A Brief Visit to the Systems Zoo
- Balancing feedback loops aim to keep a system stable by counteracting change.
- Example: Thermostat/furnace system. A single stock with two competing balancing feedback loops (furnace warms, outside air cools). A thermostat turns heating on/off to maintain a desired temperature.
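A minimal Python sketch of the thermostat system (my own construction; the coefficients are made up): one stock pulled on by two competing balancing loops.

```python
def thermostat(temp, setting, outside, heat_gain=0.3, leak=0.1, steps=50):
    """One stock (room temperature), two balancing loops (furnace, leakage)."""
    for _ in range(steps):
        heating = heat_gain * max(0.0, setting - temp)  # furnace loop: push toward setting
        cooling = leak * (temp - outside)               # leakage loop: pull toward outside
        temp += heating - cooling
    return temp

final = thermostat(temp=10, setting=20, outside=0)
print(round(final, 1))  # 15.0 -- stable, but short of the 20-degree setting
```

The leakage loop never stops draining heat, so the room settles below the thermostat's goal: a stock-maintaining loop has to set its goal with the ongoing outflow in mind.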
- Feedback isn’t instant, so feedback loops have inherent delays. A flow can’t react instantly to another flow; it can only react to a change in a stock. Actions taken now will only impact the system later, so planning needs to account for changes happening during the delay.
- In a stock-maintaining balancing feedback loop you must set the goal with inflows and outflows in mind. If you don't account for changes (like outgoing inventory while orders are pending) you won't reach your target stock level.
- Example: Global population is modelled as a single stock influenced by two key feedback loops: the birth rate (reinforcing) and the death rate (balancing).
- Exponential growth or decline: When the birth rate exceeds the death rate, a reinforcing loop dominates leading to exponential population growth. If the death rate is higher, it would mean exponential decline.
- A stable population occurs when the reinforcing and balancing loops are of equal strength.
- Shifting dominance is common. Factors affecting birth and death rates can change over time. As one loop becomes stronger or weaker, this shift dramatically changes population growth patterns.
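Shifting dominance shows up even in a toy Python version of this population model (illustrative rates, not the book's data): births form the reinforcing loop, deaths the balancing one.

```python
def population(initial, birth_rates, death_rate):
    """One stock; births reinforce growth, deaths balance it."""
    pop = initial
    history = [pop]
    for birth_rate in birth_rates:
        pop += pop * birth_rate - pop * death_rate
        history.append(pop)
    return history

# Birth rate starts above the 2% death rate (reinforcing loop dominates),
# then falls below it (balancing loop takes over): growth turns to decline.
rates = [0.04] * 20 + [0.01] * 20
trace = population(1000, rates, death_rate=0.02)
```

The population peaks exactly at the step where dominance shifts from the reinforcing loop to the balancing one.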
- Three critical questions for model evaluation:
- Are the model's driving factors plausible?
- Does the model accurately reflect how a real system would respond to those factors?
- What deeper systemic forces are influencing those driving factors?
- Systems models are not meant for precise predictions. Instead, they are used to explore different possibilities and "what if" scenarios to understand how the system might behave under various conditions. A model doesn't have to perfectly predict the future to be valuable. What matters is whether it accurately captures the underlying behaviour patterns of the system.
- Managing Stocks: There are two ways to increase a stock within a system:
- Increase inflow (ex: more cars supplied to a dealer)
- Decrease outflow (ex: cars being sold less frequently)
- Feedback structures dictate behaviour: Systems with similar underlying feedback structures will exhibit similar behaviours, even if they outwardly seem very different (ex: population growth patterns have similarities to certain economic models).
- Delays cause oscillations. Delayed responses within balancing feedback loops create cycles of over and under correction (like inventory swinging above and below the target).
- Delays are common in real-world systems and strongly influence their behaviour.
- Understanding the nature (information vs. physical) and length of delays is essential for understanding the system's dynamics.
- Relevant delays if you’re managing inventory:
- Perception (averaging sales to gauge trends)
- Response (gradual order adjustments)
- Supply chain (time for orders to arrive)
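These delays produce oscillation even when the ordering rule is sensible. A toy Python inventory model (my own construction; parameters are illustrative) with a three-step supply-chain delay shows it: every correction acts on stale information.

```python
from collections import deque

def inventory(target=100, initial=50, sales=10, delay=3, steps=40):
    stock = initial
    pipeline = deque([sales] * delay)  # orders already in transit
    history = [stock]
    for _ in range(steps):
        arriving = pipeline.popleft()  # supply-chain delay: old orders land now
        # Order enough to replace sales and close the gap to the target
        # (never a negative order).
        order = max(0, sales + target - stock)
        pipeline.append(order)
        stock += arriving - sales
        history.append(stock)
    return history

trace = inventory()
```

The stock overshoots the target, dips well below it, and keeps cycling above and below 100, even though each individual order is a perfectly reasonable balancing-loop response.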
- Limits to Growth. All growing physical systems face constraints: No physical system can grow indefinitely. Look for reinforcing loops (driving growth) and balancing loops (limiting growth).
- The ‘Limits to Growth’ archetype applies to things like populations, bank accounts, or product sales, all of which will eventually saturate.
- Nonrenewable Resources (e.g. Natural Gas Field) are subject to depletion dynamics. Non-renewable resources are stock limited. Stock can be extracted at any rate (limited by extraction capital). Since the stock is not renewed, the faster the extraction rate the shorter the lifetime of the resource.
- The bigger the initial stock, the harder the fall: Larger resource deposits support higher, faster growth in extraction capital, ultimately leading to a more rapid and drastic decline.
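A rough Python sketch of these depletion dynamics (illustrative parameters, not the book's model): extraction capital grows through reinvestment while the resource stock it feeds on only shrinks.

```python
def extraction_run(resource, capital=1.0, growth=0.1, productivity=0.01, steps=100):
    extracted = []
    for _ in range(steps):
        # Harvest is limited by extraction capital, never more than what's left.
        harvest = min(resource, capital * productivity * resource)
        resource -= harvest
        extracted.append(harvest)
        capital *= 1 + growth  # profits reinvested: the reinforcing loop
    return extracted

small = extraction_run(resource=100)
large = extraction_run(resource=1000)
```

Extraction rises while the reinforcing capital loop dominates, peaks, then crashes as depletion takes over; the larger starting stock supports the higher peak and the harder fall.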
- Renewable Resources (e.g. fish) are flow-limited, not stock-limited: Renewable resources can provide indefinitely BUT only up to their natural regeneration rate.
- Overshoot is possible: Extracting beyond the renewal rate can damage the resource's ability to replenish itself, leading to decline or even collapse.
- Three potential outcomes:
- Sustainable equilibrium: Harvest rate matches regeneration
- Oscillation: Over and under harvesting around the sustainable level
- Collapse: Resource population is damaged beyond a critical threshold, dragging down the industry.
- The resource's critical threshold for damage and the strength of the feedback loop limiting industry growth in response to resource scarcity determine the system's fate.
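The fishery case can be sketched with a logistic regeneration flow (a standard textbook form, not necessarily the book's exact model; all numbers are illustrative):

```python
def fishery(stock=300.0, capacity=1000.0, regen=0.2, harvest_rate=0.1, steps=200):
    for _ in range(steps):
        regrowth = regen * stock * (1 - stock / capacity)  # flow-limited regeneration
        stock = stock + regrowth - harvest_rate * stock
        if stock < 1:  # illustrative critical threshold: population can't recover
            return 0.0
    return stock

sustainable = fishery(harvest_rate=0.1)   # settles at a positive equilibrium
collapsed = fishery(harvest_rate=0.25)    # harvest always exceeds regeneration
```

At a 10% harvest the stock settles at a steady equilibrium; at 25% the harvest exceeds the maximum per-capita regeneration rate (20% here), so the stock falls through the critical threshold and the model returns a collapse.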
Part 2 Systems and Us
Chapter 3: Why Systems Work So Well
- Promoting resilience, self-organisation and hierarchy can improve a system’s ability to function well over the long term.
- Resilience: A system's ability to recover and persist within a changing environment. The opposite is brittleness or rigidity.
- Example: The human body's immune system.
- Even resilient systems have limits beyond which they cannot restore themselves.
- Preserve a system’s inherent restorative abilities by allowing for some flexibility; imposing rigid stability can reduce your ability to respond to shocks.
- Self-Organisation: A system's capacity to generate complexity and new structures from within.
- Example: Organisms growing from a single cell.
- Simple rules within a system can give rise to highly diverse and sophisticated structures.
- Don’t suppress self-organisation in favour of short-term efficiency, as it will reduce your ability to innovate. Foster freedom, some chaos, and experimentation, as these encourage self-organisation.
- Hierarchy: Systems are often organised into nested subsystems (ex: cells -> tissues -> organs). This arrangement is widespread in nature.
- Hierarchies are prevalent because they allow stability, efficient information flow, and subsystems to thrive while serving the larger whole.
- Decomposability allows us to study subsystems in isolation, BUT evolving systems will require shifting your thinking with them as they become more integrated.
Chapter 4: Why Systems Surprise Us
- Everything we know is just a model. Our models aid our understanding but they don’t reflect reality and we’re regularly surprised by how they fall short.
Here are a few reasons why systems are surprising:
- We over index on the importance of events:
- We often mistake individual events (oil found, forest cut) for the whole explanation of how the world works. This focuses on momentary outputs with little predictive power.
- It's more useful to look at how events accumulate into patterns of behaviour over time (growth, decline, oscillations, etc.).
- The behaviour of a system is determined by its underlying structure: stocks, flows, and feedback loops. Think of the Slinky – an event (releasing it) leads to behaviour (oscillation) that's dictated by its coil structure.
- Much real-world analysis is confined to superficial event-level explanations (linear cause and effect), which can't help you predict or change system behaviour.
- Example: Economic analysis often gets stuck at behaviour over time (tracking flows like GNP). This misses two key points: Stocks influence flows through feedback. Knowing a nation's capital stock lets you understand production potential better than just the flow of goods. Statistical flow-to-flow correlations are flawed - changes in stocks drive changes in flows, invalidating purely behaviour-based economic models.
- We think linearly, when many systems have non-linear dynamics
- Nonlinearity describes a cause-and-effect relationship where the change in effect is not directly proportional to the change in cause.
- Nonlinearities are extremely common in real-world systems.
- We have a bias for thinking linearly, which can lead to errors and surprises when we interact with complex systems.
- Nonlinearities in feedback systems can quickly produce shifting dominance of loops and many complexities in system behaviour.
- System boundaries aren’t always clear
- Boundaries are artificial; systems don't have inherent boundaries. We draw them.
- The right boundary depends entirely on your focus.
- We need boundaries for clarity, but need to remember they are choices we make, not inherent truths within the system.
- If boundaries are too narrow you miss vital connections between elements you excluded and the system, leading to surprises.
- If boundaries are too broad, overcomplexity obscures insights relevant to your problem.
- Boundaries aren't static. The right boundary isn't fixed by discipline or geography. It should shift depending on the evolving problems and questions you're tackling.
- There are layers of limits
- We think in terms of simple cause-effect relationships, but systems are full of multiple interacting causes and effects, constantly limited by various factors.
- Liebig's Law of the Minimum: Growth depends on identifying which resource is currently most limiting. Adding more of a plentiful resource does nothing when another key input is in short supply.
- Shifting limits: Growth itself alters what the limiting factor is. Initially a company's growth might be limited by demand, but a successful campaign could easily move the limit to production capacity.
- Success therefore hinges on being able to foresee the upcoming limitation before it fully binds, then adjust the system/plans accordingly.
- As a system grows, it interacts with the factors limiting it, creating a dynamically evolving situation.
- There will always be intrinsic limits to growth. Sometimes the system can self-regulate (internal limits) and if it doesn't, external constraints will ultimately enforce limitations.
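Liebig's Law is almost a one-liner in Python (the inputs and units here are hypothetical):

```python
def growth(inputs):
    """Growth is capped by the most limiting input (the minimum)."""
    return min(inputs.values())

base = {"water": 8, "nitrogen": 3, "light": 10}
print(growth(base))                    # 3: nitrogen is the limit
print(growth({**base, "water": 100}))  # still 3: more water doesn't help
print(growth({**base, "nitrogen": 9})) # 8: now water becomes the limit
```

Relieving one limit simply hands the role of binding constraint to the next-scarcest input, which is the shifting-limits dynamic above.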
- Delays are ubiquitous
- Delays are everywhere: Every stock embodies a delay (the time between inflow and outflow), and most flows take time to complete (shipping, processing, even information flow isn't truly instantaneous).
- Which delays are worth focusing on depends on your objective. To understand short-term fluctuations, small delays might be crucial, while long-term trend analysis may consider a multi-week delay negligible.
- Delays Drive Behaviour: Adjusting a delay in a system (for better or worse) has huge impacts on how the system functions. Delays are policy leverage points.
- Delay problems:
- Decisions based on outdated information lead to responses that are off-target, too big/small relative to current reality.
- If responses are too quick, we risk amplifying short-term noise and increasing instability rather than making meaningful corrective adjustments.
- Dynamic issues like overshoots, oscillations, and even outright collapse often stem from delays within feedback loops.
- The obvious mitigation to delays is foresight. When long delays exist, the ability to anticipate changes through awareness of those delays becomes critical for taking proactive action.
- Bounded Rationality
- We make reasonable decisions with the limited information we have. We can't be perfect optimisers because we lack complete knowledge and a full view of the system, especially the distant parts.
- We are not omniscient, rational optimisers. We are satisficers: we attempt to satisfy our needs and move on to the next decision. We rarely see the full range of possibilities before us, and we often don’t foresee the impacts of our actions on the whole system.
- Merely swapping out people within a system doesn't fix this.
- To overcome these limits, we must address the underlying system itself. By bettering the information flow and overall structure, we can make decisions that serve the whole system, not just our isolated positions within it.
When we think in terms of systems, we see that a fundamental misconception is embedded in the popular term “side-effects.”… This phrase means roughly “effects which I hadn’t foreseen or don’t want to think about.”… Side-effects no more deserve the adjective “side” than does the “principal” effect. It is hard to think in terms of systems, and we eagerly warp our language to protect ourselves from the necessity of doing so. Garrett Hardin, ecologist
Chapter 5: System Traps and Opportunities
- Some system archetypes are perverse. The destruction they cause is often attributed to actors or events, but it’s often a consequence of the system structure. However, there are ways to escape the traps: reformulating goals, weakening or strengthening feedback loops, or adding new ones.
- Trap: Policy Resistance: can arise when different actors pull a system stock toward differing goals. New policies, even if effective, pull the stock further from the goals of the other actors and produce additional resistance.
- Example: Wars on drugs, which don’t work.
- Way out: Encourage actors to channel the energy they put into resistance into a larger, more important goal everyone can pull toward; find a way for everyone’s goals to be realised.
- Example: Ending prohibition in the US.
- Trap: Tragedy of the commons. When everyone benefits from the use of a shared resource, but also shares the cost of its abuse. There’s little feedback from the condition of the resource to the decisions of the resource users, resulting in overuse and ultimately a collapse of the resource.
- Way out: Educate users so they understand the consequences of abusing the resource, and create a feedback loop that corrects for overuse (e.g. limiting use or privatisation).
- Trap: Drift to low performance. There’s a distinction between the actual state of the system and the perceived state. When performance standards are influenced by past performance, this sets up a reinforcing feedback loop of eroding goals that pushes a system toward low performance.
- Way out: Keep performance standards absolute. Make goals sensitive to the best performance of the past, not the worst. Adopt the mindset: "The better things get, the harder I'm going to work to make them even better."
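A small numerical sketch (with invented coefficients) of why absolute standards matter: if the goal is allowed to drift toward recent performance, a reinforcing loop drags both down; holding the standard fixed keeps performance near the target.

```python
# Hypothetical illustration of eroding goals: performance chases the goal
# (with a constant shortfall); in the trap, the goal also chases performance.

def simulate(goal_erodes: bool, steps: int = 50) -> float:
    goal, performance = 100.0, 90.0
    for _ in range(steps):
        performance += 0.5 * (goal - performance) - 2.0  # close half the gap, minus friction
        if goal_erodes:
            goal += 0.25 * (performance - goal)  # trap: standard follows past performance
    return performance

drifting = simulate(goal_erodes=True)    # declines steadily, step after step
absolute = simulate(goal_erodes=False)   # settles just below the fixed goal
```

With the fixed standard, performance converges to a stable level a little below the goal; with the eroding standard there is no stable level at all, only a steady downward drift.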
- Trap: Escalation. When the state of one stock is determined by trying to surpass the state of another stock, a reinforcing feedback loop forms (think of escalating loudness or violence). It leads to extreme behaviour quickly; the spiral is only stopped by someone's collapse.
- Way out: Avoid escalation by refusing to compete and interrupting the feedback loop. Negotiate a new system with balancing loops to control the escalation.
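The escalation loop fits in a few lines (the 10% "one-upmanship" factor is my own illustrative choice): each side sets its stock slightly above the other's, and the pair grow exponentially.

```python
# Hypothetical sketch: two parties, each trying to be ~10% "louder"
# than the other. The reinforcing loop compounds every round.

def escalate(rounds: int, one_up: float = 1.1):
    a = b = 1.0
    for _ in range(rounds):
        a = one_up * b   # A responds to B's last move
        b = one_up * a   # B responds in kind
    return a, b

final_a, final_b = escalate(rounds=10)
# Both stocks grow by a factor of ~1.21 per round: exponential escalation.
```

Refusing to compete corresponds to breaking one of the two assignment lines, which removes the reinforcing loop entirely.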
- Trap: Success to the Successful: Winners of a competition are rewarded with the means to win again, causing a reinforcing feedback loop where they win perpetually.
- Way out: Losers can play different games. Limit the rewards to the winner. Policies that level the playing field (e.g. handicaps). Policies that devise rewards for success that do not bias the next round of the competition (e.g. spending caps in F1).
- Trap: Shifting the burden to the intervenor: Quick fixes mask deeper issues; solutions that only address the symptoms of a problem don't actually fix the root cause. These "band-aid" solutions can become addictive, as more and more is needed to cover up the same problem. Increased dependence on the quick fix weakens the system's ability to fix itself naturally. A spiral of decline occurs: the problem gets worse, more of the quick fix is needed, and the system deteriorates further.
- Way out: Avoid getting into the loop; avoid symptom-relieving or signal-denying policies that don't address the problem. Focus on long-term restructuring, not short-term relief.
- Example: modern medicine has shifted the responsibility away from individuals and their lifestyle onto intervening doctors and medicine.
- Trap: Rule Beating: Pervasive rule-beating behaviour gives the appearance of obeying the rules or achieving the goals while distorting the system.
- Way out: Design rules that channel creativity toward achieving the purpose of the rules, not toward beating them.
- Example: Banning development on land that contains endangered species can result in actors deliberately killing or displacing them so they can develop the land.
- Trap: Seeking the wrong goal: behaviour is sensitive to the goals of feedback loops. If rules or goals are defined badly a system might obediently work to produce a result that isn’t intended.
- Way out: Specify the indicators and goals that reflect the real welfare of the system. Don’t confuse effort with result, as you’ll end up with a system that produces effort, not result.
Chapter 6: Leverage Points
- As change makers, we should aim to identify places in a system where a small change can lead to a big shift in behaviour.
Here’s a list of places to intervene in a system, ranked from least to most effective:
- Changing numbers (constants and parameters): is often the least effective way to intervene in a system. While we care deeply about these variables and fight over them, small changes rarely produce the desired outcome. In stagnant systems, parameter changes rarely kick-start activity. In variable systems, they usually don't stabilise things. In out-of-control systems, they don't slow things down. Parameters can be effective if they trigger higher-level interventions, but most systems avoid reaching critical parameter ranges. Overall, focusing on numbers is often not worth the effort.
- Changing the size of buffers (stocks relative to flows): Rivers flood more than lakes because big stocks are more stable than small ones. Buffers stabilise systems, but if they are too big, the system becomes inflexible. While changing the size of a buffer can provide leverage, buffers are often physical entities that are not easily changed. Therefore, buffers are not high on the list of leverage points.
- Stock-and-flow structures (physical systems and nodes of intersection): The physical arrangement of a system can have a big impact on its operation. The Hungarian road system, for example, causes increased traffic and pollution by routing all traffic through central Budapest. Fixing a poorly designed system usually requires expensive and slow rebuilding. Certain stock-and-flow structures cannot be changed. Although physical structure is important, it is not usually a quick or simple way to make changes. The key lies in proper design, understanding limitations and bottlenecks, maximising efficiency, and avoiding overloading capacity.
- Delays: lengths of time relative to rates of system change: Delays in how a system gets information about itself can make it unstable. Think of it like trying to drive with really bad lag on your steering. This lag can make a system wobble and overcorrect, or miss its target entirely. Examples include delayed information causing stores to have too much or too little stock, or delays in building power plants making it hard to meet changing energy needs. Rather than trying to make delays vanish, it's better to purposefully slow down how fast a system can change so that technology and pricing have time to catch up.
- Balancing Feedback Loops: The strength of the feedbacks relative to the impacts they are trying to correct. Balancing feedback loops are essential for a system to self-correct. They work by comparing a goal state against the current situation and triggering actions to minimise the difference. Systems depend on both everyday balancing loops and less frequently used "emergency" ones to survive under stressful conditions. It's vital to preserve these mechanisms, even if they seem unused and costly in the short term. Markets are powerful balancing mechanisms, but only when prices clearly convey accurate information. Governments and corporations distort markets through subsidies and misinformation, which weakens the market's self-correcting ability. True market efficiency lies in maintaining clarity of information and a fair playing field. Democracy is another critical balancing loop, keeping powers in check. Unfortunately, vast sums of money manipulate information flow between governments and voters, eroding this essential democratic feedback. To handle increased impacts, feedback loops must be correspondingly strengthened. Examples include using natural pest control in agriculture, transparency in government through mechanisms like the Freedom of Information Act, and regulations to internalise hidden costs like pollution.
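The compare-and-correct mechanism is clearest in a thermostat sketch (all rates here are made up for illustration): the loop heats in proportion to the gap between the room and the goal, while leakage to the outside constantly works against it.

```python
# Minimal balancing-loop sketch: heating closes the gap to the goal,
# leakage to the outside is the disturbance the loop must offset.

def thermostat(goal: float = 20.0, outside: float = 5.0, hours: int = 48) -> float:
    temp = outside
    for _ in range(hours):
        heating = 0.5 * max(goal - temp, 0.0)  # corrective action
        leakage = 0.1 * (temp - outside)       # ongoing loss
        temp += heating - leakage
    return temp

final = thermostat()
# The loop settles near, but a little short of, the goal: the constant
# leak is why balancing loops hold stocks close to (not at) their targets.
```

Strengthening the loop (raising the 0.5 gain) pulls the equilibrium closer to the goal, which is what "strengthening feedback relative to the impact it corrects" means in practice.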
- Reinforcing Feedback Loops: The strength of the gain of driving loops. Balancing feedback loops correct system behaviour, while reinforcing loops drive one-way growth, potentially leading to system collapse if unchecked. Reinforcing loops—like unchecked population growth—can be detrimental, and are best managed by balancing loops, which act as a system's brakes. In society, these loops manifest in wealth accumulation, where the rich get richer, creating a need for balancing measures such as progressive taxes and public education to ensure equitable growth. Effective system management involves identifying and adjusting leverage points like birth rates and erosion rates, rather than allowing reinforcing loops to exacerbate disparities.
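The contrast between the two loop types can be sketched compactly (rate and capacity are illustrative choices, in the spirit of logistic growth): an unchecked reinforcing loop compounds exponentially, while coupling it to a balancing "carrying-capacity" brake caps it.

```python
# Hypothetical sketch: compound growth with and without a balancing brake.

def grow(steps=60, rate=0.1, capacity=None):
    stock = 10.0
    for _ in range(steps):
        growth = rate * stock                  # reinforcing: more begets more
        if capacity is not None:
            growth *= 1 - stock / capacity     # balancing brake near capacity
        stock += growth
    return stock

unchecked = grow()                # exponential blow-up
checked = grow(capacity=100.0)    # levels off just below the capacity
```

Slowing the gain of the driving loop (lowering `rate`) is often more powerful than strengthening the brake, which is why reinforcing-loop gains rank high among leverage points.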
- Information flows: Who has access to information. Electricity consumption drops if meters are located where they’re more visible. Missing information flows are a common cause of system malfunction. Adding or restoring information flows is a powerful intervention. Missing feedback should be restored to the right place in a compelling way. Humans have a tendency to avoid accountability for their decisions.
- Rules - Incentives, punishments and constraints. The rules of a system define its scope, boundaries, and freedoms. When Gorbachev came to power in the Soviet Union, he implemented transparency (glasnost) and changed the economic rules (perestroika), resulting in significant change. Rules have a profound impact on our behaviour and are sources of great power. This is why lobbyists are drawn to Congress. To understand the fundamental flaws of systems, focus on the rules and who controls them.
- Self-Organisation - The power to add, change or evolve system structure. The most remarkable thing systems can do is to change themselves by creating new structures and behaviours. A system that can evolve can survive almost any change, by changing itself. We often look at self-organisation with wonder, but a few clever rules for self-organisation can have amazing results (the rules govern how, where and what the system can add or subtract from itself, and under what conditions). Self-organisation is a matter of evolutionary agility, and a means of experimentation for selecting and testing new patterns. Encouraging variability, experimentation and diversity increases the power of self-organisation, but means relinquishing control too.
- Goals - The purpose or function of the system. Technology isn’t good or bad; it depends on who’s using it and for what goal. Goals are important leverage points. Make the goal clear, and make sure everyone agrees on it. The leader of an organisation has huge power, as they can change the goal. Articulating, repeating, standing up for, and insisting upon new system goals is a powerful lever.
- Paradigms - The mind-set out of which the system (its goals, structure, rules, delays, and parameters) arises. There are shared ideas in the minds of society, unstated assumptions that constitute a society’s beliefs. Everybody knows them. Paradigms are the sources of systems; from them comes everything else. The ancient Egyptians built pyramids because they believed in the afterlife. Paradigms are hugely influential, and although they’re hard to change, there are no real blockers. To change a paradigm, highlight the anomalies and failures of the old one, and speak and act loudly and with assurance from the new one.
- Transcending Paradigms. Keep yourself unattached in the arena of paradigms, stay flexible, and realise that no one paradigm is true. Let go into not-knowing, into a state of enlightenment. If no paradigm is right, you can choose your own.
Chapter 7: Living in a world of systems
- Systems thinking teaches us its own limitations. Self-organising, non-linear feedback systems are unpredictable and hard to control. We can only understand them at a surface level; we can’t use that understanding to predict the future.
- Systems thinking can lead to the illusion of control. In truth we can’t control them, but we can dance with them.
- Systems wisdoms:
- To understand a system, observe its behaviour over time before intervening, focusing on data, not assumptions. Misconceptions are common; analysing behaviour dynamically reveals the interplay of its elements and the system’s strengths. This helps avoid jumping to solutions based on favoured theories or desires, without truly understanding the system’s operations and underlying dynamics.
- Making mental models explicit through diagrams or equations enhances clarity and forces us to confront our assumptions. It invites scrutiny and alternative perspectives, fostering mental flexibility and the ability to adapt to new system modes. Expose, test, and refine models. Follow the scientific method.
- Information integrity is crucial for system functionality. Transparent information is a powerful agent for change, and its manipulation by those in control of information streams, like the media and politicians, can have profound effects on system behaviour and societal outcomes. Misinformation can lead to feedback loop failures.
- Language shapes our understanding and our realities, influencing the way we interact with and perceive the world. A society's vocabulary reflects its values and capacity for action. The misuse of language, with terms like ‘collateral damage’, reveals a society's moral compass and direction. Clear, truthful language is vital for meaningful communication.
- We have a bias for focusing on what we can measure, which can distort our understanding of what’s truly important. Recognise and value qualitative aspects, speaking up for them and acknowledging their impact, even if they can’t be measured.
- Don’t maximise parts of the system or sub-systems while ignoring the whole.
- Don’t be an unthinking intervenor; don’t destroy the system’s own self-maintenance capacities. Pay attention to the value of what’s already there.
- ‘Intrinsic responsibility’ in systems design ensures decision-makers feel the consequences of their choices, fostering accountability. Systems often fail to hold individuals accountable, such as when political or military leaders make decisions without facing direct consequences, diluting the sense of responsibility and its effects on action.
- Stay humble, stay a learner. Don’t trust your intuition, but trust yourself to be able to figure it out. If you don’t know: Don’t bluff. Don’t freeze. Learn. Learn through experimentation and embrace your errors.
- Follow a system wherever it leads, you’ll have to cross the boundaries of traditional disciplines and take an interdisciplinary approach.
- Systems theory suggests there's no separation between short-term and long-term effects since actions today can have enduring consequences. Balancing attention between immediate steps and long-term paths is crucial, much like cautiously navigating a complex trail requires awareness of both the ground beneath your feet and the destination ahead.
- The media has the effect of popularising and normalising bad behaviour. Make a conscious effort to uphold and act on moral standards.