Donella Meadows Quotes

We've searched our database for all the quotes related to Donella Meadows. Here they are:

Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own.
Donella H. Meadows (Thinking In Systems: A Primer)
There are no separate systems. The world is a continuum. Where to draw a boundary around a system depends on the purpose of the discussion.
Donella H. Meadows (Thinking in Systems)
You think that because you understand “one” that you must therefore understand “two” because one and one make two. But you forget that you must also understand “and.”
Donella H. Meadows (Thinking in Systems: A Primer)
a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
Donella H. Meadows (Thinking in Systems: A Primer)
We'll go down in history as the first society that wouldn't save itself because it wasn't cost-effective.
Donella H. Meadows
Purposes are deduced from behavior, not from rhetoric or stated goals.
Donella H. Meadows (Thinking in Systems: A Primer)
There is too much bad news to justify complacency. There is too much good news to justify despair.
Donella H. Meadows
We can't impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
Donella H. Meadows (Thinking In Systems: A Primer)
So, what is a system? A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time.
Donella H. Meadows (Thinking in Systems: A Primer)
Addiction is finding a quick and dirty solution to the symptom of the problem, which prevents or distracts one from the harder and longer-term task of solving the real problem.
Donella H. Meadows (Thinking In Systems: A Primer)
A vision should be judged by the clarity of its values, not the clarity of its implementation path. [in Mediated Modeling, page 43]
Donella H. Meadows
Thou shalt not distort, delay, or withhold information.
Donella H. Meadows (Thinking in Systems: A Primer)
Let's face it, the universe is messy. It is nonlinear, turbulent, and chaotic. It is dynamic. It spends its time in transient behavior on its way to somewhere else, not in mathematically neat equilibria. It self-organizes and evolves. It creates diversity, not uniformity. That's what makes the world interesting, that's what makes it beautiful, and that's what makes it work.
Donella H. Meadows (Thinking In Systems: A Primer)
You may be able to fool the voters, but not the atmosphere.
Donella H. Meadows
A system* is an interconnected set of elements that is coherently organized in a way that achieves something.
Donella H. Meadows (Thinking in Systems: A Primer)
An important function of almost every system is to ensure its own perpetuation.
Donella H. Meadows (Thinking in Systems: A Primer)
You can drive a system crazy by muddying its information streams.
Donella H. Meadows (Thinking in Systems: A Primer)
People don't need enormous cars; they need admiration and respect. They don't need a constant stream of new clothes; they need to feel that others consider them to be attractive, and they need excitement and variety and beauty. People don't need electronic entertainment; they need something interesting to occupy their minds and emotions. And so forth. Trying to fill real but nonmaterial needs-for identity, community, self-esteem, challenge, love, joy-with material things is to set up an unquenchable appetite for false solutions to never-satisfied longings. A society that allows itself to admit and articulate its nonmaterial human needs, and to find nonmaterial ways to satisfy them, would require much lower material and energy throughputs and would provide much higher levels of human fulfillment.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
If you define the goal of a society as GNP, that society will do its best to produce GNP. It will not produce welfare, equity, justice, or efficiency unless you define a goal and regularly measure and report the state of welfare, equity, justice, or efficiency.
Donella H. Meadows (Thinking In Systems: A Primer)
No one can define or measure justice, democracy, security, freedom, truth, or love. No one can define or measure any value. But if no one speaks up for them, if systems aren’t designed to produce them, if we don’t speak about them and point toward their presence or absence, they will cease to exist.
Donella H. Meadows (Thinking in Systems: A Primer)
Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure.
Donella H. Meadows (Thinking in Systems: A Primer)
Everything we think we know about the world is a model. Our models do have a strong congruence with the world. Our models fall far short of representing the real world fully.
Donella H. Meadows (Thinking in Systems: A Primer)
This ancient Sufi story was told to teach a simple lesson but one that we often ignore: The behavior of a system cannot be known just by knowing the elements of which the system is made.
Donella H. Meadows (Thinking in Systems: A Primer)
A system just can’t respond to short-term changes when it has long term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.
Donella H. Meadows (Thinking in Systems: A Primer)
Because of feedback delays within complex systems, by the time a problem becomes apparent it may be unnecessarily difficult to solve. — A stitch in time saves nine.
Donella H. Meadows (Thinking in Systems: A Primer)
Systems thinkers see the world as a collection of stocks along with the mechanisms for regulating the levels in the stocks by manipulating flows.
Donella H. Meadows (Thinking in Systems: A Primer)
You'll stop looking for who's to blame; instead you'll start asking, "What's the system?" The concept of feedback opens up the idea that a system can cause its own behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
A change in purpose changes a system profoundly, even if every element and interconnection remains the same.
Donella H. Meadows (Thinking in Systems: A Primer)
You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays.
Donella H. Meadows (Thinking in Systems: A Primer)
Once you start listing the elements of a system, there is almost no end to the process. You can divide elements into sub-elements and then sub-sub-elements. Pretty soon you lose sight of the system. As the saying goes, you can’t see the forest for the trees.
Donella H. Meadows (Thinking in Systems: A Primer)
We know a tremendous amount about how the world works, but not nearly enough. Our knowledge is amazing; our ignorance even more so. We can improve our understanding, but we can't make it perfect.
Donella H. Meadows (Thinking In Systems: A Primer)
God grant us the serenity to exercise our bounded rationality freely in the systems that are structured appropriately, the courage to restructure the systems that aren’t, and the wisdom to know the difference!
Donella H. Meadows (Thinking in Systems: A Primer)
Managers do not solve problems, they manage messes. —RUSSELL ACKOFF
Donella H. Meadows (Thinking in Systems: A Primer)
Sustainability is a new idea to many people, and many find it hard to understand. But all over the world there are people who have entered into the exercise of imagining and bringing into being a sustainable world. They see it as a world to move toward not reluctantly, but joyfully, not with a sense of sacrifice, but a sense of adventure. A sustainable world could be very much better than the one we live in today.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
The system, to a large extent, causes its own behavior! An outside event may unleash that behavior, but the same outside event applied to a different system is likely to produce a different result.
Donella H. Meadows (Thinking In Systems: A Primer)
We don't think a sustainable society need be stagnant, boring, uniform, or rigid. It need not be, and probably could not be, centrally controlled or authoritarian. It could be a world that has the time, the resources, and the will to correct its mistakes, to innovate, to preserve the fertility of its planetary ecosystems. It could focus on mindfully increasing quality of life rather than on mindlessly expanding material consumption and the physical capital stock.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
Model utility depends not on whether its driving scenarios are realistic (since no one can know that for sure), but on whether it responds with a realistic pattern of behavior.
Donella H. Meadows (Thinking in Systems)
The behavior of a system cannot be known just by knowing the elements of which the system is made.
Donella H. Meadows (Thinking in Systems: A Primer)
A diverse system with multiple pathways and redundancies is more stable and less vulnerable to external shock than a uniform system with little diversity.
Donella H. Meadows (Thinking in Systems: A Primer)
The systems-thinking lens allows us to reclaim our intuition about whole systems and
• hone our abilities to understand parts,
• see interconnections,
• ask “what-if” questions about possible future behaviors, and
• be creative and courageous about system redesign.
Donella H. Meadows (Thinking in Systems: A Primer)
If a factory is torn down but the rationality which produced it is left standing, then that rationality will simply produce another factory. If a revolution destroys a government, but the systematic patterns of thought that produced that government are left intact, then those patterns will repeat themselves. . . . There’s so much talk about the system. And so little understanding. —ROBERT PIRSIG, Zen and the Art of Motorcycle Maintenance
Donella H. Meadows (Thinking in Systems: A Primer)
The central question of economic development is how to keep the reinforcing loop of capital accumulation from growing more slowly than the reinforcing loop of population growth—so that people are getting richer instead of poorer.
Donella H. Meadows (Thinking in Systems: A Primer)
We don’t talk about what we see; we see only what we can talk about.
Donella H. Meadows (Thinking in Systems: A Primer)
One of the strangest assumptions of present-day mental models is the idea that a world of moderation must be a world of strict, centralized government control. For a sustainable economy, that kind of control is not possible, desirable, or necessary.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
According to the competitive exclusion principle, if a reinforcing feedback loop rewards the winner of a competition with the means to win further competitions, the result will be the elimination of all but a few competitors. The rich get richer and the poor get poorer.
Donella H. Meadows (Thinking In Systems: A Primer)
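The reinforcing loop Meadows describes can be sketched in a few lines of code. This toy simulation is my own construction, not from the book; the `advantage` parameter, round count, and initial shares are all illustrative assumptions. Each round the current leader skims a fixed fraction of every rival's share, so even a tiny initial edge compounds into near-total exclusion.

```python
# Toy "success to the successful" loop (illustrative, not Meadows's model):
# the winner of each round gains the means to win the next one.

def compete(shares, rounds=30, advantage=0.1):
    """shares: initial resource shares; advantage and rounds are illustrative."""
    shares = list(shares)
    for _ in range(rounds):
        leader = shares.index(max(shares))
        # The leader takes a proportional slice from every competitor.
        for i in range(len(shares)):
            if i != leader:
                taken = advantage * shares[i]
                shares[i] -= taken
                shares[leader] += taken
    return shares

# Three competitors, nearly equal at the start; one has a 1-point edge.
final = compete([34, 33, 33])
# The leader ends up with the overwhelming majority of a fixed total.
assert max(final) > 90
```

The total is conserved throughout; only its distribution changes, which is exactly the "rich get richer, poor get poorer" pattern the quote names.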
Don't be stopped by the "if you can't define it and measure it, I don't have to pay attention to it" ploy. No one can define or measure justice, democracy, security, freedom, truth, or love. But if no one speaks up for them, if systems aren't designed to produce them, if we don't speak about them and point toward their presence or absence, they will cease to exist.
Donella H. Meadows (Thinking In Systems: A Primer)
Remember, always, that everything you know, and everything everyone knows, is only a model. Get your model out there where it can be viewed. Invite others to challenge your assumptions and add their own. Instead of becoming a champion for one possible explanation or hypothesis or model, collect as many as possible. Consider all of them to be plausible until you find some evidence that causes you to rule one out. That way you will be emotionally able to see the evidence that rules out an assumption that may become entangled with your own identity.
Donella H. Meadows (Thinking in Systems: A Primer)
The most damaging example of the systems archetype called “drift to low performance” is the process by which modern industrial culture has eroded the goal of morality. The workings of the trap have been classic, and awful to behold. Examples of bad human behavior are held up, magnified by the media, affirmed by the culture, as typical. This is just what you would expect. After all, we’re only human. The far more numerous examples of human goodness are barely noticed. They are “not news.” They are exceptions. Must have been a saint. Can’t expect everyone to behave like that. And so expectations are lowered. The gap between desired behavior and actual behavior narrows. Fewer actions are taken to affirm and instill ideals. The public discourse is full of cynicism. Public leaders are visibly, unrepentantly amoral or immoral and are not held to account. Idealism is ridiculed. Statements of moral belief are suspect. It is much easier to talk about hate in public than to talk about love.
Donella H. Meadows (Thinking in Systems: A Primer)
The tragedy of the commons arises from missing (or too long delayed) feedback from the resource to the growth of the users of that resource.
Donella H. Meadows (Thinking in Systems: A Primer)
It is to “get” at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny.
Donella H. Meadows (Thinking in Systems: A Primer)
A system is more than the sum of its parts. It may exhibit adaptive, dynamic, goal-seeking, self-preserving, and sometimes evolutionary behavior.
Donella H. Meadows (Thinking in Systems)
One person who was willing to risk political suicide was the visionary systems thinker Donella Meadows—one of the lead authors of the 1972 Limits to Growth report
Kate Raworth (Doughnut Economics: Seven Ways to Think Like a 21st-Century Economist)
The balancing feedback loop that should keep the system state at an acceptable level is overwhelmed by a reinforcing feedback loop heading downhill. The lower the perceived system state, the lower the desired state. The lower the desired state, the less discrepancy, and the less corrective action is taken. The less corrective action, the lower the system state. If this loop is allowed to run unchecked, it can lead to a continuous degradation in the system’s performance.
Donella H. Meadows (Thinking in Systems: A Primer)
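The loop in this quote can be made concrete with a small simulation. This sketch is my own illustration, not Meadows's model; the parameter names and values (`erosion`, `response`, `decay`) are assumptions chosen only to exhibit the dynamic: when the goal erodes toward perceived performance, the balancing loop weakens and the state ratchets downward.

```python
# Toy "drift to low performance" loop (illustrative parameters throughout):
# the desired state erodes toward the perceived state, so corrective
# action shrinks and performance drifts down instead of recovering.

def simulate_drift(steps=50, state=100.0, desired=100.0,
                   erosion=0.2, response=0.3, decay=5.0):
    history = []
    for _ in range(steps):
        perceived = state                              # no measurement delay
        desired += erosion * (perceived - desired)     # goal erodes toward actuals
        correction = response * (desired - state)      # balancing loop
        state += correction - decay                    # constant downward pressure
        history.append(state)
    return history

trace = simulate_drift()
assert trace[-1] < trace[0]   # the system degrades rather than stabilizing
```

With `erosion=0` the same loop would settle at a fixed shortfall below the goal; letting the goal sag turns that one-time shortfall into continuous decline.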
When we, system dynamicists, see a pattern persist in many parts of a system over long periods, we assume that it has causes embedded in the feedback loop structure of the system. Running the same system harder or faster will not change the pattern as long as the structure is not revised. Growth as usual has widened the gap between the rich and the poor. Continuing growth as usual will never close that gap. Only changing the structure of the system—the chains of causes and effects—will do that.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
Managers are not confronted with problems that are independent of each other, but with dynamic situations that consist of complex systems of changing problems that interact with each other. I call such situations messes.... Managers do not solve problems, they manage messes. —RUSSELL ACKOFF, operations theorist
Donella H. Meadows (Thinking in Systems)
The difference between a sustainable society and a present-day economic recession is like the difference between stopping an automobile purposefully with the brakes versus stopping it by crashing into a brick wall. When the present economy overshoots, it turns around too quickly and unexpectedly for people and enterprises to retrain, relocate, and readjust. A deliberate transition to sustainability would take place slowly enough, and with enough forewarning, so that people and businesses could find their places in the new economy.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
But rules for sustainability, like every workable social rule, would be put into place not to destroy freedoms, but to create freedoms or to protect them. A ban on bank robbing inhibits the freedom of the thief in order to assure that everyone else has the freedom to deposit and withdraw money safely. A ban on overuse of a renewable resource or on the generation of a dangerous pollutant protects vital freedoms in a similar way.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
If information-based relationships are hard to see, functions or purposes are even harder. A system’s function or purpose is not necessarily spoken, written, or expressed explicitly, except through the operation of the system. The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
Donella H. Meadows (Thinking in Systems: A Primer)
The world would be a different place if instead of competing to have the highest per capita GNP, nations competed to have the highest per capita stocks of wealth with the lowest throughput, or the lowest infant mortality, or the greatest political freedom, or the cleanest environment, or the smallest gap between the rich and the poor.
Donella H. Meadows (Thinking in Systems: A Primer)
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models. You’ve already seen the system trap that comes from setting goals around what is easily measured, rather than around what is important.
Donella H. Meadows (Thinking in Systems: A Primer)
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality.
Donella H. Meadows (Thinking in Systems: A Primer)
In physical, exponentially growing systems, there must be at least one reinforcing loop driving the growth and at least one balancing loop constraining the growth, because no physical system can grow forever in a finite environment.
Donella H. Meadows (Thinking in Systems: A Primer)
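The paired loops described here produce the familiar S-shaped growth curve. Below is a minimal sketch (my own, assuming an illustrative growth `rate` and `capacity`, neither of which comes from the text): growth proportional to the stock is the reinforcing loop, and throttling that growth as the stock nears its limit is the balancing loop.

```python
# Toy logistic growth: one reinforcing loop driving growth, one balancing
# loop constraining it. All parameter values are illustrative.

def logistic_growth(stock=1.0, rate=0.5, capacity=100.0, steps=40):
    levels = []
    for _ in range(steps):
        reinforcing = rate * stock            # growth feeds on the stock itself
        balancing = 1 - stock / capacity      # tightens as the limit approaches
        stock += reinforcing * balancing
        levels.append(stock)
    return levels

levels = logistic_growth()
assert levels[-1] > levels[0]   # the stock grows...
assert levels[-1] < 100.0       # ...but levels off below the finite limit
```

Early on the reinforcing loop dominates and growth looks exponential; as the stock approaches `capacity`, the balancing loop takes over and growth flattens.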
The idea that there might be limits to growth is for many people impossible to imagine. Limits are politically unmentionable and economically unthinkable. The culture tends to deny the possibility of limits by placing a profound faith in the powers of technology, the workings of a free market, and the growth of the economy as the solution to all problems, even the problems created by growth.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
Nonlinearities are important not only because they confound our expectations about the relationship between action and response. They are even more important because they change the relative strengths of feedback loops. They can flip a system from one mode of behavior to another.
Donella H. Meadows
You could say paradigms are harder to change than anything else about a system, and therefore this item should be lowest on the list, not second-to-highest. But there’s nothing physical or expensive or even slow in the process of paradigm change. In a single individual it can happen in a millisecond. All it takes is a click in the mind, a falling of scales from the eyes, a new way of seeing. Whole societies are another matter—they resist challenges to their paradigms harder than they resist anything else.
Donella H. Meadows (Thinking in Systems: A Primer)
The trick, as with all the behavioral possibilities of complex systems, is to recognize what structures contain which latent behaviors, and what conditions release those behaviors—and, where possible, to arrange the structures and conditions to reduce the probability of destructive behaviors and to encourage the possibility of beneficial ones.
Donella H. Meadows (Thinking in Systems: A Primer)
If we’re to understand anything, we have to simplify, which means we have to make boundaries.
Donella H. Meadows (Thinking in Systems: A Primer)
I have yet to see any problem, however complicated, which, when looked at in the right way, did not become still more complicated.
Donella H. Meadows (Thinking in Systems: A Primer)
Loss of resilience can come as a surprise, because the system usually is paying much more attention to its play than to its playing space.
Donella H. Meadows (Thinking in Systems: A Primer)
most of what goes wrong in systems goes wrong because of biased, late, or missing information.
Donella H. Meadows (Thinking in Systems: A Primer)
In other words, if you see a behavior that persists over time, there is likely a mechanism creating that consistent behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
Pay Attention to What Is Important, Not Just What Is Quantifiable
Donella H. Meadows (Thinking in Systems: A Primer)
Every balancing feedback loop has its breakdown point, where other loops pull the stock away from its goal more strongly than it can pull back.
Donella H. Meadows (Thinking in Systems: A Primer)
A system is a set of things-people, cells, molecules, or whatever-interconnected in such a way that they produce their own pattern of behavior over time.
Donella H. Meadows (Thinking in Systems)
Storing information means increasing the complexity of the mechanism.
Donella H. Meadows (Thinking in Systems)
A stock, then, is the present memory of the history of changing flows within the system.
Donella H. Meadows (Thinking in Systems)
Stocks usually change slowly. They can act as delays, lags, buffers, ballast, and sources of momentum in a system.
Donella H. Meadows (Thinking in Systems)
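The bathtub arithmetic behind these stock-and-flow quotes is easy to demonstrate. A toy example (my own; the inflow and outflow numbers are arbitrary) showing that a stock is simply the running accumulation of its inflows minus its outflows:

```python
# A stock as the "present memory" of its flow history: it integrates
# net flow over time. The numbers here are arbitrary illustrations.

def run_stock(initial, inflows, outflows):
    stock = initial
    levels = []
    for inflow, outflow in zip(inflows, outflows):
        stock += inflow - outflow   # the stock accumulates the net flow
        levels.append(stock)
    return levels

# A bathtub: steady inflow of 5/step, drain (8/step) opened halfway through.
levels = run_stock(initial=10,
                   inflows=[5] * 10,
                   outflows=[0] * 5 + [8] * 5)
print(levels)  # rises by 5 per step, then falls by 3 per step
```

Note how the stock responds only gradually when the drain opens: it keeps its accumulated history and changes level step by step, which is the buffering and lag behavior the quote describes.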
Resilience, self-organization, and hierarchy are three of the reasons dynamic systems can work so well.
Donella H. Meadows (Thinking in Systems: A Primer)
The world is nonlinear. Trying to make it linear for our mathematical or administrative convenience is not usually a good idea even when feasible, and it is rarely feasible.
Donella H. Meadows (Thinking in Systems: A Primer)
System structure is the source of system behavior. System behavior reveals itself as a series of events over time.
Donella H. Meadows (Thinking in Systems: A Primer)
Keeping sub-purposes and overall system purposes in harmony is an essential function of successful systems.
Donella H. Meadows (Thinking in Systems: A Primer)
A system generally goes on being itself, changing only slowly if at all, even with complete substitutions of its elements-as long as its interconnections and purposes remain intact.
Donella H. Meadows (Thinking in Systems)
We see no reason why a sustainable world needs to leave anyone living in poverty. Quite the contrary, we think such a world would have to provide material security to all its people.
Donella H. Meadows
You think that because you understand "one" that you must therefore understand "two" because one and one make two. But you forget that you must also understand "and." -Sufi teaching story
Donella H. Meadows (Thinking in Systems)
self-organization is often sacrificed for purposes of short-term productivity and stability. Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes. Or for narrowing the genetic variability of crop plants. Or for establishing bureaucracies and theories of knowledge that treat people as if they were only numbers.
Donella H. Meadows (Thinking in Systems: A Primer)
Dynamic systems studies usually are not designed to predict what will happen. Rather, they’re designed to explore what would happen, if a number of driving factors unfold in a range of different ways.
Donella H. Meadows (Thinking in Systems: A Primer)
Words and sentences must, by necessity, come only one at a time in linear, logical order. Systems happen all at once. They are connected not just in one direction, but in many directions simultaneously.
Donella H. Meadows (Thinking in Systems: A Primer)
Designing a system for intrinsic responsibility could mean, for example, requiring all towns or companies that emit wastewater into a stream to place their intake pipes downstream from their outflow pipe.
Donella H. Meadows (Thinking in Systems)
Hunger, poverty, environmental degradation, economic instability, unemployment, chronic disease, drug addiction, and war, for example, persist in spite of the analytical ability and technical brilliance that have been directed toward eradicating them. No one deliberately creates those problems, no one wants them to persist, but they persist nonetheless. That is because they are intrinsically systems problems—undesirable behaviors characteristic of the system structures that produce them. They will yield only as we reclaim our intuition, stop casting blame, see the system as the source of its own problems, and find the courage and wisdom to restructure it.
Donella H. Meadows (Thinking in Systems: A Primer)
In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
Donella H. Meadows (Thinking in Systems: A Primer)
That’s why behavior-based econometric models are pretty good at predicting the near-term performance of the economy, quite bad at predicting the longer-term performance, and terrible at telling one how to improve the performance of the economy.
Donella H. Meadows (Thinking in Systems: A Primer)
The strength of a balancing feedback loop is important relative to the impact it is designed to correct. If the impact increases in strength, the feedbacks have to be strengthened too. A thermostat system may work fine on a cold winter day—but open all the windows and its corrective power is no match for the temperature change imposed on the system. Democracy works better without the brainwashing power of centralized mass communications. Traditional controls on fishing were sufficient until sonar spotting and drift nets and other technologies made it possible for a few actors to catch the last fish. The power of big industry calls for the power of big government to hold it in check; a global economy makes global regulations necessary.
Donella H. Meadows (Thinking in Systems: A Primer)
These equalizing mechanisms may derive from simple morality, or they may come from the practical understanding that losers, if they are unable to get out of the game of success to the successful, and if they have no hope of winning, could get frustrated enough to destroy the playing field.
Donella H. Meadows (Thinking in Systems: A Primer)
embodied in the notion that there is no certainty in any worldview. But, in fact, everyone who has managed to entertain that idea, for a moment or for a lifetime, has found it to be the basis for radical empowerment. If no paradigm is right, you can choose whatever one will help to achieve your purpose.
Donella H. Meadows (Thinking in Systems: A Primer)
The thing to do, when you don’t know, is not to bluff and not to freeze, but to learn. The way you learn is by experiment—or, as Buckminster Fuller put it, by trial and error, error, error. In a world of complex systems, it is not appropriate to charge forward with rigid, undeviating directives. “Stay the course” is only a good idea if you’re sure you’re on course. Pretending you’re in control even when you aren’t is a recipe not only for mistakes, but for not learning from mistakes. What’s appropriate when you’re learning is small steps, constant monitoring, and a willingness to change course as you find out more about where it’s leading.
Donella H. Meadows (Thinking in Systems: A Primer)
When you’re walking along a tricky, curving, unknown, surprising, obstacle-strewn path, you’d be a fool to keep your head down and look just at the next step in front of you. You’d be equally a fool just to peer far ahead and never notice what’s immediately under your feet. You need to be watching both the short and the long term—the whole system.
Donella H. Meadows (Thinking in Systems: A Primer)
Everyone understands that you can prolong the life of an oil-based economy by discovering new oil deposits. It seems to be harder to understand that the same result can be achieved by burning less oil. A breakthrough in energy efficiency is equivalent, in its effect on the stock of available oil, to the discovery of a new oil field—although different people profit from it.
Donella H. Meadows (Thinking in Systems: A Primer)
Like resilience, self-organization is often sacrificed for purposes of short-term productivity and stability. Productivity and stability are the usual excuses for turning creative human beings into mechanical adjuncts to production processes. Or for narrowing the genetic variability of crop plants. Or for establishing bureaucracies and theories of knowledge that treat people as if they were only numbers.
Donella H. Meadows (Thinking in Systems: A Primer)
The model is constructed in such a way that the global population will eventually level off and start declining, if industrial output per capita rises high enough. But we see little “real world” evidence that the richest people or nations ever lose interest in getting richer. Therefore, policies built into World3 represent the assumption that capital owners will continue to seek gains in their wealth indefinitely and that consumers will always want to increase their consumption.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
So how do you change paradigms? Thomas Kuhn, who wrote the seminal book about the great paradigm shifts of science, has a lot to say about that. You keep pointing at the anomalies and failures in the old paradigm. You keep speaking and acting, loudly and with assurance, from the new one. You insert people with the new paradigm in places of public visibility and power. You don’t waste time with reactionaries; rather, you work with active change agents and with the vast middle ground of people who are open-minded.
Donella H. Meadows (Thinking in Systems: A Primer)
The future can’t be predicted, but it can be envisioned and brought lovingly into being. Systems can’t be controlled, but they can be designed and redesigned. We can’t surge forward with certainty into a world of no surprises, but we can expect surprises and learn from them and even profit from them. We can’t impose our will on a system. We can listen to what the system tells us, and discover how its properties and our values can work together to bring forth something much better than could ever be produced by our will alone.
Donella H. Meadows (Thinking in Systems: A Primer)
Error-embracing is the condition for learning. It means seeking and using—and sharing—information about what went wrong with what you expected or hoped would go right. Both error embracing and living with high levels of uncertainty emphasize our personal as well as societal vulnerability. Typically we hide our vulnerabilities from ourselves as well as from others. But … to be the kind of person who truly accepts his responsibility … requires knowledge of and access to self far beyond that possessed by most people in this society.
Donella H. Meadows (Thinking in Systems: A Primer)
There is yet one leverage point that is even higher than changing a paradigm. That is to keep oneself unattached in the arena of paradigms, to stay flexible, to realize that no paradigm is “true,” that every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension. It is to "get" at a gut level the paradigm that there are paradigms, and to see that that itself is a paradigm, and to regard that whole realization as devastatingly funny.
Donella H. Meadows (Thinking In Systems: A Primer)
I realize with fright that my impatience for the re-establishment of democracy had something almost communist in it; or, more generally, something rationalist. I had wanted to make history move ahead in the same way that a child pulls on a plant to make it grow more quickly. I believe we must learn to wait as we learn to create. We have to patiently sow the seeds, assiduously water the earth where they are sown and give the plants the time that is their own. One cannot fool a plant any more than one can fool history. —Václav Havel,7 playwright, last President of Czechoslovakia and first president of the Czech Republic
Donella H. Meadows (Thinking in Systems: A Primer)
When a subsystem’s goals dominate at the expense of the total system’s goals, the resulting behavior is called suboptimization. Just as damaging as suboptimization, of course, is the problem of too much central control. If the brain controlled each cell so tightly that the cell could not perform its self-maintenance functions, the whole organism could die. If central rules and regulations prevent students or faculty from exploring fields of knowledge freely, the purpose of the university is not served. The coach of a team might interfere with the on-the-spot perceptions of a good player, to the detriment of the team. Economic examples of overcontrol from the top, from companies to nations, are the causes of some of the great catastrophes of history, all of which are by no means behind us.
Donella H. Meadows (Thinking in Systems: A Primer)
After basic needs are met, higher incomes produce gains in happiness only up to a point, beyond which further increases in consumption do not enhance a sense of well-being. The cumulative impact of surging per capita consumption, rapid population growth, human dominance of every ecological system, and the forcing of pervasive biological changes worldwide has created the very real possibility, according to twenty-two prominent biologists and ecologists in a 2012 study in Nature, that we may soon reach a dangerous “planetary scale ‘tipping point.’ ” According to one of the coauthors, James H. Brown, “We’ve created this enormous bubble of population and economy. If you try to get the good data and do the arithmetic, it’s just unsustainable. It’s either got to be deflated gently, or it’s going to burst.” In the parable of the boy who cried wolf, warnings of danger that turned out to be false bred complacency to the point where a subsequent warning of a danger that was all too real was ignored. Past warnings that humanity was about to encounter harsh limits to its ability to grow much further were often perceived as false: from Thomas Malthus’s warnings about population growth at the end of the eighteenth century to The Limits to Growth, published in 1972 by Donella Meadows, among others. We resist the notion that there might be limits to the rate of growth we are used to—in part because new technologies have so frequently enabled us to become far more efficient in producing more with less and to substitute a new resource for one in short supply. Some of the resources we depend upon the most, including topsoil (and some key elements, like phosphorus for fertilizers), however, have no substitutes and are being depleted.
Al Gore (The Future: Six Drivers of Global Change)
Carter also was trying to deal with a flood of illegal immigrants from Mexico. He suggested that nothing could be done about that immigration as long as there was a great gap in opportunity and living standards between the United States and Mexico. Rather than spending money on border guards and barriers, he said, we should spend money helping to build the Mexican economy, and we should continue to do so until the immigration stopped. That never happened either.
Donella H. Meadows (Thinking in Systems: A Primer)
Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models. So are the ways I picture the world in my head—my mental models. None of these is or ever will be the real world. Our models usually have a strong congruence with the world. That is why we are such a successful species in the biosphere. Especially complex and sophisticated are the mental models we develop from direct, intimate experience of nature, people, and organizations immediately around us. However, and conversely, our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions. Most of us, for instance, are surprised by the amount of growth an exponential process can generate. Few of us can intuit how to damp oscillations in a complex system.
Donella H. Meadows (Thinking in Systems: A Primer)
The more users there are, the more resource is used. The more resource is used, the less there is per user.
Donella H. Meadows (Thinking in Systems: A Primer)
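The reinforcing loop in this quote can be sketched numerically. This is a minimal illustration, not from Meadows's text; the function name `commons` and every parameter value in it are invented for the sketch:

```python
# Reinforcing commons loop: more users -> more resource used -> less per user.
# Every number here is an invented illustration, not from Meadows's text.

def commons(resource=1000.0, users=10, growth=1.2, use_per_user=5.0, steps=10):
    """Track the per-user share of a fixed pool as the user base grows."""
    per_user = []
    for _ in range(steps):
        resource = max(resource - users * use_per_user, 0.0)  # pool drawn down
        per_user.append(resource / users)
        users = int(users * growth)  # success attracts more users (reinforcing)
    return per_user

shares = commons()
print(shares[0], shares[-1])  # 95.0 0.0 -- the share per user shrinks every step
```

Each step, the growing user base draws the pool down faster, so the per-user share falls at an accelerating rate until the resource is gone.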
The only way to fix a system that is laid out poorly is to rebuild it, if you can.
Donella H. Meadows (Thinking in Systems: A Primer)
Physical structure is crucial in a system, but is rarely a leverage point, because changing it is rarely quick or simple. The leverage point is in proper design in the first place.
Donella H. Meadows (Thinking in Systems: A Primer)
As we try to imagine restructured rules and what our behavior would be under them, we come to understand the power of rules. They are high leverage points. Power over the rules is real power.
Donella H. Meadows (Thinking in Systems: A Primer)
The best way to deduce the system’s purpose is to watch for a while to see how the system behaves.
Donella H. Meadows (Thinking in Systems: A Primer)
To ask whether elements, interconnections, or purposes are most important in a system is to ask an unsystemic question. All are essential. All interact. All have their roles. But the least obvious part of the system, its function or purpose, is often the most crucial determinant of the system’s behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
One promising way of redefining the meaning of ‘economist’ is to look to those who have gone beyond new economic thinking to new economic doing: the innovators who are evolving the economy one experiment at a time. Their impact is already reflected in the take-off of new business models, in the proven dynamism of the collaborative commons, in the vast potential of digital currencies and in the inspiring possibilities of regenerative design. As Donella Meadows made clear, the power of self-organisation—the ability of a system to add, change and evolve its own structure—is a high leverage point for whole system change. And that unleashes a revolutionary thought: it makes economists of us all. If economies change by evolving, then every experiment—be it a new enterprise model, complementary currency or open-source collaboration —helps to diversify, select and amplify a new economic future.
Kate Raworth (Doughnut Economics: Seven Ways to Think Like a 21st-Century Economist)
To discuss them properly, it is necessary somehow to use a language that shares some of the same properties as the phenomena under discussion. Pictures work for this language better than words, because you can see all the parts of a picture at once.
Donella H. Meadows (Thinking in Systems: A Primer)
How to know whether you are looking at a system or just a bunch of stuff: A) Can you identify parts? … and B) Do the parts affect each other? … and C) Do the parts together produce an effect that is different from the effect of each part on its own? … and perhaps D) Does the effect, the behavior over time, persist in a variety of circumstances?
Donella H. Meadows (Thinking in Systems: A Primer)
Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires. It’s amazing how quickly and easily behavior changes can come, with even slight enlargement of bounded rationality, by providing better, more complete, timelier information.
Donella H. Meadows (Thinking in Systems: A Primer)
events are the most visible aspect of a larger complex—but not always the most important.
Donella H. Meadows (Thinking in Systems: A Primer)
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing. The most familiar examples of this harmonization of goals are mobilizations of economies during wartime, or recovery after war or natural disaster. Another example was Sweden’s population policy. During the 1930s, Sweden’s birth rate dropped precipitously, and, like the governments of Romania and Hungary, the Swedish government worried about that. Unlike Romania and Hungary, the Swedish government assessed its goals and those of the population and decided that there was a basis of agreement, not on the size of the family, but on the quality of child care. Every child should be wanted and nurtured. No child should be in material need. Every child should have access to excellent education and health care. These were goals around which the government and the people could align themselves. The resulting policy looked strange during a time of low birth rate, because it included free contraceptives and abortion—because of the principle that every child should be wanted. The policy also included widespread sex education, easier divorce laws, free obstetrical care, support for families in need, and greatly increased investment in education and health care.4 Since then, the Swedish birth rate has gone up and down several times without causing panic in either direction, because the nation is focused on a far more important goal than the number of Swedes.
Donella H. Meadows (Thinking in Systems: A Primer)
Another name for this system trap is “eroding goals.” It is also called the “boiled frog syndrome,” from the old story (I don’t know whether it is true) that a frog put suddenly in hot water will jump right out, but if it is put into cold water that is gradually heated up, the frog will stay there happily until it boils. “Seems to be getting a little warm in here. Well, but then it’s not so much warmer than it was a while ago.” Drift to low performance is a gradual process. If the system state plunged quickly, there would be an agitated corrective process. But if it drifts down slowly enough to erase the memory of (or belief in) how much better things used to be, everyone is lulled into lower and lower expectations, lower effort, lower performance.
Donella H. Meadows (Thinking in Systems: A Primer)
THE TRAP: DRIFT TO LOW PERFORMANCE
Allowing performance standards to be influenced by past performance, especially if there is a negative bias in perceiving past performance, sets up a reinforcing feedback loop of eroding goals that sets a system drifting toward low performance.
THE WAY OUT
Keep performance standards absolute. Even better, let standards be enhanced by the best actual performances instead of being discouraged by the worst. Use the same structure to set up a drift toward high performance!
Donella H. Meadows (Thinking in Systems: A Primer)
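The eroding-goals loop and its way out can be sketched as a toy simulation. The `bias`, `adjust`, and step-count values are invented assumptions, not Meadows's:

```python
# Eroding goals: the standard drifts toward (pessimistically perceived) past
# performance, and performance then chases the lowered standard.
# `bias`, `adjust`, and the step count are invented illustration values.

def drift(goal=100.0, perf=100.0, bias=0.9, adjust=0.5, steps=20, absolute=False):
    for _ in range(steps):
        perceived = bias * perf                  # negative bias in perception
        if not absolute:
            goal += adjust * (perceived - goal)  # standard influenced by the past
        perf += adjust * (goal - perf)           # effort closes the gap to the standard
    return perf

print(round(drift(), 1))               # drifts well below the starting 100
print(round(drift(absolute=True), 1))  # absolute standard: holds at 100.0
```

With the standard allowed to erode, goal and performance ratchet each other downward toward zero; holding the standard absolute breaks the reinforcing loop.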
“God and morality are outmoded ideas; people should be objective and scientific, should own and multiply the means of production, and should treat people and nature as instrumental inputs to production”—the organizing principles of the Industrial Revolution.
Donella H. Meadows (Thinking in Systems: A Primer)
Bounded rationality means that people make quite reasonable decisions based on the information they have. But they don’t have perfect information, especially about more distant parts of the system. Fishermen don’t know how many fish there are, much less how many fish will be caught by other fishermen that same day. Businessmen don’t know for sure what other businessmen are planning to invest, or what consumers will be willing to buy, or how their products will compete. They don’t know their current market share, and they don’t know the size of the market. Their information about these things is incomplete and delayed, and their own responses are delayed. So they systematically under- and overinvest.
Donella H. Meadows (Thinking in Systems: A Primer)
Pretending that something doesn’t exist if it’s hard to quantify leads to faulty models. You’ve already seen the system trap that comes from setting goals around what is easily measured, rather than around what is important. So don’t fall into that trap. Human beings have been endowed not only with the ability to count, but also with the ability to assess quality. Be a quality detector. Be a walking, noisy Geiger counter that registers the presence or absence of quality.
Donella H. Meadows (Thinking in Systems: A Primer)
Don’t be stopped by the “if you can’t define it and measure it, I don’t have to pay attention to it” ploy. No one can define or measure justice, democracy, security, freedom, truth, or love. No one can define or measure any value. But if no one speaks up for them, if systems aren’t designed to produce them, if we don’t speak about them and point toward their presence or absence, they will cease to exist.
Donella H. Meadows (Thinking in Systems: A Primer)
A great deal of responsibility was lost when rulers who declared war were no longer expected to lead the troops into battle. Warfare became even more irresponsible when it became possible to push a button and cause tremendous damage at such a distance that the person pushing the button never even sees the damage.
Donella H. Meadows (Thinking in Systems: A Primer)
Error-embracing is the condition for learning. It means seeking and using—and sharing—information about what went wrong with what you expected or hoped would go right.
Donella H. Meadows (Thinking in Systems: A Primer)
Expand the Boundary of Caring Living successfully in a world of complex systems means expanding not only time horizons and thought horizons; above all, it means expanding the horizons of caring. There are moral reasons for doing that, of course. And if moral arguments are not sufficient, then systems thinking provides the practical reasons to back up the moral ones. The real system is interconnected. No part of the human race is separate either from other human beings or from the global ecosystem. It will not be possible in this integrated world for your heart to succeed if your lungs fail, or for your company to succeed if your workers fail, or for the rich in Los Angeles to succeed if the poor in Los Angeles fail, or for Europe to succeed if Africa fails, or for the global economy to succeed if the global environment fails. As with everything else about systems, most people already know about the interconnections that make moral and practical rules turn out to be the same rules. They just have to bring themselves to believe that which they know.
Donella H. Meadows (Thinking in Systems: A Primer)
Our culture, obsessed with numbers, has given us the idea that what we can measure is more important than what we can’t measure. Think about that for a minute. It means that we make quantity more important than quality. If quantity forms the goals of our feedback loops, if quantity is the center of our attention and language and institutions, if we motivate ourselves, rate ourselves, and reward ourselves on our ability to produce quantity, then quantity will be the result. You can look around and make up your own mind about whether quantity or quality is the outstanding characteristic of the world in which you live.
Donella H. Meadows (Thinking in Systems: A Primer)
If the land mechanism as a whole is good, then every part is good, whether we understand it or not. If the biota, in the course of aeons, has built something we like but do not understand, then who but a fool would discard seemingly useless parts? To keep every cog and wheel is the first precaution of intelligent tinkering. —Aldo Leopold,1 forester
Donella H. Meadows (Thinking in Systems: A Primer)
There once were two watchmakers, named Hora and Tempus. Both of them made fine watches, and they both had many customers. People dropped into their stores, and their phones rang constantly with new orders. Over the years, however, Hora prospered, while Tempus became poorer and poorer. That’s because Hora discovered the principle of hierarchy.… The watches made by both Hora and Tempus consisted of about one thousand parts each. Tempus put his together in such a way that if he had one partly assembled and had to put it down—to answer the phone, say—it fell to pieces. When he came back to it, Tempus would have to start all over again. The more his customers phoned him, the harder it became for him to find enough uninterrupted time to finish a watch. Hora’s watches were no less complex than those of Tempus, but he put together stable subassemblies of about ten elements each. Then he put ten of these subassemblies together into a larger assembly; and ten of those assemblies constituted the whole watch. Whenever Hora had to put down a partly completed watch to answer the phone, he lost only a small part of his work. So he made his watches much faster and more efficiently than did Tempus.
Donella H. Meadows (Thinking in Systems: A Primer)
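Herbert Simon's watchmaker parable, which Meadows retells here, can be made quantitative with the standard formula for the expected number of trials until a run of consecutive successes. The 1% interruption rate is an invented assumption:

```python
# Hora vs. Tempus: if an interruption after any placement destroys the current
# (sub)assembly, expected work follows the classic formula for trials until
# `module` consecutive successes. The 1% interruption rate is an invented value.

def expected_handlings(total_parts=1000, module=10, p_interrupt=0.01):
    """Expected part-placements to finish one watch built from stable
    subassemblies of size `module` (module=total_parts means no subassemblies)."""
    q = 1.0 - p_interrupt                          # chance a placement survives
    per_module = (q ** -module - 1.0) / p_interrupt
    return (total_parts // module) * per_module

tempus = expected_handlings(module=1000)  # one monolithic 1,000-part assembly
hora = expected_handlings(module=10)      # 100 stable ten-part subassemblies
print(round(tempus), round(hora))         # Tempus: millions; Hora: about a thousand
```

Under these assumed numbers, Tempus's monolithic assembly costs roughly two thousand times the handling of Hora's ten-part modules, because each interruption costs Hora at most nine placements but can cost Tempus hundreds.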
The trouble … is that we are terrifyingly ignorant. The most learned of us are ignorant.… The acquisition of knowledge always involves the revelation of ignorance—almost is the revelation of ignorance. Our knowledge of the world instructs us first of all that the world is greater than our knowledge of it. —Wendell Berry,1 writer and Kentucky farmer
Donella H. Meadows (Thinking in Systems: A Primer)
Because of feedback delays within complex systems, by the time a problem becomes apparent it may be unnecessarily difficult to solve.
Donella H. Meadows (Thinking In Systems: A Primer)
At a time when the world is more messy, more crowded, more interconnected, more interdependent, and more rapidly changing than ever before, the more ways of seeing, the better.
Donella H. Meadows (Thinking In Systems: A Primer)
[...] you don't expect things to happen faster than they can happen. You don't give up too soon.
Donella H. Meadows (Thinking In Systems: A Primer)
It's a guess about the future, and the future is inherently uncertain. Although you may have a strong opinion about it, there's no way to prove you're right until the future actually happens.
Donella H. Meadows (Thinking In Systems: A Primer)
A little tasteful advertising can awaken interest in a product. A lot of blatant advertising can cause disgust for the product.
Donella H. Meadows (Thinking In Systems: A Primer)
Work in such a way as to restore or enhance the system's own ability to solve its problems, then remove yourself.
Donella H. Meadows (Thinking In Systems: A Primer)
No paradigm is "true"; every one, including the one that sweetly shapes your own worldview, is a tremendously limited understanding of an immense and amazing universe that is far beyond human comprehension.
Donella H. Meadows (Thinking In Systems: A Primer)
If no paradigm is right, you can choose whatever one will help to achieve your purpose.
Donella H. Meadows (Thinking In Systems: A Primer)
Why do systems work so well? Consider the properties of highly functional systems—machines or human communities or ecosystems—which are familiar to you. Chances are good that you may have observed one of three characteristics: resilience, self-organization, or hierarchy.
Donella H. Meadows (Thinking in Systems: A Primer)
[Evolution] appears to be not a series of accidents the course of which is determined only by the change of environments during earth history and the resulting struggle for existence, … but is governed by definite laws.… The discovery of these laws constitutes one of the most important tasks of the future. —Ludwig von Bertalanffy,3 biologist
Donella H. Meadows (Thinking in Systems: A Primer)
The most marvelous characteristic of some complex systems is their ability to learn, diversify, complexify, evolve. It is the ability of a single fertilized ovum to generate, out of itself, the incredible complexity of a mature frog, or chicken, or person. It is the ability of nature to have diversified millions of fantastic species out of a puddle of organic chemicals. It is the ability of a society to take the ideas of burning coal, making steam, pumping water, and specializing labor, and develop them eventually into an automobile assembly plant, a city of skyscrapers, a worldwide network of communications.
Donella H. Meadows (Thinking in Systems: A Primer)
Self-organization produces heterogeneity and unpredictability. It is likely to come up with whole new structures, whole new ways of doing things. It requires freedom and experimentation, and a certain amount of disorder. These conditions that encourage self-organization often can be scary for individuals and threatening to power structures.
Donella H. Meadows (Thinking in Systems: A Primer)
A quantity growing exponentially toward a constraint or limit reaches that limit in a surprisingly short time.
Donella H. Meadows (Thinking in Systems: A Primer)
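The "surprisingly short time" can be checked with a doubling sketch. The 3%-full starting point and the limit of 100 are invented illustration values:

```python
# Doubling toward a fixed limit: even when the limit still looks far away,
# only a handful of doubling periods remain. Numbers are invented illustrations.

def periods_to_limit(current, limit, growth=2.0):
    """Count growth periods until `current` reaches or exceeds `limit`."""
    periods = 0
    while current < limit:
        current *= growth
        periods += 1
    return periods

print(periods_to_limit(3, 100))   # only 3% "full", yet just 6 doublings from the limit
print(periods_to_limit(50, 100))  # half full: a single doubling left
```

This is the familiar lily-pond effect: a system that looks nearly empty for most of its history hits its constraint within a few more doubling periods.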
Remember—all system diagrams are simplifications of the real world.
Donella H. Meadows (Thinking in Systems: A Primer)
When there are long delays in feedback loops, some sort of foresight is essential. To act only when a problem becomes obvious is to miss an important opportunity to solve the problem.
Donella H. Meadows (Thinking in Systems: A Primer)
Change comes first from stepping outside the limited information that can be seen from any single place in the system and getting an overview. From a wider perspective, information flows, goals, incentives, and disincentives can be restructured so that separate, bounded, rational actions do add up to results that everyone desires.
Donella H. Meadows (Thinking in Systems: A Primer)
The most effective way of dealing with policy resistance is to find a way of aligning the various goals of the subsystems, usually by providing an overarching goal that allows all actors to break out of their bounded rationality. If everyone can work harmoniously toward the same outcome (if all feedback loops are serving the same goal), the results can be amazing. The most familiar examples of this harmonization of goals are mobilizations of economies during wartime, or recovery after war or natural disaster.
Donella H. Meadows (Thinking in Systems: A Primer)
“Intrinsic responsibility” means that the system is designed to send feedback about the consequences of decision making directly and quickly and compellingly to the decision makers. Because the pilot of a plane rides in the front of the plane, that pilot is intrinsically responsible. He or she will experience directly the consequences of his or her decisions.
Donella H. Meadows (Thinking in Systems: A Primer)
In a strict systems sense, there is no long-term, short-term distinction. Phenomena at different time-scales are nested within each other. Actions taken now have some immediate effects and some that radiate out for decades to come. We experience now the consequences of actions set in motion yesterday and decades ago and centuries ago.
Donella H. Meadows (Thinking in Systems: A Primer)
Information is power. Anyone interested in power grasps that idea very quickly. The media, the public relations people, the politicians, and advertisers who regulate much of the public flow of information have far more power than most people realize. They filter and channel information. Often they do so for short-term, self-interested purposes. It’s no wonder that our social systems so often run amok.
Donella H. Meadows (Thinking in Systems: A Primer)
Markets tend toward monopoly and ecological niches toward monotony, but they also create offshoots of diversity, new markets, new species, which in the course of time may attract competitors, which then begin to move the system toward competitive exclusion again.
Donella H. Meadows (Thinking in Systems: A Primer)
Missing information flows is one of the most common causes of system malfunction. Adding or restoring information can be a powerful intervention, usually much easier and cheaper than rebuilding physical infrastructure. The tragedy of the commons that is crashing the world’s commercial fisheries occurs because there is little feedback from the state of the fish population to the decision to invest in fishing vessels. Contrary to economic opinion, the price of fish doesn’t provide that feedback. As the fish get more scarce they become more expensive, and it becomes all the more profitable to go out and catch the last few. That’s a perverse feedback, a reinforcing loop that leads to collapse. It is not price information but population information that is needed.
Donella H. Meadows (Thinking in Systems: A Primer)
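The perverse price feedback Meadows describes, versus a quota keyed to population information, can be caricatured in a few lines. Every parameter here (the price curve, cost, regeneration rate, and quota fraction) is an invented assumption:

```python
# Perverse price feedback: scarcity raises the price, so catching the last fish
# stays profitable. A quota keyed to measured population breaks the loop.
# Every parameter (price curve, cost, regeneration, quota fraction) is invented.

def fishery(stock=1000.0, effort=50.0, regen=0.05, cost_per_unit=1.0,
            years=40, use_population_info=False):
    for _ in range(years):
        if use_population_info:
            effort = min(effort, 0.04 * stock)  # quota tied to measured stock
        else:
            price = 1000.0 / max(stock, 1.0)    # scarcity drives the price up...
            if price >= cost_per_unit:          # ...so fishing stays profitable
                effort *= 1.05                  # reinforcing loop: buy more boats
        catch = min(effort, stock)
        stock = max((stock - catch) * (1.0 + regen), 0.0)
    return stock

print(round(fishery()), round(fishery(use_population_info=True)))  # collapse vs. sustained
```

With only price information, rising scarcity never discourages investment and the stock collapses; feeding population information back into the effort decision sustains it.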
In May 1985 the historic paper was published that announced an “ozone hole” in the Southern Hemisphere.11 The news shocked the scientific world. If true, the results proved that humankind had already exceeded a global limit. CFC use had grown above sustainable limits. Humans were already in the process of destroying their ozone shield. Scientists at the National Aeronautics and Space Administration of the United States (NASA) scrambled to check readings on atmospheric ozone made by the Nimbus 7 satellite, measurements that had been taken routinely since 1978. Nimbus 7 had never indicated an ozone hole. Checking back, NASA scientists found that their computers had been programmed to reject very low ozone readings on the assumption that such low readings must indicate instrument error.12 Fortunately the measurements thrown out by the computer were recoverable. They confirmed the Halley Bay observations, showing that ozone levels had been dropping over the South Pole for a decade. Furthermore, they provided a detailed map of the hole in the ozone layer. It was enormous, about the size of the continental United States, and it had been getting larger and deeper every year.
Donella H. Meadows (Limits to Growth: The 30-Year Update)
Everything is connected to everything else, and not neatly.
Donella H. Meadows (Thinking In Systems: A Primer)
Resilience, self-organization, and hierarchy are three of the reasons dynamic systems can work so well. Promoting or managing for these properties of a system can improve its ability to function well over the long term—to be sustainable.
Donella H. Meadows (Thinking in Systems: A Primer)
Everything we think we know about the world is a model. Every word and every language is a model. All maps and statistics, books and databases, equations and computer programs are models.
Donella H. Meadows (Thinking in Systems: A Primer)
our models fall far short of representing the world fully. That is why we make mistakes and why we are regularly surprised. In our heads, we can keep track of only a few variables at one time. We often draw illogical conclusions from accurate assumptions, or logical conclusions from inaccurate assumptions.
Donella H. Meadows (Thinking in Systems: A Primer)
You can’t navigate well in an interconnected, feedback-dominated world unless you take your eyes off short-term events and look for long-term behavior and structure; unless you are aware of false boundaries and bounded rationality; unless you take into account limiting factors, nonlinearities and delays. You are likely to mistreat, misdesign, or misread systems if you don’t respect their properties of resilience, self-organization, and hierarchy.
Donella H. Meadows (Thinking in Systems: A Primer)
Systems fool us by presenting themselves—or we fool ourselves by seeing the world—as a series of events. The daily news tells of elections, battles, political agreements, disasters, stock market booms or busts. Much of our ordinary conversation is about specific happenings at specific times and places. A team wins. A river floods. The Dow Jones Industrial Average hits 10,000. Oil is discovered. A forest is cut. Events are the outputs, moment by moment, from the black box of the system.
Donella H. Meadows (Thinking in Systems: A Primer)
long-term behavior provides clues to the underlying system structure. And structure is the key to understanding not just what is happening, but why.
Donella H. Meadows (Thinking in Systems: A Primer)
A linear relationship between two elements in a system can be drawn on a graph with a straight line. It’s a relationship with constant proportions. If I put 10 pounds of fertilizer on my field, my yield will go up by 2 bushels. If I put on 20 pounds, my yield will go up by 4 bushels. If I put on 30 pounds, I’ll get an increase of 6 bushels.
Donella H. Meadows (Thinking in Systems: A Primer)
A nonlinear relationship is one in which the cause does not produce a proportional effect. The relationship between cause and effect can only be drawn with curves or wiggles, not with a straight line. If I put 100 pounds of fertilizer on, my yield will go up by 10 bushels; if I put on 200, my yield will not go up at all; if I put on 300, my yield will go down. Why? I’ve damaged my soil with “too much of a good thing.”
Donella H. Meadows (Thinking in Systems: A Primer)
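The two fertilizer examples (this quote and the linear one just above it) can be written out directly. The quadratic form for the nonlinear case is my assumption; the quotes supply only the data points:

```python
# Meadows's fertilizer numbers written as yield-gain functions of pounds applied.

def linear_yield_gain(pounds):
    """Constant proportions: every 10 lb adds 2 bushels (0.2 bushels per lb)."""
    return 0.2 * pounds

def nonlinear_yield_gain(pounds):
    """A quadratic through the quoted points: +10 bushels at 100 lb, no gain at
    200 lb, and a loss at 300 lb (this fit gives -30). The quadratic shape is
    an assumption; the quote supplies only the points."""
    return 0.2 * pounds - 0.001 * pounds ** 2

print(linear_yield_gain(10), linear_yield_gain(30))  # 2.0 6.0
print(nonlinear_yield_gain(100), nonlinear_yield_gain(200), nonlinear_yield_gain(300))  # 10.0 0.0 -30.0
```

The linear function returns the same gain per pound no matter how much is applied; in the assumed quadratic, each additional pound helps less than the last until it actively hurts.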
If a factory is torn down but the rationality which produced it is left standing, then that rationality will simply produce another factory. If a revolution destroys a government, but the systematic patterns of thought that produced that government are left intact, then those patterns will repeat themselves.… There’s so much talk about the system. And so little understanding. —ROBERT PIRSIG, Zen and the Art of Motorcycle Maintenance
Donella H. Meadows (Thinking in Systems: A Primer)
So, what is a system? A system is a set of things—people, cells, molecules, or whatever—interconnected in such a way that they produce their own pattern of behavior over time. The system may be buffeted, constricted, triggered, or driven by outside forces. But the system’s response to these forces is characteristic of itself, and that response is seldom simple in the real world.
Donella H. Meadows (Thinking in Systems: A Primer)
We are complex systems—our own bodies are magnificent examples of integrated, interconnected, self-maintaining complexity. Every person we encounter, every organization, every animal, garden, tree, and forest is a complex system.
Donella H. Meadows (Thinking in Systems: A Primer)
A system* is an interconnected set of elements that is coherently organized in a way that achieves something. If you look at that definition closely for a minute, you can see that a system must consist of three kinds of things: elements, interconnections, and a function or purpose.
Donella H. Meadows (Thinking in Systems: A Primer)
I’d need rest to refresh my brain, and to get rest it’s necessary to travel, and to travel one must have money, and in order to get money you have to work.… I am in a vicious circle … from which it is impossible to escape. —Honoré Balzac, 19th-century novelist and playwright
Donella H. Meadows (Thinking in Systems: A Primer)
If A causes B, is it possible that B also causes A? You’ll be thinking not in terms of a static world, but a dynamic one. You’ll stop looking for who’s to blame; instead you’ll start asking, “What’s the system?” The concept of feedback opens up the idea that a system can cause its own behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
The information delivered by a feedback loop—even nonphysical feedback—can only affect future behavior; it can’t deliver a signal fast enough to correct behavior that drove the current feedback. Even nonphysical information takes time to feed back into the system.
Donella H. Meadows (Thinking in Systems: A Primer)
Systems surprise us because our minds like to think about single causes neatly producing single effects. We like to think about one or at most a few things at a time. And we don’t like, especially when our own plans and desires are involved, to think about limits.
Donella H. Meadows (Thinking in Systems: A Primer)
If the desired system state is good education, measuring that goal by the amount of money spent per student will ensure money spent per student. If the quality of education is measured by performance on standardized tests, the system will produce performance on standardized tests. Whether either of these measures is correlated with good education is at least worth thinking about.
Donella H. Meadows (Thinking in Systems: A Primer)
This idea of leverage points is not unique to systems analysis—it’s embedded in legend: the silver bullet; the trimtab; the miracle cure; the secret passage; the magic password; the single hero who turns the tide of history; the nearly effortless way to cut through or leap over huge obstacles. We not only want to believe that there are leverage points, we want to know where they are and how to get our hands on them. Leverage points are points of power.
Donella H. Meadows (Thinking in Systems: A Primer)
Another of Forrester’s classics was his study of urban dynamics, published in 1969, which demonstrated that subsidized low-income housing is a leverage point. The less of it there is, the better off the city is—even the low-income folks in the city. This model came out at a time when national policy dictated massive low-income housing projects, and Forrester was derided. Since then, many of those projects have been torn down in city after city. Counterintuitive—that’s Forrester’s word to describe complex systems. Leverage points frequently are not intuitive. Or if they are, we too often use them backward, systematically worsening whatever problems we are trying to solve.
Donella H. Meadows (Thinking in Systems: A Primer)
Harmonization of goals in a system is not always possible, but it’s an option worth looking for. It can be found only by letting go of more narrow goals and considering the long-term welfare of the entire system.
Donella H. Meadows (Thinking in Systems: A Primer)
The trap called the tragedy of the commons comes about when there is escalation, or just simple growth, in a commonly shared, erodable environment. Ecologist Garrett Hardin described the commons system in a classic article in 1968.
Donella H. Meadows (Thinking in Systems: A Primer)
There are two antidotes to eroding goals. One is to keep standards absolute, regardless of performance. Another is to make goals sensitive to the best performances of the past, instead of the worst. If perceived performance has an upbeat bias instead of a downbeat one, if one takes the best results as a standard, and the worst results only as a temporary setback, then the same system structure can pull the system up to better and better performance.
Donella H. Meadows (Thinking in Systems: A Primer)
There is a systematic tendency on the part of human beings to avoid accountability for their own decisions. That’s why there are so many missing feedback loops—and why this kind of leverage point is so often popular with the masses, unpopular with the powers that be, and effective, if you can get the powers that be to permit it to happen (or go around them and make it happen anyway).
Donella H. Meadows (Thinking in Systems: A Primer)
Nonrenewable resources are stock-limited. The entire stock is available at once, and can be extracted at any rate (limited mainly by extraction capital). But since the stock is not renewed, the faster the extraction rate, the shorter the lifetime of the resource.
Donella H. Meadows (Thinking in Systems: A Primer)
Renewable resources are flow-limited. They can support extraction or harvest indefinitely, but only at a finite flow rate equal to their regeneration rate. If they are extracted faster than they regenerate, they may eventually be driven below a critical threshold and become, for all practical purposes, nonrenewable.
Donella H. Meadows (Thinking in Systems: A Primer)
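The contrast drawn in the two quotes above, stock-limited versus flow-limited resources, can be sketched as a toy simulation. This is an illustrative assumption-laden model, not from the book: extraction and regeneration rates are constants, whereas real regeneration usually depends on the remaining stock.

```python
def nonrenewable_lifetime(stock, extraction_rate):
    """Stock-limited resource: the faster the extraction rate,
    the shorter the lifetime. The stock is never renewed."""
    years = 0
    while stock > 0:
        stock -= extraction_rate
        years += 1
    return years

def renewable_harvest(stock, regen_rate, harvest_rate, years=50):
    """Flow-limited resource: harvesting at or below the regeneration
    rate is sustainable indefinitely; harvesting above it drives the
    stock toward zero. (Constant regen_rate is a simplification.)"""
    for _ in range(years):
        stock = max(0.0, stock + regen_rate - harvest_rate)
    return stock
```

Doubling the extraction rate of a nonrenewable stock halves its lifetime, while a renewable stock harvested exactly at its regeneration rate holds steady forever.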
A delay in a balancing feedback loop makes a system likely to oscillate.
Donella H. Meadows (Thinking in Systems: A Primer)
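The one-line principle above, that a delay in a balancing loop makes a system likely to oscillate, is easy to demonstrate numerically. The following sketch is not from the book; the goal, gain, and delay values are arbitrary assumptions. A stock is adjusted toward a goal, but the correction is based on a reading of the stock from several steps earlier.

```python
def simulate(goal=100.0, delay=5, steps=60, gain=0.5):
    """Balancing loop with an information delay: the inflow each step
    corrects toward the goal, but using a stale reading of the stock."""
    history = [0.0]  # stock starts empty
    for _ in range(steps):
        # Perceive the stock as it was `delay` steps ago.
        perceived = history[max(0, len(history) - 1 - delay)]
        correction = gain * (goal - perceived)  # balancing feedback
        history.append(history[-1] + correction)
    return history

trace = simulate()
# With the delay, the stock badly overshoots the goal of 100 and then
# undershoots it; with delay=0 the same loop approaches 100 smoothly.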
any physical, growing system is going to run into some kind of constraint, sooner or later. That constraint will take the form of a balancing loop that in some way shifts the dominance of the reinforcing loop driving the growth behavior, either by strengthening the outflow or by weakening the inflow.
Donella H. Meadows (Thinking in Systems: A Primer)
Growth in a constrained environment is very common, so common that systems thinkers call it the “limits-to-growth” archetype.
Donella H. Meadows (Thinking in Systems: A Primer)
Whenever we see a growing entity, whether it be a population, a corporation, a bank account, a rumor, an epidemic, or sales of a new product, we look for the reinforcing loops that are driving it and for the balancing loops that ultimately will constrain it. We know those balancing loops are there, even if they are not yet dominating the system’s behavior, because no real physical system can grow forever.
Donella H. Meadows (Thinking in Systems: A Primer)
The limits on a growing system may be temporary or permanent. The system may find ways to get around them for a short while or a long while, but eventually there must come some kind of accommodation, the system adjusting to the constraint, or the constraint to the system, or both to each other.
Donella H. Meadows (Thinking in Systems: A Primer)
Delays are pervasive in systems, and they are strong determinants of behavior. Changing the length of a delay may (or may not, depending on the type of delay and the relative lengths of other delays) make a large change in the behavior of a system.
Donella H. Meadows (Thinking in Systems: A Primer)
A population has a reinforcing loop causing it to grow through its birth rate, and a balancing loop causing it to die off through its death rate.
Donella H. Meadows (Thinking in Systems: A Primer)
This behavior is an example of shifting dominance of feedback loops. Dominance is an important concept in systems thinking. When one loop dominates another, it has a stronger impact on behavior. Because systems often have several competing feedback loops operating simultaneously, those loops that dominate the system will determine the behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
Complex behaviors of systems often arise as the relative strengths of feedback loops shift, causing first one loop and then another to dominate behavior.
Donella H. Meadows (Thinking in Systems: A Primer)
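The population example in the quotes above, a reinforcing birth loop and a balancing death loop whose relative strengths shift, can be sketched as follows. This is a minimal illustration, not from the book: the rates are invented, and the death loop is given a crowding term (a logistic-style assumption) so that dominance shifts as the population grows.

```python
def step(pop, birth_rate=0.05, death_base=0.01, crowding=1e-5):
    """One time step: births are a reinforcing loop (proportional to
    the stock); deaths are a balancing loop whose per-capita rate
    rises with crowding."""
    births = birth_rate * pop
    deaths = (death_base + crowding * pop) * pop
    return pop + births - deaths

pop, trace = 1000.0, []
for _ in range(400):
    trace.append(pop)
    pop = step(pop)
# Early on the birth loop dominates and the population grows; as the
# stock rises, the death loop strengthens, dominance shifts, and growth
# levels off near the equilibrium (4000 with these assumed rates).
```

The shift in which loop dominates is visible in the trace: the per-step increments first grow, then shrink toward zero, even though neither loop's structure ever changes.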
During the 1930s, Sweden’s birth rate dropped precipitously, and, like the governments of Romania and Hungary, the Swedish government worried about that. Unlike Romania and Hungary, the Swedish government assessed its goals and those of the population and decided that there was a basis of agreement, not on the size of the family, but on the quality of child care. Every child should be wanted and nurtured. No child should be in material need. Every child should have access to excellent education and health care. These were goals around which the government and the people could align themselves. The resulting policy looked strange during a time of low birth rate, because it included free contraceptives and abortion—because of the principle that every child should be wanted. The policy also included widespread sex education, easier divorce laws, free obstetrical care, support for families in need, and greatly increased investment in education and health care. Since then, the Swedish birth rate has gone up and down several times without causing panic in either direction, because the nation is focused on a far more important goal than the number of Swedes.
Donella H. Meadows (Thinking in Systems: A Primer)
THE TRAP: POLICY RESISTANCE When various actors try to pull a system stock toward various goals, the result can be policy resistance. Any new policy, especially if it’s effective, just pulls the stock farther from the goals of other actors and produces additional resistance, with a result that no one likes, but that everyone expends considerable effort in maintaining. THE WAY OUT Let go. Bring in all the actors and use the energy formerly expended on resistance to seek out mutually satisfactory ways for all goals to be realized—or redefinitions of larger and more important goals that everyone can pull toward together.
Donella H. Meadows (Thinking in Systems: A Primer)
How do you break out of the trap of success to the successful? Species and companies sometimes escape competitive exclusion by diversifying. A species can learn or evolve to exploit new resources. A company can create a new product or service that does not directly compete with existing ones. Markets tend toward monopoly and ecological niches toward monotony, but they also create offshoots of diversity, new markets, new species, which in the course of time may attract competitors, which then begin to move the system toward competitive exclusion again.
Donella H. Meadows (Thinking in Systems: A Primer)
Faith in technology as the ultimate solution to all problems can thus divert our attention from the most fundamental problem—the problem of growth in a finite system—and prevent us from taking effective action to solve it.
Donella H. Meadows (The Limits to Growth)
Diversification is not guaranteed, however, especially if the monopolizing firm (or species) has the power to crush all offshoots, or buy them up, or deprive them of the resources they need to stay alive. Diversification doesn’t work as a strategy for the poor.
Donella H. Meadows (Thinking in Systems: A Primer)
These examples confuse effort with result, one of the most common mistakes in designing systems around the wrong goal.
Donella H. Meadows (Thinking in Systems: A Primer)
The GNP lumps together goods and bads. (If there are more car accidents and medical bills and repair bills, the GNP goes up.) It counts only marketed goods and services. (If all parents hired people to bring up their children, the GNP would go up.) It does not reflect distributional equity. (An expensive second home for a rich family makes the GNP go up more than an inexpensive basic home for a poor family.) It measures effort rather than achievement, gross production and consumption rather than efficiency. New light bulbs that give the same light with one-eighth the electricity and that last ten times as long make the GNP go down. GNP is a measure of throughput—flows of stuff made and purchased in a year—rather than capital stocks, the houses and cars and computers and stereos that are the source of real wealth and real pleasure.
Donella H. Meadows (Thinking in Systems: A Primer)
Delays in feedback loops are critical determinants of system behavior. They are common causes of oscillations. If you’re trying to adjust a stock (your store inventory) to meet your goal, but you receive only delayed information about what the state of the stock is, you will overshoot and undershoot your goal. The same is true if your information is timely, but your response isn’t. For example, it takes several years to build an electric power plant that will likely last thirty years. Those delays make it impossible to build exactly the right number of power plants to supply rapidly changing demand for electricity. Even with immense effort at forecasting, almost every electricity industry in the world experiences long oscillations between overcapacity and undercapacity. A system just can’t respond to short-term changes when it has long-term delays. That’s why a massive central-planning system, such as the Soviet Union or General Motors, necessarily functions poorly.
Donella H. Meadows (Thinking in Systems: A Primer)