Book Excerpts: Systems 1

Systems 1

By Draper L. Kauffman. 1980.

 
Foreword

Both in school and out, most people are exposed to knowledge in bits and pieces and get very little help in tying these chunks together into an overall pattern that makes sense. Whether you can create such a pattern for yourself over the years is largely up to you, but a systems approach at least provides a consistent frame of reference and a way of fitting the pieces together as you come to them.

 
Chapter 3: Things In Common

If you want to change a situation which is controlled by a negative feedback loop, it is much better to try to change the way that pieces interact than to try to “out-muscle” the system. But that means that you first have to figure out what the system is and how it works.

 
Appendix: System Notes

1. Everything is connected to everything else. Real life is lived in a complex world system where all the subsystems overlap and affect each other. The common mistake is to deal with one subsystem in isolation, as if it didn’t connect with anything else. This almost always backfires as other subsystems respond in unanticipated ways.

2. You can never do just one thing. This follows from rule #1: in addition to the immediate effects of an action, there will always be other consequences of it which ripple through the system.

3. There is no “away.” Another corollary of #1. In natural ecosystems, in particular, you can move something from one place to another, you can transform it into something else, but you can’t get rid of it. As long as it is on the Earth, it is part of the global ecosystem. The industrial poisons, pollutants, insecticides, and radioactive materials that we’ve tried to “throw away” in the past have all too often come back to haunt us because people didn’t understand this rule.

4. TANSTAAFL: There Ain’t No Such Thing As A Free Lunch. Years ago, bars used to offer a “free lunch” as a way to draw customers. Of course, the drinks in those bars cost twice as much, so the lunches weren’t really “free” at all. Similarly, in complex systems, what looks like the cheapest solution to a problem often turns out to be the most expensive one in the long run. TANSTAAFL is a way of saying, “Don’t expect something for nothing, there’s always a hidden cost somewhere.”

5. Nature knows best. Natural ecosystems evolved over millions of years, and everything in them has a role to play. Be very suspicious of any proposal to alter or eliminate any “useless” part of the system. If it looks useless, it just means that you don’t understand its function and the risk of doing harm is that much greater. (For example, people have been draining “useless” marshes and swamps for centuries. Now, it turns out that these areas are vital for removing water pollution and as breeding grounds for economically important wildlife and fisheries. An area worth $10,000 as dry land may produce $100,000 worth of fish a year as a marsh.) When in doubt, be careful, and always try to find a “natural” solution to the problem if at all possible.

6. It ain’t what you don’t know that hurts you; it’s what you DO know that ain’t so. Beware of false assumptions about system behavior. When we are sure of something, we usually don’t bother to look for proof that it is true and we may be blind to evidence that it is false. We are much more likely to make really big blunders when we act on false assumptions than when we are uncertain and aware of our own uncertainty.

7. “Obvious solutions” do more harm than good. All complex systems use negative feedback to negate external changes in the system. If you try to change something in the direct, “obvious” way, the system is going to treat your efforts like any other outside influence and do its best to neutralize them. The more energy you waste fighting the system head on, the more energy it will waste fighting back, and any gains you make will be only temporary at best. Finally, if you try hard enough and long enough, you will exhaust the system’s ability to fight back—at which point the system will break down completely!

8. Look for high leverage points. Nearly every feedback system has weak spots. These are almost always the control points which measure the system’s behavior and determine its response to change. The best way to change a system’s behavior is either to change the “setting” of the control unit or to change the information which the control unit receives. If you want to make a cold house warmer, turn the thermostat up or stick an ice pack on it, but don’t build a fire in the fireplace—it won’t do any good.
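The thermostat example can be put in the form of a toy simulation. This is my own sketch, not the book's: the numbers (gain, heat-loss rate, temperatures) are all illustrative, and the furnace is modeled as simple proportional control.

```python
# Toy sketch of rule #8: a thermostat as a negative feedback loop.
# The control unit compares a sensor reading to a setpoint and runs the
# furnace harder the colder the reading looks. All numbers are illustrative.

def settle(setpoint, sensor_bias=0.0, extra_heat=0.0, steps=2000, dt=0.1):
    """Return the room temperature the loop settles at (degrees C)."""
    temp, outside, gain = 10.0, 10.0, 10.0
    for _ in range(steps):
        reading = temp + sensor_bias                   # what the thermostat "sees"
        furnace = max(0.0, gain * (setpoint - reading))  # proportional control
        loss = 0.5 * (temp - outside)                  # heat leaking outdoors
        temp += dt * (furnace + extra_heat - loss)
    return round(temp, 1)

print(settle(20))                    # baseline: settles just below the setpoint
print(settle(25))                    # leverage: turn the setting up
print(settle(20, sensor_bias=-5))    # leverage: the "ice pack" fools the sensor
print(settle(20, extra_heat=2))      # fireplace: the furnace backs off to compensate
```

The last two lines are the point: biasing the sensor shifts the whole equilibrium, while pumping in heat directly barely moves it, because the control loop treats the fireplace as just another disturbance to neutralize.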

9. Nothing grows forever. The exponential growth curves produced by positive feedback keep on growing only in mathematics. In the real world, growth always stops sooner or later, and the faster the growth, the sooner it will stop. If the Earth’s human population could continue to grow at its current rate for another seven centuries, we would be the only living things on the planet. After just ten more centuries, the mass of bodies would outweigh the entire rest of the planet—an obvious impossibility. If energy use continued to grow at its current rate for another 400 years, the surface of the earth would be hotter than the sun. And at current rates of growth in food consumption, we would have to eat everything on the planet in a single year only five centuries from now. Obviously, these projections are ridiculous and the growth of population, energy use, and food consumption will stop long before such extremes are reached. The question is, how soon and in what way?
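The arithmetic behind projections like these is just repeated doubling. A quick sketch, using an illustrative 2% annual growth rate rather than the book's exact figures:

```python
import math

def doublings(rate, years):
    """Number of times a quantity doubles growing at `rate` per year."""
    return years * math.log1p(rate) / math.log(2)

# At 2% a year the doubling time is about 35 years:
print(math.log(2) / math.log1p(0.02))

# So 700 years of 2% growth is about 20 doublings -- roughly a
# millionfold increase, which is why long exponential projections
# turn absurd so quickly:
d = doublings(0.02, 700)
print(round(d), f"doublings, about a factor of {2 ** d:,.0f}")
```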

10. Don’t fight positive feedback; support negative feedback instead. Don’t poison pests, support their predators. Don’t order people to have fewer children, make it more profitable for them to have small families instead. Don’t ration energy, raise the price instead (and give the money back by cutting taxes somewhere else, like the social security tax). And so on. England used a version of this rule for centuries in European politics. Whenever one nation or group got too strong, England would throw its support to the weaker side. (Don’t try to weaken your enemy, strengthen your enemy’s enemies instead.)

11. Don’t try to control the players, just change the rules. When the National Football League wanted to make football games a bit more exciting, it could have ordered quarterbacks to throw more passes. If it had, teams would have looked for ways to evade the order, perhaps by throwing a few more short, safe passes, and the game would still have been dull. Instead, the league changed the rules slightly so that pass plays would have a better chance of working. As a result, teams were aggressive about taking advantage of the new opportunities to pass. The same principle applies in economics, politics, science, education, and many other areas. If the system tries to make choices for people, the people will try to outwit the system. It is much more effective to change the “rules of the game” so that it is to most people’s advantage to make the choices that are good for the whole system.

12. Don’t make rules that can’t be enforced. If many people want to disobey a law and nearly all of them are able to get away with it, then the law will not be obeyed. But this gets people used to disobeying laws, and it reduces respect for laws in general. It also creates ideal opportunities for corruption, blackmail, and the acceptance of organized crime. A society that really gets serious about enforcing unenforceable laws can tear itself apart. (See, for example, the tremendous damage done by witch-hunts, inquisitions, and civil wars that result from enforcing laws against thinking certain kinds of religious or political thoughts.) The same problem arises in business, government, and many other kinds of systems, where a higher level system is weakened by trying to overcontrol lower sub-systems.

13. There are no simple solutions. Real-life systems are big, messy, complicated things, with problems to match. Genuine solutions require careful thought for their effect on the whole system. Anyone who tries to sell you a simple answer—“All we have to do is . . . and everything will be perfect!”—is either honestly dumb, or dishonest and probably running for office.

14. Good intentions are not enough. Few things are more painful than trying to do good and finding out that you’ve done a great deal of harm instead. Simple compassion and simple morality are inadequate in a complex world. The bumbling missionary causes tragedy because he follows his heart without using his head to try to understand the whole situation.

15. High morality depends on accurate prophecy. You cannot judge the morality of an action unless you have some idea of what the consequences of the action will be. According to this point of view, an action cannot be good if it has evil results, and everyone has a moral obligation to try to foresee, as well as possible, what the results of various decisions will be.

16. If you can’t make people self-sufficient, your aid does more harm than good. This usually comes up in discussing problems of poverty or hunger, where temporary relief often postpones the disaster at the cost of making it much worse when it comes. It is not really an argument against helping, but an argument against half-way measures. Gandhi said the same thing in a more positive way: “If you give me a fish, I eat for a day; if you teach me to fish, I eat for a lifetime.”

17. There are no final answers. As Ken Boulding put it, “If all environments were stable, the well-adapted would simply take over the earth and the evolutionary process would stop. In a period of environmental change, however, it is the adaptable, not the well-adapted who survive.” This applies to social systems as well as natural ones. In a time of rapid change, like the present, the best “solution” to a problem is often one that just keeps the problem under control while keeping as many options open for the future as possible.

18. Every solution creates new problems. The auto solved the horse-manure pollution problem and created an air pollution problem. Modern medicine brought us longer, healthier lives—and a population explosion that threatens to produce a global famine. Television brings us instant access to vital information and world events—and a mind-numbing barrage of banality and violence. And so on. The important thing is to try to anticipate the new problems and decide whether we prefer them to the problems we are currently trying to solve. Sometimes the “best” solution to one problem just creates a worse problem. There may even be no solution to the new problem. On the other hand, an apparently “inferior” solution to the original problem may be much better for the whole system in the long run.

19. Loose systems are often better. Diverse, decentralized systems often seem disorganized and wasteful, but they are almost always more stable, flexible, and efficient in the long run than “neater” systems. In Boulding’s terms (#17), highly adaptable systems look sloppy compared to systems that are well-adapted to a specific situation, but the sloppy-looking systems are the ones that will survive. In addition, systems which are loose enough to tolerate moderate fluctuations in things like population levels, food supply, or prices, are more efficient than systems which waste energy and resources on tighter controls.

20. Don’t be fooled by system cycles. All negative feedback loops create oscillations—some large, some small. For some reason, many people are unable to deal with or believe in cyclical patterns, especially if the cycles are more than two or three years in length. If the economy has been growing steadily for the last four years, nearly everyone will be optimistic. They simply project their recent experience ahead into the future, forgetting that a recession becomes more likely the longer the boom continues. Similarly, everyone is gloomiest at the bottom of a recession, just when rapid growth is most likely.

Highly visible job categories often fluctuate in the same way. When a temporary oversupply of workers develops in a particular field, everyone talks about the big surplus and young people are steered away from the field. Within a few years, this creates a shortage, jobs go begging, and young people are frantically urged into the field—which creates a surplus. Obviously, the best time to start training for such a job is when people have been talking about a surplus for several years and few others are entering it. That way, you finish your training just as the shortage develops.

The problem is that most people have short memories and tend to project the recent past forward on a straight line. As a result, you get a repeating boom-and-bust pattern: several years of shortage breed a surplus, and several years of surplus breed the next shortage.

21. Remember the Golden Mean. When people face a serious problem, they tend to overvalue anything that helps solve it. They mobilize their energies and fight hard to solve the problem, and often keep right on going after the problem is solved and their solution is becoming a new problem. When most children died before their tenth birthdays, a high birth rate was essential for survival and societies developed powerful ways to encourage people to have large families. When the death rate is reduced, a high birth rate becomes a liability, but all those strong cultural forces keep right on encouraging large families, and it can take generations for people’s attitudes to change. Like the man who eats himself to death as an adult because he was always hungry as a child, people tend to forget that too much of something can be as bad as too little. They assume that if more of something is good, a lot more must be better—but it often isn’t. The trick is to recognize these situations and try to swing the pendulum back to the middle whenever it swings toward either extreme.

22. Beware the empty compromise. There are also times when the middle ground is worse than either extreme. There’s an old, old fable about an ass who starved to death halfway between two bales of hay because it couldn’t make up its mind which one to eat first. Sometimes you just have to choose, because a compromise won’t work. The only way to tell is to examine the entire system carefully and try to anticipate what the results of different decisions will be.

23. Don’t be a boiled frog. Some systems are designed so that they can react to any change that is larger than a certain amount, but they can’t respond to changes that are below that threshold. For example, if a frog is put in a pan of hot water, he will jump right out. But if he is put in a pan of cool water and the water is then gradually heated up, the frog will happily sit there and let himself be cooked. As long as the change is slow enough, it doesn’t trigger a response. Sometimes a country can use this tactic to defeat an enemy in a patient series of small steps. Each step weakens the opponent a little bit, but is “not worth going to war over,” until finally the victim is too weak to resist an attack. (These are sometimes called “salami-slicing tactics.” “Divide and conquer” is another version of the same thing.) While a healthy system shouldn’t overreact to small changes, it has to be able to identify and respond to a series of small changes that will bring disaster if allowed to continue.
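The boiled-frog trap can be made concrete with a short sketch (my example, not the book's): a monitor that only reacts to big single-step jumps never fires on a slow drift that ends at exactly the same dangerous level, while a monitor that compares against a baseline catches it.

```python
def step_alarm(readings, threshold=5):
    """Fire only when one reading jumps more than `threshold` past the last."""
    return any(abs(b - a) > threshold for a, b in zip(readings, readings[1:]))

def drift_alarm(readings, threshold=5):
    """Fire when any reading drifts more than `threshold` from the baseline."""
    return any(abs(r - readings[0]) > threshold for r in readings)

sudden = [20, 20, 90]               # hot water: one 70-degree jump
gradual = list(range(20, 91, 2))    # the same 90 degrees, 2 degrees at a time

print(step_alarm(sudden), step_alarm(gradual))   # True False -- the frog boils
print(drift_alarm(gradual))                      # True -- the drift is caught
```

The fix in `drift_alarm` is the rule's last sentence in code: judge a series of small changes by where they are heading, not by the size of each step.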

24. Watch out for thresholds. Most systems change pretty gradually. But some systems are designed to switch abruptly from one kind of behavior to a completely different kind. Sometimes this is a defense against the “boiled frog” problem. (“He’s meek as a lamb until you push him too far. Then you’d better watch out!”) In other cases, it’s a way of avoiding “empty compromises” (#22). But most often it’s because the system, or a subsystem of it, has exhausted its reserves for coping with some pressure on it. (See the discussion of exposure and heat-stroke in Part One.) This can be disastrous if you are relying on a system that has seemed able to absorb a lot of abuse and it suddenly collapses as a result of something apparently trivial. Democracies, market economies, and natural ecosystems are all prone to behave in this way. They seem so sturdy that we can kick them around, interfere with subsystem after subsystem, increase the load more and more, and they will always bounce back. But we can never be sure which straw is going to break the camel’s back.

25. Competition is often cooperation in disguise. A chess player may push himself to the limit in his desire to defeat his opponent, and yet be very upset if he finds out that his opponent let him win. What appears to be a fierce competition on one level is actually part of a larger system in which both players cooperate in a ritual that gives both of them pleasure. Not “doing your best” is a violation of that cooperative agreement. Similarly, the competition between two lawyers in a courtroom is an essential part of a larger process in which lawyers, judge, and jury cooperate in a search for just answers. Businesses cooperate to keep the economy running efficiently by competing with each other in the marketplace. Political parties cooperate in running a democracy by competing with each other at the polls. And so on.

How do you tell cooperative competition from destructive competition? In cooperative competition, the opponents are willing to fight by the rules and accept the outcome of a fair contest, even if it goes against them, because they know the game will continue and they will get another chance. One reason extremist movements like communism or fascism are dangerous in a democracy is that they turn politics into destructive competition, aimed at a total victory which would put an end to the competition.

26. Bad boundaries make bad governments. Unlike most cities, St. Louis is not part of a larger county. St. Louis County surrounds the city and keeps it from expanding its city limits. As a result, the communities in the county have become parasites on the city, using the city’s commercial and cultural resources but contributing nothing toward the cost of maintaining them. As long as there is a boundary that splits the metropolitan area in half, and no government with authority over the whole area, the county will keep getting richer and the city will keep getting poorer until urban decay completely destroys it. Similar boundary problems afflict states, nations, ecosystems, and economic regions. As a general rule, the system with responsibility for a problem should include the entire problem area; authority must be congruent with responsibility or a commons problem (#27) results.

27. Beware the Tragedy of the Commons. A “commons” problem occurs when subsystems in a competitive relationship with each other are forced to act in ways that are destructive of the whole system. Usually, the source of the problem is the right of a subsystem to receive the whole benefit from using a resource while paying only a small part of the cost for it. The solution is either to divide the common resource up (not always possible) or to limit access to it.
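The commons dynamic can be sketched as a toy simulation (my numbers, not the book's): ten herders share a pasture whose grass regrows logistically. Each added animal's grazing benefit is private while the damage to the grass is shared, so without a limit on access everyone keeps adding animals until the pasture is ruined.

```python
def graze(years, cap):
    """Return grass remaining after `years`; each herder may own up to `cap` animals."""
    grass, herds = 50.0, [0] * 10                        # ten herders, bare start
    for _ in range(years):
        for i in range(10):
            if grass > 0 and herds[i] < cap:
                herds[i] += 1                            # private gain, shared cost
        regrowth = 0.5 * grass * (1 - grass / 100)       # logistic regrowth, capacity 100
        grass = max(0.0, grass + regrowth - sum(herds))  # each animal eats 1 unit a year
    return grass

print(graze(20, cap=10**9))   # unlimited access: the pasture collapses to nothing
print(graze(20, cap=1))       # access limited to one animal each: the grass recovers
```

The two runs are the two halves of the rule: dividing or limiting access keeps total grazing below the regrowth rate, while open access drives the resource to zero within a few years even though no individual herder wants that outcome.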

28. Foresight always wins in the long run. Solutions to problems affecting complex systems usually take time. If we wait until the problem develops and then react to it, there may not be time for any good solutions before a crisis point is reached. If we look ahead and anticipate a problem, however, we usually have more choices and a better chance of heading the problem off before it disrupts things. Reacting to problems means letting the system control us. Only by using foresight do we have a real chance to control the system. Or: Those who do not try to create the future they want must endure the future they get.
