For an industry that prides itself on analytical ability and abstract mental processing, we often don't do a great job applying that mental skill to the most important element of the programmer's tool chest: ourselves.

“What's the Law of Unintended Consequences?”

My CEO/business partner recently asked me this question as a result of a thread of conversation we were having around… well, you guessed it, unintended consequences. The conversation started innocently enough. We were talking about an IT assessment engagement we were doing for a manufacturing firm here in town, and we were discussing various steps the company could take toward improving. One suggestion concerned me a little, and I said something about how it could easily spiral out of control and create an entirely unintended negative effect. Trying to justify my view, I cited the Law of Unintended Consequences, which prompted his question.

Now, to be fair, my business partner isn't stupid - quite the opposite, in fact. No sooner had he heard the first thirty seconds of my explanation of said law than he said, “Oh, you mean systems thinking!”

Theory…

Systems thinking is, in many ways, the “formal” end of the Law of Unintended Consequences, and far more fully fleshed out. The Law of Unintended Consequences is a folk adage (at least, according to Wikipedia) first formally analyzed in 1936 by sociologist Robert K. Merton, but systems thinking emerged as a management concept in the late '90s. The Law of Unintended Consequences states that any action can potentially result in three categories of unintended results: an unexpected benefit (often characterized as “luck,” a “windfall,” or “serendipity,” at least by those honest enough to admit they hadn't intended that positive side effect), an unexpected drawback (a side effect that doesn't invalidate the intended result, but is clearly negative), or a perverse drawback (the exact opposite of the intended effect - also known as a “backfire”).

Systems thinking is much more subtle. It emerged out of the “system dynamics” work done by a (what else?) computer engineer in the late '50s. That engineer, Jay Forrester, wrote in 1998 that “The concept of a system contradicts the belief that people are entirely free agents.” In essence, “linear thinking” - the idea that if you do A to B, the result will be C - is flawed. Instead, doing A to B touches D and E, which in turn creates results G, J, Q, and Z… except that Z might not show up for days, weeks, months, or even years. No man is an island - no entity is, which in our world easily extends to computer programs - and touching one entity in turn touches everything that entity touches. Where linear thinking sees everything as “lines” leading directly to a result, systems thinking sees everything in terms of circles: push on one, and it loops back around, potentially pushing on the circles it touches, all of which loop around as well, creating feedback. The system links everything - people, institutions, processes, programs - but the entities themselves are not the point, per se. The point is that they are interconnected, and that they influence each other in subtle and sometimes unpredictable ways.

Another systems thinker, Peter Senge, has used systems thinking to examine the roots of the “war on terror,” and says the causes lie not in rival or competing ideologies, but in a way of thinking shared by both sides. Consider: the US establishment, using linear thinking, sees terrorist attacks as a threat to American citizens, and when a threat to American citizens emerges, the proper response is to engage the military machine and go after the perpetrators. Similarly, terrorist thinking sees US military activity as US aggressiveness in regions where the US isn't wanted, which in turn creates resentment against the US, which in turn motivates new recruits to join the cause against the US.

Whether we agree with the analysis or not, the role of systems thinking is clear: each of these “straight lines” (you threaten me with violence, I react with violence until you stop threatening me) is in fact a circle, a system of actions, reactions, and variables that influence each other, with the net result being the opposite of the intended effect. Each side behaves in the obvious manner, reacting to the threats presented to it, but their combined actions don't reduce the threat; instead, they create an ever-escalating series of threats that - again, based on the obvious reaction - yield even worse results than before the action. In essence, doing the obvious thing does not produce the obvious, desired outcome.

In system terms, each action produces some amount of feedback - not customer feedback, but a flow of influence that passes between every player in the system. Every influence is both cause and effect, eventually creating a loop that reflects back upon the actor that kicked off the influence in the first place. The feedback comes in two forms: reinforcing feedback, which amplifies the results, and balancing feedback, which damps them and pushes the system back toward equilibrium. Sometimes systems thinkers use the terms “positive feedback” and “negative feedback,” which, despite the connotations of those two terms, refer simply to whether the feedback increases the influence on the actor or decreases it.
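For those who prefer to see the distinction in code, here's a minimal sketch of the two kinds of loops - purely illustrative, with invented rates and numbers, written in TypeScript as a nod to the NodeJS example coming up later:

// A toy simulation of the two kinds of feedback over discrete time steps.
// The rates and starting values are invented purely for illustration.

// Reinforcing feedback: each step's result feeds back in and amplifies
// the next step - compound growth, for better or worse.
function reinforcing(value: number, rate: number, steps: number): number[] {
  const history: number[] = [value];
  for (let i = 0; i < steps; i++) {
    value += value * rate; // the influence loops back and amplifies itself
    history.push(value);
  }
  return history;
}

// Balancing feedback: each step pushes the value back toward a target,
// damping any deviation - think of a thermostat.
function balancing(value: number, target: number, rate: number, steps: number): number[] {
  const history: number[] = [value];
  for (let i = 0; i < steps; i++) {
    value += (target - value) * rate; // the influence loops back and corrects
    history.push(value);
  }
  return history;
}

console.log(reinforcing(1, 0.5, 8)); // 1, 1.5, 2.25, 3.375... runaway growth
console.log(balancing(1, 10, 0.5, 8)); // 1, 5.5, 7.75, 8.875... settles toward 10

Reinforcing loops are what turn small pushes into avalanches; balancing loops are what make systems stubbornly resist our pushes. Most real systems are tangles of both.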

Consider a classic example: a bully picking on another child receives positive feedback if his intended victim gets into trouble for reacting to the bullying, and negative feedback if the bully himself is hauled into detention or gets punched out. Positive feedback increases the likelihood the bully will prey again; negative feedback reduces it. But - and here's the classic systems-thinking twist - if a third party (the “hero”) does the punching (literally or figuratively), it may yield positive feedback: the hero thinks he has spared the victim, but now the bully realizes that the victim won't defend himself on his own, which means the victim is fair game whenever the hero's not around. Despite the hero's best intentions, the bullying gets worse. (This explains why many bullying victims seldom go to authority figures for help.) It's the Law of Unintended Consequences at its finest; systems thinking at its simplest.

In many ways, systems thinking is reminiscent of chaos theory, as explained in Jurassic Park by Ian Malcolm (Jeff Goldblum's character), the mathematician: “It deals with unpredictability in complex systems…. [Circumstances] never repeat, and vastly affect the outcome.” (For a reminder of the scene, see https://www.youtube.com/watch?v=n-mpifTiPV4.) The two aren't the same - chaos theory is vastly more complicated - but the family resemblance is there.

…and Practice

It doesn't take much of a stretch to see how systems thinking affects software systems. In fact, in 1975, Gerald Weinberg (of “egoless programming” fame, among other things) published a book on systems thinking, a quarter-century ahead of the business types' “discovery” of the concept. Unfortunately for us, his treatment of systems thinking wasn't quite what Forrester was talking about. Weinberg was talking more about “the general systems view of science” (Chapter 1), and as a result, what would seem to be a great example of software engineering having the jump on the rest of the world (e.g., management) in fact isn't. (To be fair, some of the basic principles Weinberg uses are applicable, but the deeper you get into the book, the more apparent it becomes that he's systematically trying to describe different sciences, not examine the core of systems thinking itself.)

Fortunately, it doesn't take long to find examples of systems thinking in software. Almost every software developer, regardless of skill level or years of experience, has a story that fits the bill. Sometimes you write it off as “yak shaving” - any activity which, on its own, appears useless, but is a necessary step toward some larger task. Consider the ubiquitous upgrade: “I'll just upgrade my version of the compiler; it shouldn't take more than fifteen minutes.” Four days later, you've upgraded the compiler, then the libraries the compiler's generated code depends on, then the libraries those libraries use (plus the new libraries that are suddenly dependencies), and finally the operating system itself, because its version is one or two steps too far out-of-date for those new dependencies.

In fact, some software systems have all but given up on this particular slice of systems thinking, preferring to avoid version hell by simply making redundant copies of anything that might ever be a dependency, everywhere it might be needed. The Node Package Manager (npm), for example, simply copies every dependency needed by a library into a subdirectory, and then recursively applies the same algorithm to every dependency needed by that library. No attempt to reconcile the various libraries and versions - even if they're exactly the same - takes place. After all, disk space is cheap.
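A sketch of that approach makes the trade-off obvious. What follows is a simplified, hypothetical illustration - not npm's actual implementation - with invented package data:

// A toy model of nested dependency copying: every package gets its own
// private copy of its entire dependency tree, duplicates and all.
import * as fs from "fs";
import * as path from "path";

interface Pkg {
  name: string;
  dependencies: Pkg[];
}

function install(pkg: Pkg, dir: string): void {
  for (const dep of pkg.dependencies) {
    // Copy the dependency into this package's own node_modules directory...
    const depDir = path.join(dir, "node_modules", dep.name);
    fs.mkdirSync(depDir, { recursive: true });
    // (A real installer would copy the package's files here.)
    // ...then recurse: the dependency's own dependencies get nested copies too.
    install(dep, depDir);
  }
}

// Two libraries that both depend on the same utility package still get
// two separate, identical copies - no reconciliation attempted.
const util: Pkg = { name: "left-pad", dependencies: [] };
const app: Pkg = {
  name: "app",
  dependencies: [
    { name: "lib-a", dependencies: [util] },
    { name: "lib-b", dependencies: [util] },
  ],
};
install(app, ".");

Run it, and left-pad shows up twice: once under lib-a's node_modules, once under lib-b's. Multiply that by hundreds of packages and a few conflicting versions and the duplication is enormous - but every package sees exactly the versions it asked for, which is precisely the point.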

Unfortunately, not all solutions are so easily obtained. Management needs to wrestle with similar kinds of decisions every day: if I have an underperforming team, I need to do something. But what? Removing an individual from the team might remove somebody who's talking a lot and potentially distracting the team - but that might also be the person who is (accidentally, perhaps) serving as the communication hub for the entire team, so that everyone has a clear picture of what the team is doing. Removing the individual who doesn't talk much might help with the team's communication, but then you're removing the one to whom everybody turns when it comes time for a thoughtful discussion or inspection of the problems. If you remove the one who's always on the Web, you're removing the one who's keeping the team abreast of what's going on “out there,” and thus keeping them informed about new ideas, patterns, and approaches. And of course, if you do nothing, you (presumably) continue down your current course - already identified as underperforming in the first place - which either lets the problem solve itself or communicates to the team that you, as management, just don't care.

The same rings true with adopting new technology. Yes, NodeJS is the current “cool kid” of the Web world, but what does adopting it into my data center (or my cloud!) do to the rest of the systems? Will I be able to find developers to work on it in the years to come? Numerous companies “bet the farm” on Ruby on Rails back during the Rails heyday, and then suddenly found they couldn't find anybody to work on their code - not because Rails was a niche technology, but because it became the hot property of the startup community, and those larger firms who bet on Rails were suddenly faced with a shortage of Rails developers. (One company I know of - a former startup that has since become an entrenched and established brand in the travel industry - ended up buying a Rails consultancy wholesale, firing all of the consultancy's clients, and turning the staff loose on its internal Rails projects.)

And if you think software is any simpler at the enterprise level, let's talk about how to improve a software system for a while. Faster computers? More computers? Cache? No cache? Java? .NET? Native code? Relational database? Non-relational database? The answers are numerous - but scratch below the surface, and you'll find that most answers are entirely contextual, and for that reason almost always entirely unrepeatable in a different context.

…and Summary

Systems thinking has always been with us; in 1758, Benjamin Franklin included a version of this old proverbial rhyme in his Poor Richard's Almanack:

For want of a nail the shoe was lost,
For want of a shoe the horse was lost,
and for want of a horse the rider was lost,
being overtaken and slain by the enemy,
all for the want of a horse-shoe nail.

You may be surprised that you're already reading the end of this column - I've cited a number of issues, but offered no solutions. That's not a mistake: I don't have any simple solutions to offer, because systems thinking is a mode of thinking, and modes of thinking don't reduce to recipes. Examining a situation and learning to see the different circles and loops of influence isn't easy, and it takes practice. It takes time, patience, and a willingness to do nothing but watch as situations play themselves out. In a world where action is preferred to inaction, and opinions - no matter how well or poorly thought out - are seen as superior to silence, investing the time to see the systems can often feel like a waste. It's a journey, and one with which I find myself constantly wrestling.