A good story, especially one that strikes home, can be particularly disarming. We are drawn into it. We feel its inherent truth, a truth that may or may not be real. But because it feels real, we follow it quickly and with abandon.
We love a good story
Systems Theory is Awesome Because it Gives Us Great Stories
In The Silence of the Lambs, Hannibal Lecter controlled conversations by using the power of story and insight. When we combine a good narrative with an innate fear and the glimmer of salvation, we can win arguments, engage teams, and even win elections.
Stories become the groundwork for positive change. Without stories, we would have no workflow diagrams, no value stream mapping, no Personal Kanban, no relevance. Stories are the most accepted, highest-value business communication tool there is.
Systems thinking builds powerful process and change on accepted stories about how we work, how we’d like to work, and the challenges of getting from one to the other. It gives us a forum and formats to discuss them and it comes with tools to help us find the subtext and backstories that make the stories work.
Systems Theory is a Trap Because it Can Give Us Easy Stories
In The Silence of the Lambs, Hannibal Lecter controlled conversations by using the power of story and insight. When we combine a good narrative with an innate fear and the glimmer of salvation, we can win arguments, engage teams, and even win elections. The problem is, the story doesn’t always need to be relevant, just plausible.
An example of this might be Jim Collins and his team of researchers, who wrote the bestseller “Good to Great”. Good to Great is, itself, an excellent book that provides food for thought about a willingness to radically change in the face of adversity and how that might lead to corporate success. We highly recommend the book for its examples of companies that were willing to use their might and more than a little ingenuity to survive certain economic destruction.
However, Good to Great spawned a generation of business owners who took it to be a recipe book, a virtual guarantee of business success. They didn’t quite understand the nuances that lay underneath the stories. The stories were engaging and plausible; therefore, readers assumed, they must contain repeatable (copyable) wisdom.
The implication was that if you acted like the companies described in the book, you would become “Great”. What people neglected to realize was that other companies tried the same innovative techniques as the companies in the book, and failed. The companies in the book have also had a hard time weathering the latest economic storm. So, while Collins’ intent was to provide readers with a systems thinking approach to building a great company, the book ends up being interpreted by readers as a how-to manual.
Daniel Kahneman, in his book “Thinking, Fast and Slow”, notes that we really like narratives. We love a good story. We therefore end up giving incredible weight to a good story, so much so that we believe the story is likely simply because it is believable. Good to Great is filled with excellent stories that, taken in the aggregate, seem to ensure that engaging in those behaviors will result in assured success. The stories are plausible, so plausible in fact as to convince millions.
Probable? Not so much.
Systems thinkers can easily get caught on the wrong side of the plausible / probable divide. We describe systems and then begin to act on them as if the systems are real. But we don’t know that for sure. Every system we devise is a hypothesis. It needs to be described, observed, proven, and then reproven over and over again. Why? Because not only does business context change, but the systems themselves are part of a nested series of other systems that directly or indirectly influence the system you’ve put in place.
By now, exasperated good systems thinkers are saying that this endless questioning is the very heart of systems thinking, and that there are safeguards in place to protect against an overly narrow or erroneous view. Unfortunately, in the real world our cognitive biases tend to team up to make this purist application of systems thinking little more than an ideal. If we are able to keep these biases in check, that’s wonderful. But it is unlikely. And as good systems thinkers, we should at the very least recognize that our own biases influence our systems.
My friend Simon Bennett has a story he likes to tell about working security for the military. In the 90s, they had put new computers in tanks, and these computers had hard drives. The hard drives were having some problems, so Simon and his crew pulled them from the tanks and sent them back to the manufacturer to see what could be done.
Mind you, these tanks take some serious jolts which are very hard on hard drives.
The hard drive manufacturer, when they received the hard drives, could not believe the amount of damage the drives had taken. They tried a few variations of hard drive design, but kept receiving horribly damaged drives back from Simon’s team.
Now, when you have a storage device that has ever touched classified information, it gets marked with a physical label that reads DIRTY. Dirty equipment needs to be handled and shipped in very specific ways.
After six months of utter confusion, Simon and his team found out that the drives they were sending back to the manufacturer were going through a team that specialized in the shipment of dirty equipment. When this team saw that the drives were going out to an unsecured location, they opened up the drives and took a chisel to the surface of each platter to make it unreadable. Then they put the hard drive back together, packaged it neatly in bubble wrap (so it wouldn’t get hurt), and shipped it on its way.
So, when the manufacturer received it, they’d open the drive and find a completely destroyed hard drive. Then they’d call Simon and his team and say “WOW! This thing is obliterated! What happened?” And Simon would say, “Well, it was just used for a few weeks in the tank.”
Simon and the manufacturer had a system in place that they thought was working for them. They were completely unaware that a second system was actively working against them.
While this is a rather extreme example (and certainly a plausible story), my point here is that when we get comfortable as systems thinkers with the systems we work in, we can overlook additional systems. Indeed, we may well have described a system so plausible as to be defensible for quite some time, until one day it inexplicably fails us.
Photo by Tonianne
Collaboration from Tonianne and Jabe Bloom