To a man with a hammer, everything looks like a nail.
~ Mark Twain
If you only have a hammer, you tend to see every problem as a nail.
~ Abraham Maslow
Some quotes become lore. Some quotes become cliché.
These quotes are a warning we all ignore, and the tendency they describe may well be another cognitive bias.
We can call it Nail Bias.
We have a tool that we are fond of, or worse yet hate and have invested tremendous money in, and we are anxious to find a use for it. Or, even worse yet, we read an article about some management consultant who has invented an awesome hammer, and we declare ourselves to be nails.
Whack us.
We’ve discussed Expectation Bias, where we interpret information in the light most supportive to our original assumptions. This is also called “Experimenter’s Bias” because scientists (even us amateur scientists) tend to set up experiments that will provide results favorable to their hypothesis.
I have had clients become restless when I take them down avenues of discussion that obviously depart from why they hired me. They want kanban. They want personal kanban. Why am I not talking about KANBAN?!
The answer is that kanban is not the solution for every problem, and overusing the tool will greatly lower its effectiveness. My clients had an expectation that they were going to get a box of kanban upon my arrival, we’d open it, an awesome workflow and team dynamic visualization would float out, and everyone’s work would be done.
That was their hypothesis, and when the conclusion started to differ, it caused cognitive dissonance – the slow, painful realization that the world is out of sync with your world-view. Nail Bias begins to become clear. We had a nail, we set up a kanban, but, rather than being a hammer, it was a flashlight. We could see more, and now knew those weren’t nails at all, but screws, rivets, thumb tacks, and a wide range of other unexpected things.
Suddenly, Jim is talking about all these other things. Not nails.
We need the limits of our bounded rationality to become clearer. Bounded rationality recognizes that we only have so much time to recognize a problem, process it against our histories, learn more, hypothesize a solution, and act.
We will always be acting on limited information.
This is inevitable. No one thinks (or no one should think, anyway) that they know everything.
The problem is that since we don’t know what we don’t know, we can’t act on not knowing it. We must assume at some point that we have enough information to act. That assumption is driven by several things including deadlines (a major cause of quality problems), politics, and fear.
This leads us to simplify our choices. We have to create a “short list” of options to choose from because the danger of over-analysis is right around the corner. At this point we walk a fine line between honestly limiting our choices so as to reach a coherent and rapid conclusion, and satisficing.
Satisficing is yet another element in this cognitive soup that makes things easier, more coherent, and less precise. Combining satisfy with suffice was a wise move by Herbert Simon when he came up with the concept in the 1950s. We, as decision makers, must run complex problems through both social and practical filters. We must find rapid solutions that give people what they want, within reason.
But the target at the center of the time-to-market, satisfaction, and practicality coordinates is hard to hit – and the bad news is we’re constantly aiming for it with many different concurrent decisions. We are all suffering from decision fatigue – the phenomenon where the more decisions we are presented with, the higher the mental and physical strain they cause. One only needs to watch the rapid aging of any US President after election to see that.
Today, we have many things that vie for our attention, our time, and our decision-making capabilities. Many of us have so many interruptions (each of which involves a decision whether or not to allow the interruption) that non-ADHD people actually begin to exhibit signs of ADHD.* We can no longer close our office door and concentrate because digital conversations do not respect masonry – they walk right in and chime!
When these factors combine with other biases like the availability heuristic (which we’ve discussed) and system justification (where people tend to justify existing systems rather than try to fix things that are broken) … we end up with Nail Bias. In an effort to quickly reach a politically satisfactory, rapid, and practical conclusion, we fall back on what is both recent in our memory and compatible with existing systems.
In fact, that may well be the definition of the proverbial “box” we are trying to think outside of.
Nail photo by Bob MacInnes (ironically showing several types of nails…)
1976 Picture of Jimmy Carter via Wikia.
*Source – The Scientific American Day in the Life of Your Brain, Horstman, 2009