My colleague and I were four days into a field exercise for an environmental impact statement (EIS). An EIS is a document every public project must complete to show what the short- and long-term impacts would be if the project were completed. They tend to be large and expensive. This particular project was very large, and its impacts were many.
Given that the project was underway and had a firm deadline, no one was willing to slow it down or change its scope. Our task was supposed to take the two of us about a week to complete.
Four days in, we found that the data we had been given was poor, that the people who could supply better data were unavailable, and that massaging the new data we couldn’t even get would take weeks.
When we presented the managers of the project with this information, we were told, “Don’t slow down, just get it done.”
“Get it done with what?”

“The data you already have.”
We were then expressly told not to contact anyone else or to extend the task.
Back then, we were completely confused as to why anyone would be so stupid. How could these managers knowingly ignore the fact that we were using faulty information that would result in a useless report? These weren’t Dilbert characters; they were real live human beings.
Over the years I would watch other seemingly smart people drive their high-tech startups into the ground by insisting that an initial plan or idea go forward despite ample evidence that all that lay ahead was ruin.
What cognitive biases kicked in to destroy these endeavors?
Irrational Escalation – This is in-for-a-penny, in-for-a-pound-ism, also known as escalation of commitment. Once people start on a project, they find it very hard to stop – even if the project is obviously doomed or the return will not cover future costs. The managers felt that the project was already underway and that even using bad data was better than delaying the project to ask for new data. The bad data (not surprisingly) later resulted in obviously flawed analysis – causing even more expensive and stressful work later in the project.
Expectation Bias – This is a world-view bias. If we believe something, we pick data points that support that expectation. Further, we expect that most data will support our expectations. We had an opportunity to create a research regimen that would yield scientific results. The managers assumed this was all a formality and that any data (good or bad) would support the original plan. This was likely underscored by “experience” that showed that most data did indeed support the original plans (which were likely suffering from expectation bias).
Planning Fallacy – A central issue here is simply that we didn’t have enough time to finish our task. The expectation bias inherent in the people who wrote the initial scope gave us only a week. This led to them, and us, falling prey to the Planning Fallacy: human beings have a strong tendency to underestimate task completion times. On the ground, we saw this playing out. From the managers’ desk, the plan was correct; we were at fault.
Status Quo Bias – This bias says that once we start something, we tend to want it to continue. This is why in Collaborative Management we seek to build an explicit Culture of Continuous Improvement for teams and companies. Making change and improvement the center of the culture makes change the status quo. Needless to say, this wasn’t the case with my project back then.
The upshot here is that biases can gang up on us. Our enthusiasm, belief in ourselves, and assumption that things will stay pretty much the same can combine to create seemingly logical decisions that turn deadly. My managers no doubt reasoned that the project plan had taken well-informed professionals over a year to write; there was very little cause for concern that a small part of the project might derail the whole thing.
In the end, our re-work was hardly noticed, because other parts of the project ran even later, at much higher cost. One can only guess that the same forces were at work.