In previous blogs, we've discussed how our intuitive system shapes our decision making. Cognitive biases such as the overconfidence effect, the ostrich effect, the availability heuristic, social proof, and many other heuristics influence our decisions and our personal safety. Several blog readers have asked us how to avoid the negative safety outcomes these mental "rules of thumb" can produce.
Over the past several decades, safety professionals have done a tremendous job engineering systems that limit the influence of cognitive biases on decision making. Job safety analyses (JSAs) and lockout/tagout (LOTO) procedures are two examples. Proper training and SOP checklists protect employees from making poor decisions while executing common tasks. It is the uncommon tasks, however, that pose elevated risk.
Our decision making is most vulnerable to cognitive biases when confronted with complex or unique problems.
We define complex problems as those with multiple possible solutions and uncertain outcomes. Our readers' question is most relevant to exactly these situations. When faced with a complex problem and limited time, a person will take mental shortcuts. It's natural. So how do we interrupt these thinking patterns?
Consider the research conducted over the past 20 years on business leaders' strategic decision making. Cognitive psychologists and decision-making experts have developed methods for reducing bias in big, complex decisions. The planning fallacy is a well-known phenomenon that plagues complex projects and leads to poor organizational decisions. Organizations unknowingly encourage overly optimistic thinking in many fascinating ways, and an optimistic bias can lead to some very poor business decisions and dreaded phrases like "delayed rollout" and "over budget."
Decision-making experts advise business leaders to ask themselves two questions before committing to action:
What future events could make this decision go wrong?
What would happen if it did?
In a safety context, overly optimistic decision making can have deadly consequences. The infamous BP Deepwater Horizon catastrophe is an instructive case study. The origin story of BP's founder beating all odds, together with the company's miraculous comeback in the 1990s, contributed to a culture of hyper-optimistic decision making. Time and again, decision makers focused on the upside of every decision and failed to consider potential negative outcomes. They did not ask themselves these two questions.
Asking these two questions is a simple, practical way to reduce your personal exposure to risk, particularly when the "safest choice" is not obvious. By asking them before making a decision or performing work that involves hazards, safety leaders and employees in the field put themselves in a position to make better, safer choices. Once it becomes a habit, it is a quick and easy way to continuously push back against the cognitive biases that come with being human. So, when you or a co-worker face a complex or unique problem that involves safety risks, ask these two questions before committing to any action.