A foundation of good risk management is good decision making. Many, if not all, accident investigations reveal safety failures caused by poor decision making around risk. There are a variety of causes and underlying factors, but the most common underlying factor is simply poor thinking.
Many organisations have robust risk assessment and control processes; however, these processes often fail to account for the fact that humans are not good at clear thinking. We operate under the illusion that safety decisions are reached through rational, balanced reasoning, but in fact cognitive bias distorts our thinking, making us vulnerable to mistakes and poor decisions.
This blog examines a range of cognitive biases with reference to their effect on risk management and suggests some ‘de-biasing’ techniques to help improve safety decision making.
The core problem with the current risk management process is that it fails to consider the cognitive and emotive influences on the individuals involved, which shape the decisions they make. Whilst most people think they have an objective viewpoint and make rational decisions, the opposite is true. Various studies confirm that people regularly make subjective, or biased, judgements, even when they think they are being objective.
Cognitive biases are departures from purely rational thought: systematic errors in thinking that prevent us from being entirely rational. One common cause is complexity. The human mind is not equipped to deal with the sheer number of factors, and the relationships between them, in many risk situations found in the modern, technologically complex workplace. We commonly use heuristics (rules of thumb) to help assess complex risk situations.
Collecting Risk Information
Many of the biases that occur during risk information collection cause individuals to ignore relevant information and/or give too much weight to other information. They can lead to incorrect conclusions and to perceiving patterns where none exist. Biases that arise during information collection are outlined below:
- Information Bias – refers to the tendency to collect more and more information in the belief that more information enables better decisions, and that a lack of information opens people up to blame should something go wrong. In practice, more information often leads to cognitive overload, where critical information is overlooked or not assessed in depth.
- Availability Bias – occurs where judgements about the likelihood of an event are based on what is most easily recalled from memory. Individuals make associations with emotive or recent memories that can affect decisions (e.g. after a worker falls from a ladder, people are likely to judge falls from ladders as more probable than they really are).
- Pattern Recognition – is the tendency of the human mind to make sense of the world. We try to establish patterns that do not necessarily exist.
Analysing Information
Once collected, data needs analysis. There are many biases that cause us to neglect, modify and/or distort collected information without realising it:
- Confirmation Bias – is the tendency to seek out information that confirms existing beliefs and expectations. People try to confirm their beliefs rather than attempt to disprove them.
- My-Side Bias – is the assumption that others share the same thoughts, beliefs, values or positions. People tend to side with their own position, despite trying to be rational and balanced.
- Over-Confidence – is a predictable cognitive characteristic that often influences decisions. People can be demonstrably over-confident without being aware of the extent to which it influences their decisions.
Deciding
Decision making is, by definition, the selection of the best option from a range of alternatives. Two biases that influence decision making are:
- Groupthink – is the tendency for individuals to allow their thinking to be influenced by the members of a group to which they belong. Identification with the group can lead to a tendency to suppress dissenting opinions and to limit challenges to information presented to the group.
- Framing Bias – derives from how a question or problem is phrased. Very different responses can result from the way a question or information requirement is worded. This can have the effect of reinforcing a particular course of action.
Acting
Historically, many disasters have involved a strong component of persisting with a course of action when it was obvious an accident was about to unfold. Cognitive biases affecting actions are:
- Sunk Cost Effect – is manifested as a greater tendency to continue a course of action once an investment of money, effort, time or other resources has been made. Even if a new course of action would result in a better outcome, there is a desire to continue with the existing course so that the effort and resources already expended have not been a ‘waste’.
- Illusion of Control – there remains a strong tendency to believe that all aspects of the safety environment can be controlled. This leads to an under-estimation of risks and a failure to properly cater for “black swan” events – events that come as a surprise and have a major effect.
Mitigating Bias
How do we ‘de-bias’ risk assessment and control? The answer is not easy or straightforward. Studies have shown that even individuals who are made aware of a bias typically revert to biased behaviour. At the very least, awareness gives people the understanding to reflect more deeply on issues and to be critical of their own thought processes.
Four strategies are generally suggested, all of which are already in use in most workplaces to varying degrees:
- Understanding and recognising bias
- Promoting (and rewarding) lateral and creative thinking about risk and safety
- Ensuring critical thinking
- Enhancing diversity
Conclusion
Awareness of cognitive biases is a key step in countering their impact on safety and risk management. Awareness needs to be backed up by checks and balances to ensure that the effects of biases are limited and not permitted to impede the risk management process undetected. Reducing the negative effects of cognitive bias is an opportunity to improve risk management decision making and avoid some of the mistakes of the past.
If you would like to know more, or would like our assistance in the areas mentioned, check out our risk assessment group training or visit us at www.intrinsicsafety.com.au. Alternatively, call us on 1300 990 336 or email us at [email protected]
Brendan Day
Chief Executive Officer
Brendan Day, based in Sydney, is a WHS and Emergency Management expert with a rich background in emergency services, including significant experience as a military firefighter, emergency responder, and emergency response manager. His career spans both public and private sector roles, where he has developed and implemented comprehensive WHS management and Emergency Management systems. As the CEO and Principal Trainer at Intrinsic Safety, Brendan combines his military discipline with modern safety practices, offering advanced training in workplace health, fire safety, confined spaces, height safety and first aid. His qualifications, including a Diploma of Work Health and Safety, reflect his commitment to safety excellence and continuous improvement in emergency response management and safety practices.