A foundation of good risk management is good decision-making. Many, if not most, accident investigations reveal safety failures caused by poor decision-making around risk. There are a variety of causes and underlying factors, but the most common underlying factor is simply poor thinking.
Many organisations have robust risk assessment and control processes; however, many fail to account for the fact that humans are not good at clear thinking. Many are under the illusion that safety decisions are reached through rational, balanced reasoning, but in fact cognitive bias distorts our thinking, making us vulnerable to mistakes and poor decisions.
This blog examines a range of cognitive biases with reference to their effect on risk management, and suggests some ‘de-biasing’ techniques to help improve safety decision-making.
The core problem with the current risk management process is that it fails to consider the cognitive and emotional factors that influence the decisions of the individuals involved. Whilst most people think they hold an objective viewpoint and make rational decisions, the opposite is often true. Various studies confirm that people regularly make subjective, or biased, judgements, even when they believe they are being objective.
Cognitive biases are departures from purely rational thought. They are systematic errors in thinking that prevent us from being entirely rational. One common cause is complexity: the human mind is not equipped to deal with the sheer number of factors, and the relationships between them, present in many risk situations in the modern, technologically complex workplace. We commonly use heuristics (rules of thumb) to help assess complex risk situations.
Collecting Risk Information
Many of the biases likely to occur during risk information collection cause individuals to ignore relevant information and/or give too much weight to other information. They can lead to incorrect conclusions and to perceiving patterns where none exist. Biases common during information collection are outlined below:
- Information Bias – refers to the tendency to collect more and more information in the belief that more information enables better decisions, and that a lack of information opens people up to blame should something happen. In practice, more information often leads to cognitive overload, where critical information is omitted or not assessed in depth.
- Availability Bias – occurs where judgements about the likelihood of an event are based on what is most easily recalled from memory. Individuals make associations with emotive or recent memories that can skew decisions (e.g. after a worker falls from a ladder, individuals are likely to judge that falls from ladders are more likely than they actually are).
- Pattern Recognition – is the tendency of the human mind to try to make sense of the world. We attempt to establish patterns that do not necessarily exist.