Find out more about Biases
Representativeness bias is the tendency to look for patterns in situations, people, or objects to help us make decisions. Our brains try to eliminate uncertainty by finding observed or perceived similarities in our past experiences to inform the present. This mental shortcut is often helpful, but overreliance can lead to inaccuracies and errors in judgment.
In everyday life, when faced with a new experience, representativeness bias often leads us to make assumptions about reality and possible outcomes based only on past experiences, while neglecting mathematical probabilities. For example, if we see someone wearing business attire, carrying a briefcase, and texting on a cellphone, and are asked to guess whether they work for a corporation or in a library, research has shown most people will say they work for a corporation. The person simply matches a mental image we hold of what people who work in corporations look like. However, the actual probability of their profession depends on base rates – how many people hold each kind of job – not on such subjective perceptions. Representativeness bias is a powerful mental shortcut when we are forced to make decisions under high levels of uncertainty, but overreliance on it can lead to problems.
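The probability question in the example above is really a base-rate problem, which Bayes' rule makes explicit. A minimal sketch, with hypothetical numbers (none of these figures come from the text):

```python
# Bayes' rule for the "corporate worker or librarian?" question.
# All numbers below are illustrative assumptions, not real statistics.

def p_corporate(base_rate, p_attire_corp, p_attire_lib):
    """P(corporate | business attire) in a two-hypothesis world."""
    num = base_rate * p_attire_corp
    den = num + (1 - base_rate) * p_attire_lib
    return num / den

# If business attire were equally common in both jobs, the clue is
# uninformative and the answer is simply the base rate of the two jobs:
print(round(p_corporate(base_rate=0.95, p_attire_corp=0.3, p_attire_lib=0.3), 2))  # 0.95

# Even attire that is 3x more typical of librarians barely dents a
# strong base rate in favor of corporate workers:
print(round(p_corporate(base_rate=0.95, p_attire_corp=0.2, p_attire_lib=0.6), 2))  # 0.86
```

The point mirrors the text: how well someone matches a stereotype (the likelihoods) only shifts the answer so far, while the base rates do most of the work – and neglecting them is exactly what representativeness bias invites.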
Unsurprisingly, stereotyping is a key consequence of representativeness bias. It results from our expectation that someone will behave in a certain way, or that a situation will unfold in a certain manner, based on our own past experiences – even when we don’t know that specific person or situation. Not all stereotypes are born from direct experience; for example, we hold perceptions about societal norms, about age, gender, or race, and even about organizations.
During incident investigations we need to be mindful not to stereotype. For example, if a person on site fits a particular mental image we have built from past experiences, anecdotal examples, or societal trends, we need to understand that this mental image is not objective evidence. This is not to say we would be wrong each and every time, but our personal knowledge is never sufficient data to replace actual probabilities. We should not assume workers, managers, or even leaders share traits or characteristics because of their trade, background, age, or level of experience in the industry. For example, not all older workers are ‘stuck in their ways’ and resist changing their work methods in response to new safety protocols or interventions. We should try to treat everyone equally during the investigation process. It can be very hard to overcome stereotyping.
Many stereotypes are entrenched in society and constantly reinforced. Advertisements often amplify stereotypes, for example by positioning women as homemakers and men as breadwinners in a family. Stereotypes are often introduced when we’re young, making it harder to change our thinking when we become adults, although it’s not impossible. A good approach is to always be mindful that you could be stereotyping, and take that as your starting point. A robust technique to challenge stereotyping is to actively look for evidence that goes against your initial assumption or mental image. The investigation should reveal whether your initial assumption was accurate or not. This approach mirrors how good scientific research is conducted: a hypothesis is only accepted if you cannot find any evidence to the contrary.
Although it is easier said than done, we must try to avoid generalizing – if you can generalize about someone, you are likely stereotyping them. Try to actively listen to the individual in front of you and what they are saying, without also thinking, ‘oh, they’re saying this because they’re X, Y, or Z’. Investigation interviews are solely about gathering good information. Analysis of the how, what, and why should be done after all the interviews are completed, to generate learning.
While our brains don’t like uncertainty and want to make sense of everything at once, such tendencies can lead to overreliance on shortcuts like representativeness bias. Remind yourself that each incident is a new, independent event: although there can be patterns in the data from incidents across a company or industry, each incident is unique in reality and should be examined from a neutral starting point. You may end up with the same conclusions as your initial thoughts, but you will have arrived there honestly, without missing any contributory aspects or further information that could help the organization learn as much as it can from the incident.
Anchoring bias is the tendency to over-rely on or fixate on one piece of information above all others. It is so called because this bias makes people ‘anchor’ to certain things, and then compare all other information back to the anchor.
The anchor can be the first thing they heard about within a situation, which then influences subsequent lines of questioning and enquiry, or the very last thing (known as end-anchoring), which can result in the neglect of all the information that came before. People can also anchor to something very familiar or highly unusual, and then use these anchors as the comparison point for all subsequent evaluations.
In everyday life, we often become anchored to things when we make decisions. A common example is when an item is priced at $9.99. In such cases we anchor to the $9, which makes the item seem a whole dollar cheaper than if it was priced at $10.00. Studies have found that a significantly larger number of people will buy something priced at $9.99 than $10.00, due to anchoring bias.
Anchoring bias also becomes influential when we are discussing things with others, and we often anchor to the first piece of information we receive. First and last things we see are typically easier to remember, and it’s that ease of recall that gets conflated in our minds with the actual value of the information.
Within incident investigation, if you are initially told when you get to the jobsite that the project completion date is fast approaching and the workers are all under pressure to complete the work, this information can easily become an anchor, not least because ‘production pressures’ are a common causal factor in incidents. This anchor can interfere with the investigation process by influencing and directing the line of questioning, forcing comparisons between pieces of evidence whose value cannot and should not be compared. The anchor can also be used to explain or justify any phenomenon, which inflates its actual importance in the situation. Using information to inform your line of questioning, or pursuing comparisons, is not itself the problem. It becomes a problem when the information becomes an anchor – treated as an absolute truth and as a causal factor without sufficient evidence. Anchoring bias has the potential to limit information collection and thus organizational learning.
Like all biases, anchoring bias can be very difficult to overcome, even when you are aware of it. Experience can reduce the effect of anchoring bias, but even with considerable experience you can still be susceptible to its effects. Systematic approaches to investigation can help – for example, having checklists to follow or a list of topics to explore in turn, to help ensure you give each new piece of information equal attention. Actively considering the opposite in any situation can also help, as this pivots your thinking away from the anchor, which may lead you to ask very different questions and thus reveal new insights into the situation.
Confirmation bias is the tendency to look for and focus on information that supports our existing beliefs or preconceptions.
In everyday life, this can be as simple as believing the referee made a bad decision when they called a foul against your team – but when they called the same foul against the opposition team in the same match, it was a great decision! Another example often happens during election season, when we tend to seek out information that confirms our existing beliefs about the candidates. When we go on the internet and search “why is person X wrong” or “why does Y soda taste so good”, guess what? We will find information – some of it perhaps even credible – supporting the presumed answer to our question. This bias also consciously and subconsciously enables us to ignore or dismiss information that conflicts with our beliefs. All the while, we are comforted by the fact that we have done our research and found evidence to support our beliefs.
Within incident investigation, confirmation bias can lead us to seek out information that supports our beliefs about an incident – which themselves may have emerged as a result of other biases (check out representativeness bias and anchoring bias to see how they can influence our preconceptions about a situation). In an interview, confirmation bias can lead us to subconsciously ask questions that will generate information that we know will add to the evidence that supports the conclusions we have already made. For example, if you feel production pressures are a causal factor, you may focus on questions exploring stress, work pressures and deadlines – the answers to which will add to a more detailed picture of production pressures on the jobsite, and thus support your theory simply by adding to the body of information that backs it up. However, this is likely to also be at the neglect of other lines of questioning that may also be highly relevant.
Confirmation bias adds value and credibility to our pre-existing beliefs and thinking by encouraging a self-supporting loop of information. We think X happened, so we seek out information that X did happen – but in incident investigations this can be very limiting, and even damaging for those involved. This bias can be overcome by actively challenging yourself to seek out information that contradicts your thinking, and by being open to different ideas and information. Even if you were right all along, at least you have been thorough in your investigation and explored all avenues of enquiry.
Recording in your final report how you rigorously investigated each relevant conclusion also allows for transparent communication. Another influence that can enhance confirmation bias is the need to be right about a situation. To overcome this, try to get comfortable with being wrong, and don’t listen to your ego. Remember, all investigations should be as rigorous as possible. The goal is to collect all the information surrounding an incident to enable robust analysis and organizational learning to occur. Being right from the start and being able to say ‘I told you so’ should not be part of the process at all!
Courtesy bias results in us telling people what we think they want to hear. During incident interviews this is an important bias to be aware of, as the interviewee may well be upset or nervous, which can enhance its influence.
In everyday life, courtesy bias can prevent us from saying what we actually think, because we don’t want to be negative or offend someone. For example, if a meal in a restaurant doesn’t taste good, courtesy bias can mean we still insist everything is fine, to avoid getting the cook or server into any trouble.
During incident investigations courtesy bias can be very problematic. If an interviewee doesn’t want to say anything negative, or fears repercussions or blame from an incident (whether or not they had any accountability in the situation), courtesy bias can lead them to limit how honestly they share information. The interviewee may emphasize some things over others to avoid creating upset, being negative about the situation, or getting their co-workers or supervisors into trouble. Courtesy bias can significantly limit the collection of valid information, as it sugar-coats the truth and steers lines of questioning away from sensitive topics.
A further complication can occur when the interviewer causes this bias in the interviewee, through leading questions grounded in the interviewer’s own biases – check out confirmation bias for how this can easily happen. The result can be a ‘theatrical bias performance’ instead of an interview: the interviewer is biased in their questioning and the interviewee is biased in their responses.
For example, if the interviewer fixates on production pressures, then the interviewee may emphasize those issues in their account – and avoid mentioning any management or resource issues that would contradict the interviewer’s line of questioning.
To overcome courtesy bias, interviewers should first be aware of their own biases and then carefully consider how they are asking their questions. A neutral and open discussion is essential, because if there is any fear or anxiety on the part of the interviewee, courtesy bias can easily result. Building rapport can help overcome this bias, encouraging the interviewee to be honest and truthful about the situation to the extent they can be, and heading off any additional fabrications the interviewee may add simply to please the interviewer.
Fundamental Attribution Error
Fundamental Attribution Error (sometimes called FAE) is a bias through which we place more emphasis and value on personality than on situational or environmental factors when evaluating people’s behavior or what they say.
In everyday life, FAE means we tend to think people are behaving in a certain way because of their personality traits. For example, if someone is rushing down the street bumping into people without apologizing, we may think they’re very rude and arrogant. In fact, they may have just heard their child is sick at home and need to rush back to take care of them. Another example is when we message someone and don’t get a message back straight away. We can think this is because they don’t care about us, or are punishing us for something we don’t know we’ve done. In fact, they may simply be busy, or driving, or on a plane with their phone in flight mode, and so couldn’t get back to us even if they wanted to.
In incident investigations this can also lead us to make assumptions about why people behaved as they did. For example, we could assume a worker took a shortcut because they’re lazy or simply lack attention to detail as a personality trait, when in fact they were rushing due to an upcoming deadline on the jobsite.
Being aware of FAE can help us overcome it. Try to consider why someone might be doing what they’re doing from all perspectives and actively include potential situational and environmental reasons in that thinking. Remind yourself that people do things for a wide variety of reasons, and although it’s often easier to blame them for inherent failings of personality, the reality is often much more complicated.
Conservatism in Belief Revision
Conservatism in belief revision is a bias that means we often fail to revise or change our thinking when faced with new evidence or information. It can limit how much we value new information that challenges our currently held thoughts. It is linked to anchoring bias, which can initially fix a piece of information in our minds; conservatism bias then takes over, meaning we can struggle to change our minds even when all the information and evidence indicates we should do so.
In everyday life, conservatism bias can mean we don’t question our existing beliefs and thinking too much. Trusting our beliefs is not necessarily a bad thing; it only becomes problematic if we refuse to change them when we rationally should – for example, when we are presented with new information.
In incident investigations, conservatism in belief revision can cause problems when interviewers struggle to change their thinking in the light of new information. This can result in sticking to redundant lines of questioning or failing to explore other avenues during the interview. For example, if we are certain that production pressures have led to the incident, conservatism in belief revision can mean we stick to this premise even when faced with evidence of poor training, a lack of supervision and a team of new workers struggling with a lack of resources. By sticking to our initial idea, this interview could result in a neglect of these other contributory factors and thus limit the potential organizational learning from the incident.
To overcome conservatism in belief revision, we should firstly acknowledge that it can and does occur. Being aware that you may be valuing one piece of information over another due to other biases (check out confirmation bias and representativeness bias, which can both lead to this situation) or to our own fundamental beliefs can help reduce its impact. Actively pause and reflect on any new information as it emerges from the investigative process, and remember that you can change your mind – it’s not about being right or being able to say, ‘I told you so.’ A mindful approach can help you evaluate new information more thoughtfully, rather than simply rejecting it outright, and can lead to enhanced organizational learning once the investigative process is complete.
Availability bias leads us to overestimate how common things are when they have greater ‘availability’ within our experiences and memories, with little or no regard for actual likelihood or significance. We tend to value information we can recall more easily: when faced with uncertainty, the easier it is to recall a piece of information, the more we value it and inflate its importance.
In everyday life, this is why, when you hear about a plane crash that happened yesterday, you may worry more about getting on a plane for a trip today – even though, statistically, flying is still much safer than driving. Recent events can feed this bias, as can events that are emotionally charged (for example, a serious injury or fatality (SIF) incident) or highly unusual.
During investigations, this can mean we jump to the wrong conclusions about why something happened as we consider the new incident. If we can easily recall something from a previous investigation – for example, that a worker had used a tool in a certain way that resulted in an incident – we tend to think it must be important simply because we suddenly remembered it, whether or not it has much bearing on the current investigation. Availability bias can also lead us to judge that information as equally if not more important than other information or possibilities that cannot be recalled so easily.
Just because something has stuck with us and springs to mind during an investigation, it doesn’t necessarily mean it’s relevant or even important to the current situation. To overcome availability bias, we should treat any information we can easily recall with the same careful consideration and evaluation as if it came from someone else. Working with a diverse team can also help, as everyone has different stores of information and thus their availability bias will generate different results. Working with others can also help you check each other and ensure information is being incorporated into the investigation appropriately.