This study explored how availability bias, outcome bias, and hindsight bias can influence pilots’ perceptions of past events, which in turn may affect their perception of events yet to occur. The authors also explored the influence of ‘close calls’ (near hits).
Two separate study protocols were run, with 142 pilots in protocol 1 and 62 in protocol 2.
Providing background:
· Understanding how pilots make visual flight rules (VFR) decisions when approaching adverse weather is important to study, given the potential for fatal consequences. Accident rates for VFR flight into adverse weather are higher than for general aviation accidents overall
· Heuristics are an important part of decision making, particularly when exhaustive evaluation of all relevant information isn’t possible. Most of the time, heuristics work well and allow people to make effective judgements swiftly
· However, “there can be a price to pay for using heuristics” and “Occasionally, they can lead to systematic errors (biases), which essentially deviate from what would reasonably be considered rational or good judgement” (p1125). These unintended consequences are amplified in uncertain situations, with ambiguous information, and under limited time to reach a decision – such as under VFR conditions
· Prior research has identified well over 100 biases, with at least 40 relevant to clinical reasoning. Other research has linked VFR decision making under adverse weather to three biases: the anchoring effect, confirmation bias, and outcome bias
· Pilots were found to anchor and under-adjust on initial pieces of information, to favour confirmatory evidence when testing hypotheses (rather than attempting to disconfirm them), and to evaluate decisions based upon their outcomes rather than the quality of the decision at the time it was made
· Availability bias is the tendency for judgements to be influenced by how easily an event can be recalled. Availability and outcome biases both influence how people process information about what has already happened (thereby influencing subsequent decisions)
· Availability bias is useful in reducing cognitive load, as things that come to mind more easily do tend to occur more frequently. However, “salient or dramatic events often come to mind more easily than do less dramatic events, which may bias an individual judgement toward a small and possibly atypical sample of events” (p1126). Availability bias has been shown to influence physician judgements in clinical reasoning
· Outcome bias is the tendency for people to judge the quality of a decision according to whether its outcome was good or bad. People “often overlook bad decisions where the outcomes have been relatively good (e.g., near misses) and focus instead on decisions where the outcomes have been bad [e.g., fatal accidents] … However, this may mean that lessons that could be learned from near misses or close calls may be lost to all except those directly responsible” (p1126)
· Hindsight bias occurs when, following an event, people claim that it would have been all too easy to predict the event in advance; in reality, “research consistently shows that people have a tendency to overestimate their likelihood of predicting an event” (p1130)
· Hindsight bias has received significant research attention in the medical field, where it has been shown “to diminish the value of learning from medical cases, particularly those in which there may at the time have been considerable uncertainty as to the correct diagnoses” (p1130)
Results
Key findings included:
· Pilots were influenced by the outcome of a flight when judging decision quality, confirming the presence of outcome bias
· Pilots interpreted events that led to a close call very similarly to those that had positive outcomes “which may reinforce risky behaviour” (p1124)
· No evidence of availability bias was found; the authors describe this as a positive finding, though one to be interpreted cautiously
· After having read a flight report, particularly if it ended in a crash, pilots consistently overestimated their likelihood of predicting the actual outcome – supporting the identification of hindsight bias and, importantly, possibly limiting any opportunity for learning
· “These findings suggest that two of the three cognitive biases explored in this study could influence a pilot’s perception of past events in ways that may adversely affect how they make future decisions that in turn may affect flight safety” (p1124)
Discussing the findings, the authors note how pilots interpreted close calls similarly to positive outcomes, limiting what could be learned from close calls as instances of narrowly averted failure. This further supports the evidence “that the severity of the outcome has considerable bearing on how a decision is judged” (p1130).
Given the role that luck can play in event outcomes, these findings “indicate that close-call lessons may not be taken as seriously as they warrant” (p1130).
The lack of evidence for availability bias in this study was a positive finding, especially with regard to the close-call groups, as availability bias could “lull pilots into a false perception and therefore encourage riskier decision making” (p1130).
However, they note that caution should be applied here, and provide possible explanations for the lack of an observable availability bias effect.
1) Availability bias is more likely to occur when situations attract significant attention, e.g. a plane crash. In this study, however, the outcomes (scenarios) may not have captured that level of attention, resulting in participants assessing the described events as less dramatic and salient
2) This study used a short exposure period, whereas other research has shown that longer exposure periods may influence availability bias
Demographic factors of the participants largely had little effect on the findings. The only significant difference emerged when participants were grouped into two main experience groups (novice and expert). This is unsurprising, as experienced pilots have greater skill sets for operating near adverse weather conditions.
Hindsight bias was also observed, but only when pilots were exposed to certain types of outcome information – e.g. safe, close-call, or crash outcomes.
They found that “The majority of participants who were informed that the flight had ended safely or that the flight ended in a crash demonstrated hindsight bias” (p1133). Thus, “When judging a flight, particularly if the flight ended in a crash, pilots consistently overestimated their likelihood of predicting the event” (p1133).
Interestingly, a sizeable proportion of pilots in the close-call group demonstrated ‘reverse hindsight’ (believing they had originally assigned a lower probability to the event than they really had). The authors speculate that “mental cogitation” may have influenced reverse hindsight: in the safe and crash outcome groups, people could effortlessly form a mental picture of the outcome and find cues to support that outcome.
In contrast, those in the close-call group may have had more trouble associating the situation with relevant and salient cues.
Overall, they suggest that these findings indicate pilots, particularly following close-call events, may “deal with it in a way that may limit any learning opportunities” (p1134).
Regarding steps to move forward, they highlight the inconsistent and patchy evidence around debiasing. While there is some empirical support for certain debiasing measures, a range of other studies have shown debiasing “to be predominantly unsuccessful” (p1135).
Authors: Walmsley, S., & Gilbey, A. (2019). Understanding the past: Investigating the role of availability, outcome, and hindsight bias and close calls in visual pilots’ weather-related decision making. Applied Cognitive Psychology, 33(6), 1124–1136.
Study link: https://doi.org/10.1002/acp.3557
Link to the LinkedIn post: https://www.linkedin.com/pulse/investigating-role-availability-outcome-hindsight-bias-ben-hutchinson