Decision Errors and Accidents – Applying Naturalistic Decision Making to Accident Investigations

This paper discussed two different accidents and contrasted their investigative approaches using the naturalistic decision-making (NDM) framework.

This is a really long summary given the richness of the paper. Hang with it to the end as I think the author has some interesting insights to share.

For background – even in dynamic and ill-structured situations, experienced decision makers can still quickly recognise and respond to what they face, a capability termed naturalistic decision making. One frame for decision making is termed the classical view. This sees decision making as people weighing up the pros and cons of different options. It may be more applicable to largely static situations with little direct harm (e.g. buying furniture).

It’s said to have little applicability to the dynamic and often rapidly changing situations facing people in complex and safety-critical industries. In these situations, NDM and recognition-primed decision making describe how people make decisions when facing “ill-structured situations; shifting, ill-defined, or competing goals; time pressures; and potentially severe consequences of “poor” decisions” (p2).

That is, rather than evaluate different options and weigh up pros/cons, they respond to cues in the environment and similarities to their own experience via pattern matching.
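To make the contrast concrete, here is a minimal toy sketch (my own illustration, not from the paper) of the two framings. All cues, patterns, scores and thresholds are invented for the example:

```python
# Toy contrast (illustrative only, not from Strauch's paper):
# classical weighing of options vs recognition-primed pattern matching.

def classical_choice(options):
    """Classical view: score every option's pros and cons, pick the best."""
    return max(options, key=lambda o: sum(o["pros"]) - sum(o["cons"]))

def recognition_primed_choice(cues, repertoire, threshold=2):
    """RPD sketch: return the response of the first remembered pattern
    whose cues sufficiently overlap the current situation."""
    for pattern in repertoire:  # ordered by how typical/recent the experience is
        if len(cues & pattern["cues"]) >= threshold:
            return pattern["response"]
    return "seek more information"  # no adequate match recognised

# Invented repertoire loosely themed on the marine accident:
repertoire = [
    {"cues": {"rough bar", "incoming tide", "small vessel"}, "response": "wait"},
    {"cues": {"calm bar", "outgoing tide"}, "response": "cross"},
]

print(recognition_primed_choice({"rough bar", "small vessel", "rain"}, repertoire))
# -> "wait": two cues match the first remembered pattern
```

The key design difference the paper describes: the classical chooser must enumerate and score every option, while the recognition-primed chooser commits to the first adequately familiar pattern — which is fast, but can also produce plan continuation when the match is stale.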

Some earlier research highlighted how decision mismatches stemmed strongly from insufficient decision-maker experience, insufficient information, or incomplete or inadequate mental simulation.

Other researchers attributed mismatches to decision makers’ situation awareness – in this context, their understanding of and response to the situations encountered – largely a factor of human information-processing limitations in the face of ambiguous or changing cues.

It’s argued that “Accurate situation awareness is the first and arguably most critical step in NDM and therefore in effective decision making” (p3).

One type of mismatch is a plan continuation error. These describe “not uncommon” phenomena where people continue with a course of action in dynamic circumstances despite contrary information suggesting a different course of action is warranted.

Although experience helps to build the repertoire of patterns to draw on, it can also bind people to an expectation of success even in the face of significant warning signs of harm. Plan continuation errors are difficult to reverse, since reversal requires additional cognitive effort to recognise that the situation has changed since the initial decision was made and that a different course of action is needed.

Interestingly a similar bias was found in a study of criminal investigators who worked on multiple cases. Investigators “devoted more resources to “emotionally charged” cases than to [ostensibly] more routine ones” (p3). Further, when under time pressure, more experienced investigators allocated resources based either on their experience or using their first impression of the evidence (both instances of pattern matching).

Background on the evaluated accidents:

  1. Accident 1 related to a fishing vessel capsizing in rough waters. The investigators applied some of the NDM framework to learn about the incident – situating decisions within context.
  2. Accident 2 was the crash of a passenger plane. As the author highlights, the investigators described what decisions were, or apparently were, made by the pilots, but not really why and how those decisions came to make sense to the pilots.

In short, it’s said that while the aviation investigators “addressed the pilots’ deviation from the airline’s approach and landing procedures” they did not evaluate the pilots’ “decision to continue descending beyond the minimum descent altitude”. In contrast, the marine investigators explored the captain’s decision making.

Results from the investigation reports

The author discusses factors that influenced the captain’s decision to cross the bar during rough conditions (I’ve skipped most of this). While the captain waited at the marina for the weather to improve, some other boats successfully crossed the bar.

Further, the captain’s vessel had been loaded with passengers and taken out some way while waiting for conditions to improve. Had the captain turned back to dock, he would have had to explain to the passengers, who had been waiting some time for the expedition, that they had to return rather than cross the bar (which some other vessels had successfully navigated).

Compounding this factor was that the passengers and the captain knew each other, amplifying the potential disappointment for the captain.

The investigators believe that the captain likely assessed the chance of crossing the bar successfully based on his prior experience – similar situations he had likely faced over many years.

Nevertheless, the other vessels that had crossed the bar during the thunderstorm were actually larger and more powerful vessels.

The investigators described in their report how the captain would have faced high stakes from a “wrong” decision, shifting or competing goals (the safety of returning to dock vs passenger satisfaction from crossing the bar), and uncertainty in the dynamic sea state.

While waiting at dock, the captain continually monitored weather and bar conditions, but there would still have been considerable unpredictability in forecasting what bar conditions would be like during the actual crossing.

The report discussed the levels of situation awareness involved and the forecasting of conditions based on pattern matching (given the dynamic and unpredictable nature of those conditions).

The report noted that the captain’s “confidence in his ability to cross [the bar] successfully, accrued over many years, may have led him to minimize the hazards he faced” (p6).

Notably, when pattern matching and experience are applied during times of uncertainty, “in many cases they will be correct, but not always” (p6).

Supporting the NDM framework, it’s said that the captain didn’t evaluate the costs and benefits of alternative options on crossing or waiting, but “rather responded to the circumstances he observed, based on his experience” (p6).

Use of NDM and other methods has given investigators the skills needed to better understand operator decisions in dynamic systems.

In contrast, investigators in the aviation accident “largely focused on operator action errors, without explaining them or examining the decisions that may have led to those actions” (p7).

However, the aviation accident occurred 30 years prior to the marine one, and thus according to the author, the aviation investigators didn’t have the benefit of the more sophisticated knowledge of NDM and performance that the marine investigators did. [** from the NDM/RPD perspective this may be true, but I believe HF/E *was* rich enough during the 1970s to better inform the investigation, especially if you look into the Francophone ergonomics sphere – not even including Fitts & Jones’ work in 1947 aviation as highlighted by Sid Dekker.]

I stress this point because I’m not convinced the construction industry has particularly and systemically advanced much beyond that 1970s aviation accident report.

Of further interest is how investigators evaluate events. Quoting the paper, “Investigators do not apply empirical logic to their analyses and do not subject their conclusions to the hypothesis testing called for in empirical research” (p7).

That is, investigators typically use “legal logic of the preponderance of evidence” rather than “inferential statistics, to answer counterfactual questions that can explain error causation and the role of error in an accident” (p7). For instance, the author asks whether the operator would have continued their course of action had particular factors not preceded it.

He notes that while the marine investigators did apply inferential logic and counterfactual analysis for hypothesis testing, the aviation report leaned more on legal logic. Again, I’m not convinced most investigations across industry (construction, oil & gas etc.) apply this same hypothesis testing and use of logic.

[Note, the author isn’t referring to the counterfactual reasoning traps as described by Johan Bergstrom, but rather in the form of upward and downward counterfactuals for exploring hypotheses.]

In wrapping up the summary, other notable things:

·      Research from NASA found plan continuation errors were understandable and logical, since the cues obvious to investigators after the fact were not always clear to the decision maker before the event [e.g. issues of hindsight and outcome biases]

·      Both accidents highlight how people make decisions in dynamic situations but are sometimes unable to accurately project system status into the future

·      People in the heat of the moment, when making sense of their situation and decisions, must both try to understand and respond to the situation (e.g. steer a plane) while also trying to cognitively diagnose the fault or issue, and this may outpace their cognitive resources

·      He notes “The work of accident investigations and NDM research is, to a large extent, symbiotic” (p8)

Author: Strauch, B. (2016). Decision errors and accidents: Applying naturalistic decision making to accident investigations. Journal of Cognitive Engineering and Decision Making, 10(3), 281-290.

Study link: https://doi.org/10.1177%2F1555343416654629

Link to the LinkedIn article: https://www.linkedin.com/pulse/decision-errors-accidents-applying-naturalistic-ben-hutchinson
