Mini-post: Near misses & disaster

This is an interesting article from Catherine Tinsley, Robin Dillon & Peter Madsen. I’ve posted a heap of their research.

Their work flies strongly in the face of the conventional wisdom on tracking near misses. Rather than being clear warning signs of major events just waiting to be found and heeded, near misses can frequently be interpreted as signs of success (that systems and controls are working).

In fact, a number of their studies found that including near-miss information actually increased risky decisions in organisations rather than reducing them, because people interpreted near misses as near successes. This finding held even when chance/luck was the only difference between scenarios.

Aptly, they note that “near-misses [masquerade] as successes” and “Recognizing and learning from near misses isn’t simply a matter of paying attention; it actually runs contrary to human nature”.

Two factors, among many, said to contribute to this effect are:

1) the normalisation of deviance, where, very gradually over time, elevated risks and anomalies come to be considered the norm because they don’t result in overt failures

2) outcome bias, where people pay attention to and reward successful outcomes rather than focusing “on the (often unseen) complex processes that led to [those outcomes]”.

Link: https://hbr.org/2011/04/how-to-avoid-catastrophe

Link to the LinkedIn post: https://www.linkedin.com/posts/benhutchinson2_how-to-avoid-catastrophe-activity-6892592292524376064-RSbc?utm_source=share&utm_medium=member_desktop
