Counteracting the Cultural Causes of Disaster


This 1999 article by Andrew Hopkins discusses the cultural factors that helped to incubate the 1994 Moura mine disaster in Australia.

He starts by saying that Turner’s work demonstrated that “all socio-technical disasters involve an information or communication failure of some kind, in that they are preceded by a series of ‘discrepant events’ which are ignored or discounted”.

Culture is frequently given as a reason why people fail to acquire and use, in advance of a major accident, the information that could avert the disaster.

Vaughan’s work highlighted that organisations can “develop distinctive cultures or ways of seeing the world which are simultaneously ways of not seeing”. Gherardi referred to organisational culture as a “cause of blindness and as the origin of numerous failures of foresight”, as did Pidgeon, who talked about cultures as “institutional vulnerability”.

An approach that simply tries to get people to be more aware of or alert to safety, such as “constant exhortation”, “ignores the organisational sources of safety and the role which particular organisational arrangements play in generating ideas and action conducive to safety”.

Hence, if culture is to be useful in this context, it needs to be understood in terms of organisational systems rather than just what is in the heads of workers. As Hopkins puts it, “A safety culture understood in these terms is not brought about by a focus on changing the individual workers but, rather, by changing systems”.

The Moura mine disaster in 1994 resulted from a methane explosion. Hopkins argues that the disaster incubated because of cultures and belief systems which, in a sense, inoculated people against any sense of vulnerability.

Coal heats up slowly when exposed to air, and unless the area is well ventilated the temperature will rise. This is called a heating, or spontaneous combustion. The temperature may eventually reach the ignition temperature of methane, triggering an explosion.

The Hierarchy of Knowledge: The Primacy of Personal Experience

The first cultural factor related to the value placed on personal experience. Hopkins argues that, put simply for the sake of argument (recognising that much is omitted here), knowledge is acquired in three ways: personal experience, oral communication, and written communication.

At Moura “Knowledge based on personal experience was by far the most influential: it was both better remembered and judged to be more reliable than information gained in any other way”.

Word of mouth was also more influential than written communication.

When a CO reading of 19 L/min was obtained and communicated to senior staff, the response wasn’t to check the prior records but for somebody to go down and personally check (personal experience). They did not observe high CO readings or the telltale smells, and they didn’t ask anybody else who was present in the mine.

“Rather, the response was to personally ‘check’ on the matter, to see for oneself, by doing another measurement”. Nothing was found to substantiate the reports of high CO readings.

When a supervisor also reported a benzene smell, both orally and in writing, the ventilation officer went and checked for himself; he didn’t observe the smell. He judged the report to have been in error (again, personal experience).

The priority given to personal experience is said not to have been just an unconscious tendency; it was explicitly documented in the line of questioning at the inquiry.

Hopkins argues that this “philosophy proved to be one of the most serious safety hazards at Moura. Its flaw is obvious. Where warning signs are fleeting and intermittent, as is the case with many of the indicators of spontaneous combustion, it is likely that signs detected by one observer will not be present when a second goes down to ‘check’”.
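To make the weakness concrete, here is a minimal illustrative sketch (my own numbers, not from Hopkins’ paper): if a genuine warning sign is intermittent and only observable a fraction of the time, a personal follow-up check will usually fail to confirm it. The 30% presence figure below is purely an assumption for illustration.

```python
# Illustrative only: a genuine but intermittent warning sign that is
# observable just 30% of the time (an assumed figure, not Moura data).
presence_probability = 0.3

# Chance that a single "see for yourself" follow-up check confirms it:
confirm_one_check = presence_probability

# Chance that two independent follow-up checks both miss the sign:
miss_two_checks = (1 - presence_probability) ** 2

print(f"One follow-up check confirms the report: {confirm_one_check:.0%}")
print(f"Two follow-up checks both miss the sign: {miss_two_checks:.0%}")
```

Under that assumption, a single follow-up check confirms the original report only 30% of the time, and there is roughly a 50% chance that two separate checks both miss the sign, which is exactly the situation in which a genuine warning gets dismissed.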

The Priority of Oral over Written Communication

The second part of the hierarchy-of-knowledge belief system also nullified warning signs: it placed priority on oral over written forms of communication.

Hence, if something was to be taken seriously, it had to be communicated orally, like during handover meetings. Written reports were often not read by managers and there was no system for extracting key info from reports.

Hence, “a safety matter which was missed at the time it was first reported, effectively disappeared without trace under a pile of paper”. If something was written but not orally communicated then it could be dismissed as unimportant.

Oral reporting can be unreliable though, and this was shown during the inquiry: people claimed to have made oral reports to their superiors, which the superiors denied receiving.

Since oral reporting had priority over written, one might assume that the changeover/handover meetings between shifts would have formally ensured that critical information was discussed, but this didn’t happen. Rather, “miners were expected to brief each other during the ‘hot seat changeover’”.

While the hierarchy of knowledge is said to be largely psychological (we prefer to check for ourselves, and prefer oral over written), at Moura it went beyond this and was a “conscious set of beliefs about the weight to be given to different types of information”.

The Tendency to Discount Unwanted Evidence: The Culture of Denial

Hopkins talks about how “new evidence appears reliable and informative if it is consistent with one’s initial beliefs; contrary evidence tends to be dismissed as unreliable, erroneous or unrepresentative”.

If evidence runs contrary to our beliefs, it places us in a state of cognitive dissonance, said to be an “unpleasant state”. Cognitive dissonance is resolved by adjusting either the belief or the perceived plausibility of the evidence. As expected, “Where the belief is strong, it is the evidence which is adjusted”.

There was a belief at Moura that there was no significant risk of spontaneous combustion. Challenging this belief would have involved disruption to the production schedule (during an already pressured situation).

Therefore, this was “clearly a situation in which one would expect to find a strong tendency to dismiss any contrary evidence”.

There was also a belief that there was an incubation period of around six months before spontaneous combustion could occur. The source of this myth was another mine explosion.

A Different Mining Method: ‘Normalising’ the Evidence

There was also a belief system in play that normalised contrary evidence. After some time the CO levels had reached more than 14 L/min, which didn’t prompt any reconsideration of the risk, because the rate of rise was slow and steady.

The slow and steady rise “served to rationalise or normalise the high readings which were being obtained”. There was a belief that the rise needed to be rapid to signify immediate danger. This was a mistaken belief.

Vaughan’s work on the Challenger disaster was similar: O-ring performance problems had been observed on prior launches at low temperatures, but the O-rings hadn’t “failed totally”, so over time the malfunctioning was “reconceptualised as normal and the risk of total failure came to be judged acceptably low”. Vaughan called this the normalisation of deviance.

Hence, unless there was an exponential rise in CO, people were not concerned, instead believing a slow and steady rise to be normal. Hopkins says this thinking “was totally confused”. He notes that while exponential rises are associated with heatings, by that stage it’s too late.

Hence, “An exponential rise is not a warning of potential danger; it is an indicator that a fire is already raging”. Heatings can also occur without exponential rises, as at Moura.
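To illustrate the distinction, here is a minimal sketch with made-up numbers (they are mine, not Hopkins’ or the Moura figures): a CO make that rises slowly and steadily never satisfies a rule that waits for an “exponential” jump, yet it quietly passes an absolute trigger level.

```python
# Illustrative only: hypothetical CO make readings in litres per minute,
# showing a slow, steady rise; these are not actual Moura figures.
readings = [6, 7, 8, 9, 10, 11, 12, 14]

TRIGGER_LEVEL = 10          # hypothetical absolute trigger level (L/min)
ACCELERATION_FACTOR = 2.0   # crude proxy for an "exponential" jump

for i in range(1, len(readings)):
    current, previous = readings[i], readings[i - 1]
    accelerating = current >= ACCELERATION_FACTOR * previous
    over_trigger = current >= TRIGGER_LEVEL

    # Waiting for acceleration (the Moura-style belief) never fires here,
    # even though the absolute level quietly passes the trigger.
    if over_trigger and not accelerating:
        print(f"Reading {i}: {current} L/min exceeds the trigger level, "
              "but the rise still looks 'slow and steady'")
```

In Hopkins’ terms, acting on the absolute level rather than the shape of the curve is the safer rule, because by the time a rise looks exponential a fire is already raging.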

The Onus of Proof

Another belief system which contributed to the culture of denial concerned the onus of proof. The prevailing belief was that the mine was safe unless proved otherwise.

A high CO reading recorded by a supervisor was treated in the same way: doubted unless proved otherwise by the investigating party, who couldn’t replicate the finding.

Alternative explanations were created to explain the high reading (other than a heating), e.g. faulty instrument or reading.

Several alternative explanations were created, and since “this explanation cannot be dismissed, this casts doubt on whether the findings can be used as evidence of a heating”.

A similar phenomenon existed prior to the Challenger launch. Quoting the paper, “Management asked for certainty that the O-rings would not work. The engineers could not provide this certainty. It was therefore presumed that the O-rings would work”. Hence, the launch during the abnormally cold conditions went ahead.

Hopkins asks the question: when doubt is raised about a decision that could kill people, who has the burden of proof? Is it those who raise the concern or those who manage the operations? “All too often, it seems, risky operations are assumed to be safe unless it can be proved otherwise”.

Putting the Indicators in Context

Another belief at Moura was that little reliance could be placed on any one indicator in isolation. Hence, any indicator of potential harm had to be considered within its overall context.

For example, “Indicators only took on meaning when viewed in their overall context”. The smell of benzene that several people observed (a sign of a heating) was not seen as a warning sign unless it was associated with another indicator (like high CO readings).

Likewise, high CO readings could be dismissed if they weren’t coupled with an exponential rise.

Therefore, “a report of a smell only became an indicator of a heating if it was confirmed by other reports”. No single indicator in isolation was seen as a warning sign.

He says that on the surface this might seem reasonable, but it was problematic. First, if an indicator wasn’t coupled with another indicator, it could be “ignored”.

Second, if an indicator only becomes worrying in context, then surely the appropriate response is to investigate that context. This could include checking how many other reports of smells there had been, and so on. These questions weren’t asked.

Therefore, if context is critical, then fair enough, but the natural response is then to ensure that the context is established for the isolated indicators. In the case of Moura, “to make no systematic effort to establish the context, nullified the effect of the warning”.

Hopkins then discusses trigger action response plans, which I’ve skipped.

He also discusses the role of accountability – where “Decision makers must be accountable for their decisions”. One way to drive this is to require the decision makers to “commit to writing their decisions and reasons for them”.

Ref: Hopkins, A. (1999). Counteracting the cultural causes of disaster. Journal of Contingencies and Crisis Management, 7(3), 141-149.


Study link: https://doi.org/10.1111/1468-5973.00107

LinkedIn post: https://www.linkedin.com/pulse/counteracting-cultural-causes-disaster-ben-hutchinson-gg0kc
