
What is the role of frontline people within complex system failures? For James Reason, it was often just providing the local triggers to “manifest systemic weaknesses created by fallible decisions made earlier”.
This 1990 paper goes through his thinking on human performance in complex system failures.
It was meant to be a small post, but I couldn’t be bothered cutting it back, so here we are.
Extracts:
· While human actions are often evident post-disaster, “it is evident that these ‘front-line’ operators are rarely the principal instigators of system breakdown”
· Instead, they tend to provide the local triggering conditions that activate latent systemic weaknesses
· “These accidents arose from the adverse conjunction of several diverse causal sequences, each necessary but none sufficient to breach the system’s defences by itself”
· While modern sophisticated engineered systems are resistant to single failures, which is an achievement, “it carries a penalty. The existence of elaborate ‘defences in depth’ renders the system opaque to those who control it”

· Cheap computing, layered defences etc. means “human operators are increasingly remote from the processes that they nominally govern. For much of the time, their task entails little more than monitoring the system to ensure that it functions within acceptable limits”
· Latent conditions “may lie dormant for a long time, only becoming evident when they combine with local triggering factors (that is, active failures, technical faults, atypical system conditions, etc.) to breach the system’s defences”
· “They are most likely to be spawned by those whose activities are removed in both time and space from the direct human-machine interface: designers, high-level decision makers, regulators, managers and maintenance staff”
· “rather than being the main instigators of these disasters, those at the human-machine interface were the inheritors of system defects created by poor design, conflicting goals, defective organization and bad management decisions. Their part, in effect, was simply that of creating the conditions under which these latent failures could reveal themselves”
· “The more complex, interactive, tightly coupled and opaque the system (Perrow 1984), the greater will be the number of resident pathogens. However, it is likely that simpler systems will require fewer pathogens to bring about an accident as they have fewer defences”
· “The higher an individual’s position within an organization, the greater is his or her opportunity for generating pathogens”
· “It is virtually impossible to foresee all the local triggers, though some could and should be anticipated”, but resident pathogens can be identified
· Nevertheless, Reason recognises that “The resident pathogen metaphor is far from being a workable theory” (as of 1990)
Next Reason discusses the tensions between production and safety decisions:
· All organisations face a tension between production and safety: in the long term the two are compatible, but in the short term they can conflict
· “resources directed at improving productivity have relatively certain outcomes; those aimed at enhancing safety do not”, and “This is due to the large part played by stochastic factors in accident causation”
· “the feedback generated by the pursuit of production goals is generally unambiguous, rapid, compelling and (when the news is good) highly reinforcing”
· “In sharp contrast, that derived from the pursuit of safety goals is largely negative, intermittent, often deceptive and perhaps only compelling after a major accident or a string of incidents”
· “Even when decision makers attend to this feedback, they do not always interpret it correctly. Defensive ‘filters’ may be interposed, which protect them from bad news and encourage extrapunitive reactions”
· Reason takes aim at the problems of defining and eliminating ‘unsafe acts’: “Although certain of these acts may fall into a recognizable category (for example, failing to wear personal safety equipment) and so be amenable to targeted safety programmes, the vast majority of them are unforeseeable and occasionally quite bizarre”
· “A significant number of accidents in complex systems arise from the deliberate or unwitting disabling of defences by operators in pursuit of what, at the time, seem to be sensible or necessary goals”, as with the Chernobyl plant test plan
Next he talks about the importance of an effective safety information system, which some research has found to be the second most important element of a safety strategy distinguishing ‘safe’ from ‘unsafe’ companies (after management commitment):
· Loop 1 relies on reactive incident/injury data. Reason says “In most cases … the information supplied is too little and too late for effective proactive control. The events that safety management seeks to prevent have already occurred”
· Loop 2 covers observed unsafe acts. He says this information, in practice, is “usually only disseminated to the lower, supervisory levels of the organization” rather than to the higher-level decision makers
· Loops 3 and 4 were, as of 1990, said to be the “main thrust” of accident causation work: focusing on the precursors to unsafe acts and on the management systems, structures etc.

· This is based on the logic of tracing upstream and eliminating/mitigating the precursors
· Reason talks about ‘general indicators’ of the system’s safety management, which include the variety and sensitivity of feedback loops
· Another deals with “senior executives’ responses to safety-related data. No amount of feedback will enhance system safety if the information supplied is not acted upon in a timely and effective manner”
· Finally, Reason talks about high reliability organisations (HRO) and their potential promise (again, as of 1990)
· He asks this: “Can we build these adaptive structural ingredients into high-risk organizations at their outset, or must they evolve painfully and serendipitously over many years of hazardous operating experience? It is probably too early to tell”

Ref: Reason, J. (1990). The contribution of latent human failures to the breakdown of complex systems. Philosophical Transactions of the Royal Society of London. B, Biological Sciences, 327(1241), 475-484.
LinkedIn post: https://www.linkedin.com/pulse/contribution-latent-human-failures-breakdown-complex-ben-hutchinson-znxsc