
More practical wisdom from the late, great Trevor Kletz. This paper explored a few elements of learning from experience – like the problems of just blaming people or changing procedures.
First he targets the focus on changing procedures instead of improving designs. Our first step, wherever reasonably practicable, should be to remove the hazard (elimination or inherently safer design). For example, can we use a safer material instead of a toxic one?
If the risk cannot be removed, it should be reduced. This can include passive protective equipment and other engineering means.
Next comes reliance on human action, like engaging protective equipment. This is, of course, often the least reliable. He lists human action last in the hierarchy, but this is “not intend[ed] to diminish its value”.
Humans are essential, especially during unintended variability and faults. However, “Safety by design should always be our aim but is often impossible”.
And while human actions and behavioural approaches can result in considerable improvements, this approach “has had little effect on process safety”.
Therefore, “Behavioral methods should not be used as an alternative to the improvement of plant design or methods of working when these are reasonably practicable”.
He provides an example of people falling down stairs. The inherently safer option is to avoid the use of stairs altogether by building a single-storey building, or to use a ramp (or a lift).
When that is not reasonably practicable, intermediate landings should be installed to limit the length of any fall, and spiral designs should be avoided. Carpet and other soft materials can soften the impact of a fall.
However, an active option like a lift “is expensive and involves complex equipment that is liable to fail, expensive to maintain and easily neglected”.
Moreover, the “procedural solution is to instruct people to always use the handrails, never to run on the stairs, to keep them free from junk and so on”. And while these rules can be backed up with behavioural techniques, they are likely to be the least effective option and the most difficult to implement successfully.
In Kletz’s view, in some companies:
“the default action after an accident is to start at the wrong end of the list of alternatives and recommend a change in procedures or better observation of procedures, often without asking why the procedures were not followed”.
Were the procedures problematic because they were too complex, unclear, or rarely enforced? These questions aren’t asked. Changing procedures is the quickest and cheapest option, but also the least effective.

In contrast, Kletz notes that while designers often consider inherently safer options, the authors of accident investigations more often do not. Some reject this simple explanation that investigations struggle to consider design improvements, expecting the answer to be more complicated.
Perhaps, also, “it goes against the traditional belief that accidents are someone’s fault and the job of the investigation is to find out who it was”.
He gives examples of design issues (I’ve skipped most in this article), like operators mixing up valves. Fitting interlocks would be a more complicated and costly option, whereas rearranging the pipework so that the valves in the same line were opposite each other would be a better option, though more difficult.
He argues that “we do not grudge spending money on complexity but are reluctant to spend it to achieve simplicity”.
In other examples, he says that the default approach is to first look for procedural changes, such as in responding to alarms. He gives an example of simple colour-coding interventions (like for pipework or valves). He also gives an example of confusing colour coding on a roll-on, roll-off ferry: people were told to return to their vehicles via the blue stairs, and a person was injured after confusing the blue stairs with turquoise stairs.
“Just tell people to follow the rules”
He continues on this logic of focusing on rules and people rather than better designs. He uses examples showing how people can develop optimising behaviours to save time, behaviours reinforced by cues in the environment that turn out to be incorrect.
In one instance, an investigation report noted the “failures to follow procedures: the people who drained and isolated the steam line did not inform those responsible for purging the instruments”.
The report also recommended that managers stress the importance of following procedures. However, “There was no suggestion … that the procedures might be improved, for example, by fitting a warning notice on lines that are out-of-use, or that the design could be improved by fitting a check valve in the steam line”.
Blaming the operator rather than the software
Next he points to examples where investigations blame people rather than exploring the role of the environment/systems.
An operator had to switch a spare transformer in place of the working one using a remote computer in the control room. He inadvertently isolated the working transformer. The investigation blamed operator distraction, and suggested “greater formality in preparing and following instructions when equipment is changed over”.
Kletz observes that the report didn’t call for better software controls. For example, he says “it should be simple for the computer program, when the computer is asked to isolate a transformer, to display a warning message such as, “Are you sure you want to shut down the electricity supply?””.
Why then do we get warnings from our word processors about deleting files but not from specialised software? Notably, “There is no need for control programs to be less user friendly than word processors”.
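To make the point concrete, here is a minimal sketch (mine, not Kletz’s) of the kind of cheap software guard he describes. The names (`Transformer`, `isolate_transformer`) and the prompt wiring are hypothetical illustrations, not anything from the paper or a real control system.

```python
# Hypothetical sketch of a confirmation guard in a control program.
# Names and structure are illustrative only, not from Kletz's paper.

from dataclasses import dataclass


@dataclass
class Transformer:
    name: str
    is_live: bool  # True if this transformer is currently supplying the plant


def isolate_transformer(transformer: Transformer, confirm) -> bool:
    """Isolate a transformer, warning first if it is the one carrying the load."""
    if transformer.is_live:
        # The simple software check Kletz suggests: ask before cutting the supply.
        if not confirm(
            f"{transformer.name} is currently supplying power. "
            "Are you sure you want to shut down the electricity supply? (y/n) "
        ):
            return False  # operator backed out; nothing is isolated
    # ... send the actual isolation command to the switchgear here ...
    transformer.is_live = False
    return True


if __name__ == "__main__":
    spare = Transformer("Transformer B (spare)", is_live=False)
    working = Transformer("Transformer A (working)", is_live=True)

    ask = lambda prompt: input(prompt).strip().lower() == "y"
    isolate_transformer(spare, ask)    # no prompt: isolating the spare is harmless
    isolate_transformer(working, ask)  # prompts before dropping the live supply
```

The guard is a few lines of code, which is Kletz’s point: control programs need not be less user friendly than word processors.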
He gives other examples of the “default action” of investigations being to describe how human behaviour should have been different rather than “to look for ways of changing the behavior of the equipment”.
“Don’t assemble it wrongly”
Kletz continues on the limits of focusing on people over designing better human-centred environments and technology.
He says that designers in one investigation reported: “There was nothing wrong with the design. The maintenance (or construction) team assembled it wrongly”.
While this may seem sound on the surface, importantly, “Equipment should be designed so that it cannot be assembled wrongly or at least so that it is obvious if it has been”.
Believed in one industry but not in another
He talks about how issues are not seen as relevant between different industries, even when they really should be [** Woods et al. called this phenomenon ‘distancing by differencing’].
He cites examples of flammable materials and ignition sources. Designers of one system foresaw the possibility of a flammable material entering an inert system, but “had assumed that it could not explode as there was no obvious source of ignition”.
Problematically, “It is now widely recognized in the oil and chemical industries that it is impossible to remove every source of ignition with 100% confidence, and therefore, we should prevent the formation of flammable mixtures of gas or vapor and air”.
He says [as of 2004, of course] that the aviation industry hadn’t learnt the same lessons about the inevitability of ignition. He notes that vapour spaces on 747 wings are often located near heat sources, and “are flammable for more than one-third of the operating hours”.
As such, “sources of ignition can never be completely eliminated. It is hubris to imagine that we can infallibly prevent a thermodynamically favored event”.

In conclusion, Kletz argues that those responsible for facilities, designs, management decisions and the like must get into the weeds – “It is not sufficient for them to take a helicopter view that shows only the forests. They should land the helicopter and look at some of the trees and even the twigs and leaves”.
Ref: Kletz, T. A. (2004). Learning from experience. Journal of Hazardous Materials, 115(1–3), 1–8.

Study link: https://doi.org/10.1016/j.jhazmat.2004.05.007
LinkedIn post: https://www.linkedin.com/pulse/learning-from-experience-ben-hutchinson-s8j2c