
Some wisdom from Perrow's excellent Normal Accidents.
The book is full of nuggets, so here are some random extracts:
· “But if … the operator is confronted by unexpected and usually mysterious interactions among failures, saying that he or she should have zigged instead of zagged is possible only after the fact”
· “Before the accident no one could know what was going on and what should have been done. Sometimes the errors are bizarre” like “non-collision course collisions”
· “careful inquiry suggests that the mariners had quite reasonable explanations for their actions; it is just that the interaction of small failures led them to construct quite erroneous worlds in their minds, and in this case these conflicting images led to collision”
· “Another ramification is that great events have small beginnings … accidents that start with trivial kitchen mishaps” and “Small failures abound in big systems; accidents are not often caused by massive pipe breaks, wings coming off, or motors running amok”

· “Patient accident reconstruction reveals the banality and triviality behind most catastrophes”
· “Why do we, as drivers, or deck officers on ships, zig when we should have zagged, even when we are attentive and can see?”
· “we construct an expected world because we can’t handle the complexity of the present one, and then process the information that fits the expected world, and find reasons to exclude the information that might contradict it”
· “Unexpected or unlikely interactions are ignored when we make our construction”
· “These systems are particularly vulnerable to small failures that “propagate” unexpectedly, because of complexity and tight coupling”

· “High-risk systems have a double penalty: because normal accidents stem from the mysterious interaction of failures, those closest to the system, the operators, have to be able to take independent and sometimes quite creative action”
· “But because these systems are so tightly coupled, control of operators must be centralized because there is little time to check everything out and be aware of what another part of the system is doing”
· “Time and time again warnings are ignored, unnecessary risks taken, sloppy work done, deception and downright lying practiced” however “it occurs in all organizations, and it is a part of the human condition”
· “Fixes, including safety devices, sometimes create new accidents, and quite often merely allow those in charge to run the system faster, or in worse weather, or with bigger explosives”
· “Some technological fixes are error-reducing”, like jet engines and more, but other technologies “are excuses for poor organization or an attempt to compensate for poor system design”
· “When we add complexity and coupling to catastrophe, we have something that is fairly new in the world”

Ref: Perrow, C. (1999). Normal Accidents: Living with High-Risk Technologies. Princeton University Press.
