
This 2006 paper from Sid Dekker critically challenges the assumptions underlying “error counting” in safety.
Some points (not a summary):
Error background
- Error counts by organisations are seen to have numerous benefits – providing “an immediate, numeric estimate of the probability of accidental death, injury or other undesirable event”
- However, categorising errors requires “a number of assumptions and take certain philosophical positions”, yet these are rarely made explicit in the method description
- Error counting rests on a “naïvely realist idea that social phenomena (including errors) exist as facts outside of individual minds, open for objective scrutiny by anybody with an appropriate method”

- “The reality of an observation is socially constructed. The error becomes true … only because a community of specialists have developed tools that would seem to make it appear, and have agreed on the language that makes it visible”
- “There is nothing inherently “true” about the error at all. Its meaning is merely enforced and handed down through systems of observer training, labeling and communication of the results, and industry acceptance and promotion”
- Even though errors appear real and factual to the observer, this isn’t necessarily so and “Facts privilege the ruling paradigm”
- “To the naïve realist, the argument that errors exist is not only natural and necessary. It is also quite impeccable. The idea that errors do not exist, in contrast, is unnatural. It is absurd”
- Here, a quote from Reason, who took this realist view, is provided: “Indeed, there are some psychologists who would deny the existence of errors altogether. We will not pursue that doubtful line of argument here.” (Reason & Hobbs, 2003, p. 39)

Complex systems
- “The formal descriptions of work embodied in policies, procedures and regulations – and implicitly imposed through error counting – are incomplete as models of expertise and success”
- “In a world of finite resources, uncertainty and multiple conflicting goals, the knowledge base for generating safety in complex, risky operations is inherently and permanently imperfect (Rochlin, 1999), and no externally dictated logics of an error categorization system can arbitrate in any lasting way between what is safe or unsafe”
- “Where the creation of safety appears to have everything to do with people learning about, and adapting around, multiple goals, hazards and trade-offs (Rasmussen, 1997), deeper investigation of most stories of “error” show that failures represent breakdowns in adaptations directed at coping with such complexity”
- “Practitioners and organizations continually assess and revise their approaches to work in an attempt to remain sensitive to the possibility of failure”
- “Efforts to create safety, in other words, are ongoing”
- “Safety does not exist “out there”, independent of people’s minds or culture, ready to be measured by looking at behavior alone”
- “safe operations should consider safety as a dynamic, interactive, communicative act that is created as people conduct work, construct discourse and rationality around it, and gather experiences from it”
- Practitioners “actively engage operational and organizational conditions to intersubjectively construct their beliefs in the possibility of continued operational safety”
- “Safety is in some sense a story a group or organization tells about itself and its relation to its task environment”


Study link: https://citeseerx.ist.psu.edu/document?repid=rep1&type=pdf&doi=14d15a68409057ad5df9fccd960b47f57c69b911
My site with more reviews: https://safety177496371.wordpress.com