A 2018 presentation from Sid Dekker on success and drift into failure. One of many such talks, and no particular reason to pick this one over any other.
Sid starts with an example he heard that supports the bad apple thesis:
“just get rid of the nurses who make mistakes and all will be a lot safer”
“Now that embodies a very interesting model of what [makes] systems risky and safe … a bad apple theory. Get rid of the bad apple and your system will be fine again”
Then a range of other points from complex systems thinking:
· “Rather than trying to control failure we need to get better at understanding how success is created”
· “oh we have zero days without errors, and they celebrate the absence of negatives as if any of that has any predictive capacity for things going spectacularly wrong. This is a grand illusion … counting the absence of negatives is not a predictor of how things are gonna go wrong”
· “People love counting error-free days and it predicts nothing, in fact it only sponsors the hiding of evidence of things going wrong … it creates a culture of risk secrecy, and cultures of risk secrecy in any community very quickly become dumb cultures: they don’t allow the boss to hear bad news. They don’t allow themselves to learn”
· In the cases that went right versus the cases that didn’t go right, the same factors were found: human errors, guidelines not followed, communication failures, miscalculations, procedural departures
· Some differences were found in the cases that went right: the ability to say stop, past success not taken as a guarantee, diversity of opinion/dissent, and discussions of risk kept alive
· “people who didn’t take past success as a guarantee of future safety were the ones that belong to the twelve [cases that went right] … an acknowledgment of contextual subtleties and nuances and messy details that differ just a little bit, and not assuming that doing the same thing that you did yesterday will lead to success today”
· “how do you keep a discussion about risk alive? You ask double-checking questions by acknowledging your own fallibility. You’re saying ‘I’m not sure’”
· “there’s always this gap between how some bureaucrats imagine the work to be done, [with people] possibly held to all the rules, and how work actually gets done”
