This paper undertook a text analysis of 48 RCA investigations to explore how RCA methods facilitate the exploration of systematic patient safety improvements.
Notable challenges in the application of RCA were covered in the paper, including the translation of RCA methods into practice. For healthcare, one challenge is applying these techniques to complex social contexts (e.g. ‘wicked problems’), with some believing that RCA has “limited usefulness” in that context. Further, it’s said that research is “united in arguing that the interconnectedness and complexity of healthcare is not acknowledged in RCA” (p109).
The text analysis drew on qualitative methods including principles from Diane Vaughan’s Challenger ethnography.
Results
Overall, text analysis suggested that RCA methods in this sample did not sufficiently identify latent organisational issues and causality was disconnected from the proposed solutions/corrective actions. Active errors by people dominated RCAs “due to a lack of systemic process of analysis”.
For the environment category, no RCAs referred to governance, legislation or political agendas as contributing factors/influences. Culture, however, was a recurrent theme across 33% of reports.
Interestingly, references to culture were primarily directed towards teams [rather than references involving organisations, I guess]. Nevertheless, ideas of culture were said to be “lost from the text and investigation then went on to introduce more and more new ideas as the analysis progressed” (p113), which ultimately led back to a focus on the workforce/individuals.
Not surprisingly, references to human behaviour and practices (frequently framed as errors) largely didn’t explore these factors as latent conditions; instead, “the intricacies and complexities of investigating [serious events] were one-dimensional and not systemically interconnected” (p113). When latent conditions were identified, they were discounted and disappeared from RCAs as more active conditions were included.
In one example, despite identifying active errors, the RCA concluded that no contributing healthcare system or process could be identified. In another example, the author notes that some latent concerns disappeared from, or were diluted in, RCA investigations “even when they appeared to have contributed directly to a patients’ death” (p113).
For creating lessons learnt or recommendations, there was no observable determination of how these were identified based on the RCA; nor did RCAs make a connection “between human error and how human factors connect within broader organisational contexts” (p113). RCAs were confounded by a lack of analytical structure regarding the identification of recommendations – consistent with other research on RCAs.
For the organisational category, “the complexity and volume of gaps and breaches within the organisation were not analysed by the RCA” (p113). It’s said that despite RCAs being well positioned to clarify latent conditions, “the language from section to section in the RCA remained disjointed” (p113). Disjunction in RCA reports “provided an unscientific approach” (p113).
While workarounds were common, RCAs failed to identify organisational/latent factors surrounding them. This is despite 47% of RCA solutions recommending a change in policies or procedures to mitigate the risk factor – things which “are not sufficiently attentive, provide ineffective solutions and result in little change” (p114). Specifically, 22% of actions referred to a new procedural document – despite new documents appearing to have done little to improve safety relating to serious events.
For the individual category, it’s said that while active errors/factors were recognised in the RCAs and blame was contraindicated under company policy, blame was still present in the text of every RCA. The identification of human actions or mistakes “were abundant in the RCA but were unrecognised in terms of contributing system(s) failure(s)” (p114). It’s noted that while RCAs attempted to reveal multiple layers of human performance, they failed to analyse them in relation to organisational factors.
In discussing findings, it’s said that the language in RCAs “becomes diluted as complex issues fade from the analysis and are discounted and ignored from organisational learning” (p114). Language decoys (a shift from one idea to another unrelated or disconnected to the initial idea) were common in RCAs. These contributed to the creation of solutions with “little or no relevance to the situational description” (p114).
Further to language decoys, it’s said that language is fragmented and shifted in RCAs to the point that connections and links in and between failures aren’t visible in the text, rendering analyses inadequate to promote effective learning and leaving organisational factors hidden.
In discussing the focus on active/individual factors, it’s said that blame is active in RCA and represented across the organisation. While “blame is not the intention of the RCA”, the author argues “the structure and function of the RCA makes this unavoidable” (p114). Language shifts the focus to individuals and away from deeper system factors, with harm in the RCA “conceived as a failure of multiple individuals” (p115).
Based on this research it’s said that in RCAs “language decoys circumvented latent system failures, root causes were avoided, and recommendations were arbitrarily applied” (p115). Further, “The fundamental problem is that a systems approach cannot be applied because the RCA does not address organisational rules that are overlooked, or why individual and recurring safety deviations that result in harm are disregarded” (p115).
Author: Karen Singh, 2018, Safety Science
Study link: https://doi.org/10.1016/j.ssci.2017.12.006
Link to the LinkedIn article: https://www.linkedin.com/feed/update/urn:li:ugcPost:6835674547048579072?updateEntityUrn=urn%3Ali%3Afs_feedUpdate%3A%28*%2Curn%3Ali%3AugcPost%3A6835674547048579072%29