Are audits effective checks and verifications of our risk control systems?
Are they diving deep into the functionality and effectiveness of systems and practices, and evaluating actual daily, hazardous work?
Or are they mostly shuffling paperwork at the expense of operational hazards?
Ref: Hutchinson, B., Dekker, S., & Rae, A. (2024). Audit masquerade: How audits provide comfort rather than treatment for serious safety problems. Safety Science, 169, 106348.

Transcription:
Imagine working somewhere you think has effective risk control processes, because the audit says so. You see the reports, the sign-offs, the procedures. But what if those audits are actually missing the most critical dangers? What if the story of auditing is one where checking the boxes can tragically overshadow what truly keeps us safe, revealing a hidden disconnect between paperwork and reality?
G’day everyone. I’m Ben Hutchinson, and this is Safe As, a podcast dedicated to the thrifty analysis of safety, risk, and performance research. Visit safetyinsights.org for more research. So this is one of my papers, co-authored with Drew Rae and Sidney Dekker in 2024, titled “Audit Masquerade: How audits provide comfort rather than treatment for serious safety problems”, published in Safety Science.
Essentially, we unpacked 71 audit reports, a mixture of first-, second-, and third-party reports, from a large Australian design, engineering, construction, and maintenance company. Over 16 separate and independent auditing firms were included in the data. And what we looked at was: what were the audit reports looking at? So we evaluated the corrective actions and the observations from the reports. We wanted to know what the auditors look at, how they justify their findings, what sort of evidence they use, all those sorts of questions.
First, some background: many different types of audits exist, and they might have different goals and scopes. One line of argumentation is that audits shouldn’t be overly concentrated on compliance against the administrative aspects of the system and documentation. Instead, audits should prioritize addressing tangible and intangible factors linked to actual characteristics, states, influences, objects, or practices in the workplace, to enhance the efficiency of the safety management system in achieving the goals of the organization.
Hence, we argue that a core goal of audits, at least within safety-critical contexts, really should be minimizing decoupling. Decoupling is the distance between the intended purpose or function of something, like a safety plan, versus the actual function in practice of that artifact. So if we have a gas management plan in a mine, decoupling would be gas not actually being managed as the plan expects.
Some criticism of audits is that they can focus on this thing called surface compliance, on the very superficial elements of safety approaches, and audits may over-prioritize the collection and review of documents. Authors Blewett and O’Keeffe, in their evaluation of industry auditing, detailed how paperwork was collected to create an auditable trail, where the quality of information within the documents was often a secondary consideration. Also, they observed a disparity in auditing between “the paperwork that keeps people safe” versus “paperwork that helps complete audits”, their words.
So let’s jump in: what were the results? Well, overall, looking at the corrective actions and observations from these audit reports, we found most of the findings related to resolving some sort of incomplete or missing document; resolving missing site signage; inspecting, placing, or reviewing fairly minor emergency equipment like fire extinguishers and first aid kits; submitting or displaying documents like posters, or sending a register to somebody; or resolving incorrect version numbers or formatting.
We also observed that most of the findings were only moderately or weakly connected to some sort of physical issue or hazard. In other words, only 16% of the findings were strongly connected to a physical issue or hazard. There were virtually no examples of elimination or the other higher-order controls on the hierarchy. And virtually all of the corrective actions that we judged to have a strong connection with a physical issue or hazard on site focused on rectifying immediate or incidental physical conditions, like slips and trips.
What we did find, though, was that audits with specific themes, like electrical or traffic audits, tended to derive slightly better quality corrective actions, or at least better quality in the sense that they were more directly connected to some sort of physical issue or hazard. And not surprisingly, the corrective actions rated as weak, weak in that they had only a very indirect connection to solving some sort of physical issue or hazard, were all administrative. So there really wasn’t a clear link between the administrative action and solving some sort of tangible issue or hazard.
We also looked at audits that unpacked communication on site. And what we found is that even audits geared towards auditing communication practices very rarely evaluated actual communication on site. Of a total of 35 corrective actions or observations focused on communication, only seven commented on either the content of the communication, the effectiveness of the communication, or the quality and retention of the communicated information.
Hence, most audit findings focused solely on the outputs of communication, like a signed toolbox talk or pre-start attendance sheet, or that the content was displayed on a notice board. So basically none of them assessed the quality or content of the communication. It suggests that audits might prioritize verification of artifacts and documents over assessing the quality or content of information.
Also in this sample, we found that communication actions were used to address issues that really didn’t relate to communication. For instance, gaps in risk assessments were addressed via a toolbox talk rather than by fixing the risk assessment itself.
So do audits effectively rein in this decoupling effect? Well, based on these findings, no, not really. The audits rarely dug beneath very superficial matters of documentation and system administration. Even when auditing did focus on operational work and operational hazards, the emphasis shifted to the remediation of very trivial or incidental things, like the slips and trips and cables.
So it really appears that audits might take a find-and-fix approach, focused on immediate site issues rather than a deeper, systematic focus on learning and improvement. Also, the results suggest that there may be a type of masquerade: a symbolic activity configured around surface documents and practices, and unable or unwilling to ask substantive and critical questions about risk and work. Again, of the 327 corrective actions assigned to improve health and safety management, only 16% were strongly linked to a physical source of harm or a hazard, with the remaining corrective actions only weakly or moderately connected to physical issues.
So in discussing the findings, we pointed out that audits in this sample focused on surface compliance, and there were two variations of surface compliance. One we called an illusion of depth: audits were found to address immediate site issues, like implementing that missing lifting equipment register, without calling for investigation of the underlying causal factors. Why did that issue exist in the first place? That never seemed to get asked.
Additionally, no instances were observed of corrective actions directing the auditees to systematically rectify a family of issues. So how does the whole business learn? We also proposed that artifacts take on the guise of the issue: audits frequently focused on revising some sort of document without actually addressing the underlying physical issues. So solving the paperwork was conflated with solving the actual issue.
We also found that audits couldn’t really identify particular issues because of a lack of connection between requirements and specifications. Audits were found to focus a lot on minor specifications, like signatures, templates, and version numbers, but weren’t so well configured to focus on the requirements: what’s the actual purpose of that process? What is it trying to achieve?
But to be clear, we did find benefits of auditing; it doesn’t mean audits didn’t have some sort of value or purpose. Audits were demonstrably effective at identifying readily observable site conditions and lower-tier hazards. They were also effective at verifying the presence of documented systems and deliverables. And they likely have other benefits that we couldn’t measure in this study.
So overall, this evidence highlights that audits were focused largely on evaluating and modifying documents, conflating the presence of a document with evidence of the effectiveness of the process, even though they’re not the same thing. Audits focused on surface compliance activities and lacked a focus on critical components and critical risks. These findings suggest that some auditing approaches may be at risk of tweaking documents or addressing insignificant physical issues at the expense of properly addressing critical issues or hazards.
So audits could provide a false veneer that issues are being properly managed: the audit masquerade. What can we make of these findings? There’s way too much to go through here, but one simple, clear reflection for me is being clear about what we expect from audits in the first place. For instance, is that particular type of audit purely a desktop documentation check? If it is, there’s likely no real misalignment with goals; it’s probably doing what we expect it to do. But if you ask 20 of your leaders in the business what they think audits are doing, I bet someone’s gonna say they think the audits are verifying the effectiveness of our risk systems. If that’s the case, then that needs to be firmly investigated and challenged, because your audits may not be doing that.
Now, of course, there are limitations. Even though we did have 16 separate and independent audit companies as part of the data source, generalisability still has to be considered. Also, importantly, the study is based on document analysis by a person who wasn’t present for those audits, so we don’t have all of the context discussed during the audits.
That’s it on Safe As. I’m Ben Hutchinson. If you found this useful, then please share, rate and review, and check out safetyinsights.org for more research.