ABSTRACT
Background
Electronic hospital variance reporting systems used to report near misses and adverse events are plagued by underreporting. The purpose of this study is to prospectively evaluate directly observed variances that occur in our pediatric operating room and to correlate these with the two established variance reporting systems in our hospital.
Materials and methods
Trained individuals directly observed pediatric perioperative patient care for 6 wk to identify near misses and adverse events. These direct observations were compared to the established handwritten perioperative variance cards and the electronic hospital variance reporting system. All observations were analyzed and categorized into an additional six safety domains and five variance categories. The chi-square test was used, and P-values < 0.05 were considered statistically significant.
Results
Out of 830 surgical cases, 211 were audited by the safety observers. During this period, 137 (64%) near misses were identified by direct observation, while 57 (7%) handwritten and 8 (1%) electronic variances were reported. Only 1 of 137 observed events was reported in the handwritten variance system. Five directly observed adverse events were not reported in either of the two variance reporting systems. Safety observers were more likely to recognize time-out and equipment variances (P < 0.001). Both variance reporting systems and direct observation identified numerous policy and process issues.
Conclusions
Despite multiple reporting systems, near misses and adverse events remain underreported. Identifying near misses may help address system and process issues before an adverse event occurs. Efforts need to be made to lessen barriers to reporting in order to improve patient safety.
*********
From the full-text paper:
This study compared reporting and underreporting of near miss events in paediatric surgery at a single institution.
They contrasted three streams of data: direct observation of work by trained observers; handwritten perioperative variance cards (implemented as part of a quality initiative, these cards allowed staff to answer open-ended questions like “what happened” and were placed in a collection box for weekly analysis by a staff committee); and an electronic event reporting system.
A lot of the core findings are in the abstract above but some things I found notable were:
- From 830 surgical cases, 211 were attended by safety observers. Direct observation identified 137 (64%) near misses, whereas the handwritten cards captured 57 (7%) and just 8 (1%) were formally logged in the electronic system.
- Safety observers were more likely to recognise time-out and equipment variances. Indeed, the safety observers found “9 times as many variances as the handwritten system and 65 times more than the electronic system” (p339).
- Indirect safety issues (issues which may indirectly affect patient safety outcomes) were the domain most commonly identified by all three reporting systems, followed by safety process variations.
- Differences existed in what was reported between the three systems; namely, electronic reporting was “only used to report knowledge/attitude and policies/process variances. Knowledge/attitude variances were the most commonly reported by the direct observers and the handwritten variance system, while policies/processes were most commonly reported by the electronic variance system” (p338).
- There was little overlap between the variances reported by the safety observers and those captured by the handwritten cards or the electronic system.
- Only one event was reported by both the safety observers and the handwritten card, and only one event was reported via both the card and the electronic system. Safety observers reported a further five adverse events that were not captured by either of the other systems (one example was the use of latex in a patient who had a known latex allergy).
- Variances in the surgical safety checklist, equipment problems, medications and not following isolation protocols were more likely to be reported via direct observation and the handwritten cards.
They state that these findings show that “near misses remain underreported in our pediatric operating rooms despite the availability of a robust variance reporting system” (p339).
That is, “the overwhelming majority of adverse events and near misses are not disclosed” (p339).
Not surprisingly, the findings indicate that the reporting systems should instead be treated as complementary, each augmenting the capture of safety intelligence. Although this data appears to show that near miss info is not being properly utilised, some have likewise “criticized that the common practice of reporting all issues that could have or did lead to patient harm leads to overreporting of unnecessary information” (p339).
Other data has shown that near miss and variance data is unlikely to be reported when it doesn’t result in an adverse event (e.g. one study found 57% of physicians didn’t report a minor near miss event, and another found 57% of residents didn’t report a near miss event when they experienced it). Other data found that hospital staff didn’t report 86% of adverse events, most often “due to misperception about what constituted a reportable event” (p340).
As for barriers to reporting, they note the following, based on prior research (p340):
- Fear of personal or professional repercussions
- Lack of timely feedback
- Scepticism about the utility of reporting
- Cynicism about whether the system will change for the better
- Competing demands for time and workload
Study link: https://doi.org/10.1016/j.jss.2017.08.005
Link to the LinkedIn article: https://www.linkedin.com/feed/update/urn:li:ugcPost:6945491480404717568?updateEntityUrn=urn%3Ali%3Afs_updateV2%3A%28urn%3Ali%3AugcPost%3A6945491480404717568%2CFEED_DETAIL%2CEMPTY%2CDEFAULT%2Cfalse%29