Do leading indicators work as expected?
This scoping study evaluated 48 studies to explore the question.
(Note: PDF shared under the CC BY 4.0 open access licence)
Extracts:
· While most studies reported some positive impact of leading indicators on lagging indicator performance, “overall the evidence base was weak”
· Interestingly, it appears that most of the research has evaluated leading indicator performance against lagging indicator outcomes, rather than measuring improvement in the inputs themselves
· Main limitations were that “(1) study designs were not appropriate for determining causality; (2) internal validity of studies was moderate to low; (3) studies were poorly generalizable. The biggest challenge was the inability to compare findings across studies”
· Between 1 and 11 lagging indicators were measured in each study to evaluate leading indicator performance, including “injuries (n = 25), accidents (n = 18), incidents (n = 14), near misses (n = 8), lost time (n = 6), fatalities (n = 5), compensation claims and costs (n = 4), sickness/illness (n = 3), safety (n = 1), or ‘other’ (n = 2)”
· “Twenty seven of these studies provided statistically significant findings demonstrating the favorable impact of leading indicators. However, the remaining 20 studies did not clearly report appropriate methods or statistical significance testing; it was unclear whether the findings of these studies reflected real effects or were due to chance”
PS. Check out my new YouTube channel: https://www.youtube.com/@Safe_As_Pod
