This systematic review evaluated 34 papers, exploring normalisation of deviance (NoD) in the context of its key themes and interactions.
NoD was first described by Diane Vaughan in her work evaluating the 1986 NASA Challenger space shuttle accident. As per Vaughan’s definition, NoD describes “the gradual process through which unacceptable practice or standards become acceptable. As the deviant behaviour is repeated without catastrophic results, it becomes the social norm for the organisation”.
Note that this doesn't really refer to people maliciously breaking rules, but rather to a social process whereby "small changes—new behaviors that were slight deviations from the normal course of events—gradually become the norm, providing a basis for accepting additional deviance". Further, potential signals of danger are incrementally normalised: they rarely lead to overt failures, and they are largely under-appreciated because they drip-feed through rather than hitting all at once. These changes "became imperceptible to many within the organisation itself" (p6).
I highlight the above to avoid attributing the phenomenon to lazy or immoral leaders, poor resourcing, a lack of an effective safety culture, or other myopic ideas; Vaughan went to some lengths to dispel these and to paint a more nuanced narrative (e.g. NoD, culture of production, production of culture, structural secrecy).
It’s 45 pages, so I had to skip a lot of the findings and just pick a few points.
Results
Across the studies, NoD themes & interactions were grouped into risk normalisation, production pressure, culture, and a lack of negative consequences; a network of organisational, social and technical processes contributes to the phenomenon. Each factor is discussed separately below.
Note I’ve skipped most of the stuff in each of these categories.
1. Risk normalisation:
A number of studies explored how people become desensitised to risks in their environment. In France, risks associated with nuclear energy were noted to have been normalised in French media following the 2011 Fukushima accident.
Normalisation of risk “may be largely seen as an adaptive response, facilitating functionality in the presence of circumstances outside one’s control” (p24).
They cite data from industrial settings which highlight some organisational preconditions that facilitate risk normalisation. As an example, they note “operators are often assigned high levels of personal responsibility despite possessing low levels of actual control over their environments and performance of tasks.” (p24).
Other things they discuss:
- Risk normalisation is a core feature of NoD, though adaptations may occur in its absence. Even so, behaviours are unlikely to be repeated if the associated risk is perceived to be high, so risk normalisation more likely than not accompanies NoD. Risk normalisation requires people to develop an increased risk threshold, whereby they lose acuity for perceiving vulnerabilities.
- Periods of perceived successes or absent negative events may exacerbate a loss of perceived vulnerability to failure by increasing confidence disproportionate to risk.
- The introduction of new protections or barriers is also said to affect perceptions of risk; adding protection "generally increases perceived safety, which may unwittingly encourage employee perceptions of system invulnerability" (p25). Viewing protections as "solutions rather than fail-safes to known problems" may incentivise optimising for efficiency over thoroughness.
2. Organisational Factors:
Production pressure –
Production pressure was frequently discussed as a contributor to NoD. It’s commonly cited as a moderating variable in NoD that guides the direction and magnitude of deviations.
A "cycle of failure" is said to be propagated, where risks are normalised and people re-evaluate their costlier, more conservative thresholds to match perceived current conditions. The shift is said to move towards greater efficiency.
Evidence was cited from rail, where continuous pushes for punctuality and on-time performance increase pressure on train operators to maintain schedule via efficiency optimisations. One specific change noted is that "driver interpretation of signals has shifted over time in order to facilitate faster train movement", resulting in more signals passed at danger.
In contrast, other industry data on motivations for process departures show that in many instances there is a push to minimise harm (rather than a myopic drive to increase output and profits). An example is healthcare, where departures are often justified on the grounds of minimising patient discomfort and eliminating counterproductive measures; these are seen as serving the greater good.
Another effect, termed a 'melioration bias', results in a tendency to prioritise options that maximise short-term gains over long-term ones, and has driven other changes in operating procedures. One example is firefighters who did not don all required equipment because it hinders movement and impedes life-saving action.
3. Procedure/Environment Design
Inappropriate implementation of procedure & organisational design are factors in the initiation & maintenance of NoD.
Poor workplace design, resources, processes etc. provide a justifiable means for people to depart from established rules, with some arguing that following all rules makes it nigh impossible to complete work as expected.
Moreover, it’s argued that “the very presence of deviance inherently signals towards potential flaws within a system’s environment or work process” (p29).
Examples from healthcare were shown, e.g. poor placement of hand hygiene stations decreasing hand-washing compliance, or malfunctioning barcode scanners disrupting the entire workflow and prompting staff to skip the scanning process.
In other workplaces, workarounds are informally accepted as expected practice to fill the gaps, since the organisation doesn't invest in amending poor processes or design. One example from the transport industry: on some journeys, operators simply learn to drive while the system is in a continuously alarmed state, blunting their response to clear cautionary signals.
4. Leadership
Several points about leadership are covered. One is that leaders are instrumental in setting expectations and resources within the organisation. Their decisions, indecisions, actions, inactions, language etc. play a role in facilitating NoD, with downstream consequences.
Mostly known issues are covered in this section, so I've skipped most of it. However, they highlight how leaders can reinforce a production-over-safety mindset (and may not even realise they have this effect), which may "[result] in the dismissal of warning signs and the encouragement of workarounds in the interest of production" (p30).
5. Cultures
Just a few points here. One is that although these facets (leadership, environmental design, production pressure, culture) are covered separately, many have elements that are overlapping, interacting and inseparable. They discuss culture in this context, highlighting how the social norms, social image, values and more of firefighters can encourage behaviours with greater risk acceptance, while at the same time promoting trust, courage and concern for others.
Cultural concepts, including an organisation’s history, externally projected image, the working environment, social identities, values, norms etc. may both foster adaptive and maladaptive practices & beliefs.
6. Lack of negative consequences
Absence of negative consequences is frequently covered in the NoD literature. When departures fail to result in obvious adverse outcomes, this may be taken as evidence that the initial standards were overly conservative. Thus, "This perception justifies deviations as acceptable evolutions of the productive process, wherein behaviour is merely adapting to maximise efficiency. In the absence of consequence such behaviours are prone to repetition" (p33).
7. Pre-Emptive Response
Pre-emptive responses are the measures taken to anticipate, identify and prevent the propagation of maladaptive practices. These include proactive measures, such as attending to near-misses and other signals, as well as retrospective learning activities.
Issues here relate to:
- A bias to discount originally proposed risks where there’s a lack of incentive for reporting them
- Subjective and vague criteria on what to report
- Inappropriate or cumbersome reporting systems and processes; making it difficult to do the right thing
- Blame that can eventually result when things are reported
Discussion
The authors then discuss the paper's findings. One point, based on their conceptual model, is that "the prevention of harmful NoD may need to focus on the initial normalisation of risk; more specifically, ensuring that operator perceptions of risk do not degrade over time" (p35).
The discussion also highlights the many organisational factors that contribute to NoD, such as how production pressure encourages workarounds, or how various (sub)cultures, social norms and structural conditions alter work and perceptions and discourage the reporting of events.
They note that their systems view of NoD places an "emphasis on understanding the impact of latent failures, framing active failures as biproducts of a flawed system rather than vice-versa" (p27).
Authors: N Sedlar, A Irwin, D Martins, R Roberts – 2021, psyarxiv
Study link: https://doi.org/10.31234/osf.io/tqphk
LinkedIn article: A Qualitative Systematic Review on the Application of the Normalisation of Deviance Phenomenon Within High-Risk Industries