On folk models, ontological alchemy and other critical perspectives in risk

A few extracts, probably with little to no link between them, but each offering a critical perspective on techniques, their worldviews and their applications.

Not systematic. Refs at bottom of article.

Dekker on ‘ontological alchemy’

Dekker argues that, just as alchemists tried to turn base metals into gold, practitioners and scholars perform ontological alchemy by trying to transmute qualitative, often intangible, aspects of human experience or organisational reality into quantifiable data.

That is, ‘figments’, like mental constructs, subjective perceptions or nuanced real-world experiences, are turned into ‘figures’, like categories, quantities, numbers and metrics.

For instance, rather than being understood as a multifaceted human phenomenon, trust is reduced to a score on a questionnaire.
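As a minimal sketch of the kind of reduction being criticised (the items, scale and scoring below are hypothetical, not from Dekker and Nyce’s paper):

```python
# Hypothetical Likert items standing in for the 'figment' of trust.
likert_responses = {
    "I trust my supervisor to act on safety concerns": 4,  # 1-5 scale
    "I trust the incident data we report": 3,
    "I trust management's commitment to safety": 2,
}

# The multifaceted, situated experience becomes a single 'figure':
# one mean score stands in for the whole construct.
trust_score = sum(likert_responses.values()) / len(likert_responses)
print(f"Trust = {trust_score:.2f}")  # Trust = 3.00
```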

This process can also create an illusion of objectivity and scientific rigour: once something is given a number, the “analytic machinery stops”, and deeper inquiry into its nature, context and meaning is neglected.

Ultimately, this process of transmutation can disconnect concepts from the real-world phenomena they are meant to describe.

In essence, “ontological alchemy” is a critique of how certain research and management practices can oversimplify complex human and organisational phenomena by reducing them to easily quantifiable metrics, potentially losing crucial context and understanding in the process.

Downer on the principle of mechanical objectivity

Downer criticises an overly literal interpretation of data and ‘facts’: the belief that complex technological properties of risk and reliability are “wholly, objectively, and quantitatively knowable through formal rules and unbending algorithms”.

One belief is that these systems, and particularly quantitative risk methods with their ostensibly precise numbers and probabilities, provide an objective process ordered by formal rules and algorithms, purportedly allowing measurements and deductions grounded in “incontrovertible, reproducible, and value-free truths” (Downer, 2009, p. 9).

He calls this the principle of mechanical objectivity. It includes expert pronouncements about nuclear risk like “The math is the math” and “It’s not an opinion, it’s a calculation”.
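As a toy sketch of the style of calculation this principle valorises (the top event, failure probabilities and independence assumption below are all invented for illustration):

```python
# Invented per-demand failure probabilities; in a real probabilistic risk
# assessment these inputs are themselves judgement-laden estimates.
p_pump_fails = 1e-3
p_backup_fails = 5e-3
p_operator_error = 1e-2

# AND-gate logic: the top event occurs only if all three fail together.
# Treating the failures as independent is an assumption, not a fact.
p_top_event = p_pump_fails * p_backup_fails * p_operator_error
print(f"P(top event) = {p_top_event:.1e}")  # 5.0e-08: "the math is the math"
```

The formal rules mechanically yield a precise-looking figure, while the judgement baked into the inputs disappears from view.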

He argues that these purely objective constructions of risk, supposedly cleanly separable from subjective constructions, are inconsistent with wider bodies of research.

In another paper, however, Downer argues that even within the use of formal methods and logics, apparently objective ‘facts’ still require human judgement.

Hollnagel & Dekker on folk models

These authors argue that many operating ideas within the safety and psychology world rely on ‘folk models’. These are common-sense, intuitive and often unscientific explanations of human performance and error.

While the models are easy to grasp, they explain by substitution: instead of really explaining a phenomenon, folk models substitute one label for another. Relabelling an error as a ‘loss of situational awareness’, for example, still doesn’t explain attention, nor how the awareness ‘was lost’.

Folk models are said to often be immune to falsification, and to rely on overgeneralisation.

Folk models can create false impressions of how well understood a phenomenon is, when they are really just relabelled phenomena. They can also lead to superficial analyses, as in accident investigations that stop at a ‘loss of situational awareness’ or a ‘lack of attention’.

The authors advocate for focusing more on observable characteristics of human performance (noting that this paper was from 2004).

Probative blindness from Rae et al.

These authors propose that not all safety activities actively create greater ‘safety’ (what they call ensurance), nor do they all provide greater subjective confidence about the degree of safety (assurance).

Probative blindness is a phenomenon where safety activities provide stakeholders with subjective confidence in safety that is disproportionate to the actual knowledge those activities provide about real world problems.

Probative blindness can lead to the illusion of assurance: the safety work we perform (audits, inspections, risk assessments etc.) creates a feeling or belief of safety without genuinely improving our knowledge of the issue, or the actual margins of safety.
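One way to make the idea concrete is a small Bayesian sketch (our illustration, not Rae et al.’s formalism; all the probabilities are invented): an audit that almost always ‘passes’ whether or not a hazard is present carries almost no evidential weight, yet a pass still feels reassuring.

```python
prior_unsafe = 0.10        # invented prior belief that the system is unsafe
p_pass_if_safe = 0.95      # the audit passes a safe system...
p_pass_if_unsafe = 0.90    # ...but almost always passes an unsafe one too

# Bayes' rule: how much should observing a 'pass' shift our belief?
p_pass = p_pass_if_safe * (1 - prior_unsafe) + p_pass_if_unsafe * prior_unsafe
posterior_unsafe = p_pass_if_unsafe * prior_unsafe / p_pass

print(f"Belief system is unsafe: {prior_unsafe:.3f} -> {posterior_unsafe:.3f}")
# 0.100 -> 0.095: the pass barely changes what we know, even though
# subjective confidence in safety may rise substantially.
```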

Probative blindness can thereby contribute to a drift of risk, separating beliefs about safety from operational reality.

Tierney on the veneer of science in risk analysis

Like Downer, Tierney was critical of the uncritical acceptance of ‘objective’ risk assessments, analyses and the like.

That is, analytical and apparently objective data and techniques “merely add the veneer of science to what are essentially political and economic decisions”.

To believe that sociopolitical and subjective influences are entirely separate from objective facts and processes “is to reject an overwhelming body of historical and social-scientific evidence”.


Shout me a coffee (one-off or monthly recurring)

Refs:

Dekker, S. W., & Nyce, J. M. (2015). From figments to figures: Ontological alchemy in human factors research. Cognition, Technology & Work, 17, 185-187.

Dekker, S., & Hollnagel, E. (2004). Human factors and folk models. Cognition, Technology & Work, 6, 79-86.

Downer, J. (2014). Disowning Fukushima: Managing the credibility of nuclear reliability assessment in the wake of disaster. Regulation & Governance, 8(3), 287-309.

Rae, A. J., McDermid, J. A., Alexander, R. D., & Nicholson, M. (2014, October). Probative blindness: How safety activity can fail to update beliefs about safety. In 9th IET International Conference on System Safety and Cyber Security (pp. 1-3). Stevenage, UK: IET.

Tierney, K. (2012). L’analyse des risques et leurs dimensions sociales [The analysis of risks and their social dimensions]. Télescope, 16(2), 93-114.

LinkedIn post: https://www.linkedin.com/pulse/folk-models-ontological-alchemy-other-critical-risk-ben-hutchinson-mcclc
