Cognitive and Human Factors in Expert Decision Making: Six Fallacies and the Eight Sources of Bias

This paper discusses six fallacies about bias and eight sources of bias that influence decision making and investigations.

The focus is on forensics and similar fields, but it's still a useful read.

Note that bias isn’t an inherently bad or undesirable thing.

First, six fallacies of bias are presented:

The first fallacy concerns ethics: "Many incorrectly think these biases are an ethical issue of corrupt individuals or even malicious acts by unscrupulous people", when in fact cognitive bias "actually impacts honest and dedicated examiners".

Bias is not a matter of dishonesty, personal character, or integrity, but of brain architecture and sensemaking, among other factors.

The second fallacy relates to 'bad apples'. People are frequently blamed (either directly or indirectly) for errors, rather than systemic and design issues being recognised. The authors note that while some errors are related to competency, these tend to be easier to detect and remedy; cognitive biases, by contrast, are widespread, "not due to incompetency", and harder to detect.

The third fallacy is expert immunity: there "is a widely incorrect belief that experts are impartial and immune to biases". Yet nobody, not even an expert, is immune to certain biases.

Moreover, expertise and experience can “make experts engage in more selective attention, use chunking and schemas (typical activities and their sequence), and rely on heuristics and expectations arising from past base rate experiences … which create a priori assumptions and expectations”.

The result is that experts can sometimes be more susceptible to bias than laypeople. They cite research highlighting how experts tend to be very confident in their knowledge or expertise but then "perform worse than novices".

The fourth fallacy is technological protection, where people inadvertently believe that using technology, instrumentation, automation or machine learning etc. eliminates bias. However, even when technology is used “human biases are still at play because these systems are built, programmed, operated, or interpreted by humans”.

Technology can also create a false sense of safety.

The fifth fallacy is the bias blind spot, where people are unaware of their own biases.

The sixth fallacy is the illusion of control. Even when experts are aware of their biases, they "nevertheless think they can overcome them by mere willpower", resulting in an illusion of control.

Worse, they note that "trying to deal with bias by the illusion of control may actually increase the bias" due to "ironic processing" or "ironic rebound". That is, drawing attention to a bias may amplify its effects.

Next the authors discuss eight sources of bias, shown below. I’ve skipped heaps from this section of the paper.

Some forms of data can themselves be biasing, like analysis of voice, handwriting, bitemarks and the like. Reference materials can also bias how data are perceived and interpreted: in fields like DNA and fingerprinting, follow-on decisions can end up being made to fit the reference material rather than the evidence.

Hence, "Rather than the actual evidence being the driver of the decision making process—where evidence is interpreted based on the data it contains—the suspect's profile is driving the decision making". Instead of going from evidence to suspect, or data to theory, reference materials cause backward, circular reasoning from the target back to the evidence.

Contextual information also biases: experts are often exposed to irrelevant information which can influence their analytical processes, causing task-irrelevant information to be incorporated and other information to be overweighted, underweighted or neglected.

It’s explained that “expectation biases what and how information is represented and processed in the brain” and that these “biases impact experts and cannot be properly controlled by mere willpower”.

Another source of bias relates to the base rate. Experts work from previous experience, and this carries over to new cases. While this often helps with sensemaking, analysis of a current situation can be skewed by prior experiences that have little relevance to it.

Base rate bias can mean that factors rarely encountered in past cases are then neglected in future ones. Another example is a hospital that receives far more vaccinated COVID patients than unvaccinated ones, which might lead one to conclude that the vaccine is ineffective. However, far more people had the vaccine, so a greater number of vaccinated patients would be statistically expected; ignoring the base rate here leads to a skewed and misinterpreted understanding of the issue.
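The hospital example can be made concrete with some illustrative numbers (my own assumptions, not figures from the paper): even when a vaccine sharply reduces individual risk, a highly vaccinated population will still supply more vaccinated patients in absolute terms.

```python
# Illustrative numbers (assumptions for this sketch, not from the paper):
# 90% of the population is vaccinated, and vaccination cuts the risk of
# hospitalisation from 1% to 0.2% (i.e. 80% effective).
population = 1_000_000
vaccinated_share = 0.90
risk_unvaccinated = 0.01
risk_vaccinated = 0.002

vaccinated = population * vaccinated_share        # 900,000 people
unvaccinated = population - vaccinated            # 100,000 people

hosp_vaccinated = vaccinated * risk_vaccinated      # ~1,800 patients
hosp_unvaccinated = unvaccinated * risk_unvaccinated  # ~1,000 patients

# Naive reading: more vaccinated patients in hospital, so the vaccine
# "doesn't work". Accounting for the base rate shows per-person risk
# is actually about 5x lower for the vaccinated.
more_vaccinated_in_hospital = hosp_vaccinated > hosp_unvaccinated
risk_ratio = risk_unvaccinated / risk_vaccinated
```

Despite the hospital ward being dominated by vaccinated patients, the per-person risk comparison points the other way, which is exactly the inference the base rate fallacy obscures.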

A raft of organisational factors are sources of bias. In the forensic space, these have been documented in the interpretation of DNA evidence, where analysis is often undertaken within the “adversarial legal system”, leading to biases like an allegiance effect and myside bias.

That is, many forensic labs are said to be part of law enforcement agencies or departments of justice rather than independent private labs. Other factors relate to time pressure, expectations to reach certain results, stress, budgets, pressure to publish and more.

Many factors also relate to education and training – I’ve skipped most of this section.

Another source of bias is personal factors, like motivation, personal ideology and beliefs.

The final source of bias that this paper covered relates to human and cognitive factors. It’s said that “The workings of our brain create architectural and capacity constraints that do not allow it to process all the incoming information” and therefore the brain “engages in a variety of processes (mainly known as “top-down”) to make sense of the world and data around us”.

The mind "is not a camera", and the "active nature of human cognition means that we do not see the world 'as it is'". A range of factors are implicated here, like social interactions, in-group biases, availability biases, processing fluency, and more.

They then discuss the roles of snowball and cascade biases. Bias doesn’t “impact only the individual in isolation or just one aspect of the work; often the bias cascades from one person to another, from one aspect of the work to another, influencing different elements of an investigation”.

Thus, biases can cascade and gather “momentum and snowball”.

The paper then briefly covers debiasing techniques – some are listed below (p. 8003):

· Use of blinding and masking techniques to prevent exposure to irrelevant information

· Use of methods like Linear Sequential Unmasking to control the sequence, timing and linearity of exposure to information, minimising "going backward" and being biased by reference materials

· Use of case managers to help screen and control what information is given to whom, and when

· Using blind, double-blind and proper verifications when possible

· Having multiple hypotheses and alternative conclusions

· Adopting differential diagnostic approaches, where different conclusions and their probabilities are presented
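To illustrate the last point, here is a minimal sketch (my own construction, not from the paper) of a differential-diagnosis style report: rather than a single categorical conclusion, each plausible conclusion is presented alongside its probability. The conclusion labels and likelihood numbers are hypothetical.

```python
def differential_report(likelihoods: dict[str, float]) -> list[tuple[str, float]]:
    """Normalise raw likelihoods into probabilities and rank them."""
    total = sum(likelihoods.values())
    probs = {label: value / total for label, value in likelihoods.items()}
    return sorted(probs.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical likelihoods an examiner might assign to competing conclusions:
report = differential_report({
    "match": 6.0,
    "inconclusive": 3.0,
    "exclusion": 1.0,
})
# report → [("match", 0.6), ("inconclusive", 0.3), ("exclusion", 0.1)]
```

The design point is that alternatives and their relative weights stay visible to the decision maker, instead of being collapsed into one verdict early in the process.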

Reference: Dror, I. E. (2020). Cognitive and human factors in expert decision making: Six fallacies and the eight sources of bias. Analytical Chemistry, 92(12), 7998–8004.

Study link: https://doi.org/10.1021/acs.analchem.0c00704

LinkedIn post: https://www.linkedin.com/pulse/cognitive-human-factors-expert-decision-making-six-eight-hutchinson
