Do Large Language Models Show Biases in Causal Learning? Insights from Contingency Judgment

This study found that LLMs inferred causality when no causal relationship existed in medical drug scenarios (‘illusion of causality’).

They created 1,000 medical drug scenarios, which featured either real drugs and real conditions with genuine causal relations, or made-up drugs or conditions, e.g. “Drizzlemorn disorder”.

A causal illusion is something like “I take a pill. I happen to feel better. Therefore, it works”. It is said to stem from “simple intuitions based on coincidences”, and arises in situations where recovery is just as likely whether somebody takes the medicine or not.
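As a rough sketch of the underlying idea (my own illustration, not code from the paper), contingency judgment is often formalised with the ΔP measure: the recovery rate with the drug minus the recovery rate without it. In a null-contingency scenario the two rates are equal, so ΔP is zero and no causal relationship should be inferred:

```python
# Illustration only (not from the paper): the ΔP contingency measure.
# In a null-contingency scenario, recovery is equally likely with or
# without the drug, so ΔP = 0 and no causal relationship is warranted.

def delta_p(recovered_with_drug, total_with_drug,
            recovered_without_drug, total_without_drug):
    """ΔP = P(recovery | drug) - P(recovery | no drug)."""
    p_with = recovered_with_drug / total_with_drug
    p_without = recovered_without_drug / total_without_drug
    return p_with - p_without

# Null contingency: 70% recover either way -> ΔP = 0.0 (no causal effect).
print(delta_p(70, 100, 70, 100))

# The biased intuition ("I took the pill and felt better, so it works")
# attends only to the recoveries with the drug and ignores the baseline.
```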

They tested: GPT-4o-Mini, Claude-3.5-Sonnet and Gemini-1.5-Pro (** so not the latest models as of now).

They found that “all evaluated models systematically inferred unwarranted causal relationships, revealing a strong susceptibility to the illusion of causality”.

They argue “While there is ongoing debate about whether LLMs genuinely “understand” causality or merely reproduce causal language without true comprehension, our findings support the latter hypothesis”.

And “These biases have important real-world implications, particularly in domains where precise causal inference is essential for informed decision-making”.

In other words, the LLMs replicate causal language patterns from their training data rather than demonstrating a genuine understanding of causality.

Of course, several limitations were present, one being the lack of a human comparison group.

** I’m not sure whether the authors speculate about the susceptibility of newer models to the same causal illusions, but other research seems to support their findings.

Ref: Carro, M. V., Mester, D. A., Selasco, F. G., Marraffini, G. F. G., Leiva, M. A., Simari, G. I., & Martinez, M. V. (2025). Do Large Language Models Show Biases in Causal Learning? Insights from Contingency Judgment. arXiv preprint arXiv:2510.13985.

Study link: https://arxiv.org/abs/2510.13985

LinkedIn post: https://www.linkedin.com/posts/benhutchinson2_ai-llm-activity-7387186054555967488-ugLr?utm_source=share&utm_medium=member_desktop&rcm=ACoAAAeWwekBvsvDLB8o-zfeeLOQ66VbGXbOpJU
