Safe As podcast e18: When emotion leads risk – risk as feelings and not just numbers

Risk in safety is often framed in matrices as likelihood × consequence. This framing holds an allure of (semi-)objectivity – the numbers are the numbers.

But what is the role of emotion and feelings within our risk judgements? Today’s article argues that what we ‘feel’ about risk precedes and influences what we ‘think’ about risk.

This pod’s article is: Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2013). Risk as analysis and risk as feelings: Some thoughts about affect, reason, risk and rationality. In The feeling of risk (pp. 21-36). Routledge.

Make sure to subscribe to Safe As on Spotify/Apple, and if you find it useful then please help share the news, and leave a rating and review on your podcast app.

I also have a Safe As LinkedIn group if you want to stay up to date on releases: https://www.linkedin.com/groups/14717868/?lipi=urn%3Ali%3Apage%3Ad_flagship3_detail_base%3Bhdg8uJYYT%2BmsMqZvpHBmdQ%3D%3D


Shout me a coffee (one-off or monthly recurring)

Spotify: https://open.spotify.com/episode/3zh7GSkKzdb3x2O9a1cFxT?si=ypog0liaSL65ZEX2cKKugQ

Apple: https://podcasts.apple.com/us/podcast/e18-when-emotion-leads-risk-risk-as-feelings-and-not/id1819811788?i=1000720902821

Transcript:

Have you ever made a decision where the numbers screamed one thing but your gut told you something else entirely? Maybe you’re more terrified of flying than driving, even though statistics consistently show driving is far riskier. Perhaps you felt an inexplicable dread about something that, on paper, poses little threat.

We’re constantly bombarded with definitions of risk, probabilities, data, objective assessments. We’re told to be rational, to trust the numbers, to leave emotion out of it when it comes to understanding danger. But that uneasy feeling in your stomach, that sudden surge of fear, isn’t just irrational noise. What if it’s a primal, lightning-fast intelligence, honed by millennia of human survival, telling you something critical about the world around you?

Good day everyone, I’m Ben Hutchinson, and this is Safe As, a podcast dedicated to the thrifty analysis of safety, risk and performance research. Visit safetyinsights.org for more research.

Today’s paper, “Risk as Analysis and Risk as Feelings: Some Thoughts About Affect, Reason, Risk and Rationality” by Paul Slovic and his colleagues, published in 2004 in Risk Analysis, explores how we actually understand and react to risk. It argues that emotion, or affect—that’s A-F-F-E-C-T, not E-F-F-E-C-T—plays a crucial role in our decisions. The paper challenges the traditional idea that emotions make us irrational, instead suggesting that truly smart decisions keep both our logical and emotional thinking working together.

Now this paper covers a lot of ground, but fundamentally, we confront risk in three ways: Risk as feelings. These are our fast, instinctive, intuitive reactions to danger. Risk as analysis, which brings logic, reason, and scientific deliberation to bear on hazard management. And when these ancient instincts conflict with our modern, structured, analytical frames, we become painfully aware of a third reality—what Slovic, in another paper, calls risk as politics.

So some of the key findings in this paper: there are really two simple ways we can think about how people actually think and process the world. We have an analytical type of processing. This uses logic, numbers, and formal rules. It’s slow, it takes effort, and we’re consciously aware of it. It’s driven by reason and needs evidence to justify conclusions. We also have an experiential processing mode. It’s intuitive, fast, and mostly automatic. It relies on images and associations directly linked to emotions and feelings. This processing type represents risk as a feeling that tells us whether it’s safe to walk down this dark street or drink this strange-smelling water. It’s holistic, it’s guided by gut feelings from past experiences, and it enables immediate action. This processing type was vital for human survival and remains our most natural way to respond to risk.

So these processing types work side-by-side and depend on each other. Analytical reasoning isn’t effective unless it’s guided by emotion. Now, a quick update: you’ve probably heard about System One and System Two thinking, popularized by Kahneman and others. More recent thinking, based on a larger body of evidence, suggests there probably isn’t really a System One and a System Two. Rather, it’s probably more accurate to call them “types.” They’re not distinct systems, but types of processing in the brain. So I’m going to stick with types.

So, the role of affect and the affect heuristic: Affect is a faint whisper of emotion. It’s an automatic feeling of goodness or badness about something—how we emotionally respond and feel about it. These are affective reactions happening quickly and automatically, helping us to navigate a complex world. So the affect heuristic describes how we rely on these feelings as a mental shortcut. Using an overall emotional impression is far easier and faster than weighing pros and cons analytically, especially when decisions are complex or we’re short on mental resources or time.

How does affect—these feelings—influence our judgments? Some things that really heavily influence our judgments include dread and outrage. People’s feelings of dread, or “dread risk,” about certain risks like nuclear power heavily influence their perception and acceptance, often differing significantly from expert assessments. This is linked to factors like voluntariness and controllability. The less control people believe they have over something, or the more involuntarily they feel exposed to it, the worse they generally feel about it.

And while risk and benefit often go together in reality, people tend to judge them as inversely related. If we feel good about an activity, we judge its risk as low and its benefits as high. For instance, we typically feel good about the freedom that driving provides us, and we feel in control of those decisions to drive, so we rate the risk as lower. If we feel bad about something, we judge high risk and low benefit. This happens even faster under time pressure. For instance, people judge “good” stocks as high return, low risk, and “bad” stocks as low return, high risk, simply based on their overall feeling.

And the feelings towards risk can be disconnected from the statistical risk. For example, we fear shark attacks, but not the drive to the beach. And importantly, emotion precedes analytical judgments—how we feel about something shapes what we think about something.

Looking at probability and frequencies, there’s this idea called “imagining the numerator.” When given a choice, people sometimes choose a bowl with a higher number of winning items—for instance, seven in a hundred—over one with better odds, one in ten, because seeing more winning items feels like a better choice. And also, how we frame the risk matters. If we frame it as “10 out of every 100 patients,” it can sound more dangerous than a “10% chance,” because the frequency creates more vivid, frightening images. So importantly, vivid, emotionally charged warnings are more effective than bare statistics. Another example is a doctor framing the risk of surgery: “one death in 100 surgeries” sounds a bit scarier than “99 out of 100 successful surgeries.”
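For anyone who wants to check the arithmetic behind that bowl example, here’s a minimal sketch (the bowl sizes come from the example above; the code itself is purely illustrative):

```python
# "Imagining the numerator": 7 winners out of 100 is objectively a worse
# bet than 1 winner out of 10, yet seven winning items are easier to
# picture, so many people prefer the bigger bowl anyway.

big_bowl = 7 / 100    # 7 winning items among 100 -> a 7% chance
small_bowl = 1 / 10   # 1 winning item among 10  -> a 10% chance

print(f"Big bowl: {big_bowl:.0%}, small bowl: {small_bowl:.0%}")
# The feeling ("more winners!") points one way; the odds point the other.
better = "small bowl" if small_bowl > big_bowl else "big bowl"
print("Better odds:", better)
```

The gap between what the numbers say (the small bowl wins) and what the vivid numerator suggests is exactly the experiential–analytical split the paper describes.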

Also, when we’re evaluating interventions like in healthcare and medicine, people often respond more strongly to the proportion of lives saved (for instance, “this intervention will save 98% of participants”) than to the actual number (“150 lives were saved”), because proportions carry clearer emotional meaning.

People can also be really insensitive to probabilities. For outcomes with strong emotional meaning, like a lottery jackpot or cancer, changes in probability often have little impact on our feelings. The way we feel about a lottery, for instance, tends to be the same whether the chance is 1 in 10 million or 1 in 10,000. This helps explain why people gamble and buy insurance simultaneously, or why fears about hazards like nuclear power persist despite extremely low probabilities of major failures. Though major failures have still happened, of course.

How can these feelings go wrong—what are the failures of the experiential system? While usually helpful, sometimes these feelings can lead us a little astray. For instance, manipulation: our emotional reactions can be exploited by advertising or propaganda. Cults, for instance, play on this very well. There are also inherent biases and noise, like something called psychophysical numbing: we’re sensitive to small changes near zero (for instance, zero to one deaths), but struggle to emotionally grasp changes further from zero (500 to 600 deaths). As a 1947 Washington Post article about Stalin depressingly quipped, “If only one man dies of hunger, that is a tragedy. If millions die, that’s only statistics.” Our risk judgments are also sensitive to visceral reactions. Strong feelings in the present, like hunger or cravings, are hard to accurately recall or anticipate in the future.

So as I said, we’ve covered a lot of ground there. What can we make of these findings, being very simplistic about a very large, broad body of research? It’s a bit difficult to distil practical implications, but how about some of these? Several of them come from the paper itself.

One suggestion is using reason to calm strong emotions. When our “risk as feeling” processing type overweights frightening but remote consequences like terrorism, our analytical mind can sometimes be really effective in providing perspective on the actual likelihoods. Not always—sometimes you need to fight emotion with emotion. They give an example in the paper: fear might lead someone to buy a handgun, but analysis shows it’s far more likely to harm their own family than an intruder.

Another suggestion is adding doses of feeling to logic. Even highly analytical tasks, like proving a theorem or playing chess, benefit from intuition—like a mathematician sensing if a proof “sort of looks good,” or a chess master feeling a move “feels right.” So risk analysis also needs to consider softer values like dread or fairness that drive public concerns, and sort of counteract this psychophysical numbing and make statistics feel real. Effective communication can use vivid personal stories or physical representations. Another example from the paper was piling 38,000 pairs of shoes to represent handgun deaths.

So in conclusion, meaning is deeply tied to our emotions. Numbers and statistics about risk, unless they’re infused with affect (feelings), may not truly represent risk, or how we perceive risk.
