Overcoming Bias

Overview

This skill explores some of the underlying ways in which our thinking tends to be flawed in different situations – and what it might mean to become more aware of these flaws and to mitigate them. In particular, it looks at how some forms of research and investigation may oversimplify real-world situations, and at the ways in which bias can be not just a feature of our cognition but also the product of structural problems with our attempts to measure, assess, and understand the world.

We’ll begin by looking at three factors that underlie many of these forms of bias and faulty assessment: attaching undue significance to random events or coincidences; failing to consider things that didn’t happen, but could have done; and assuming that reality is simpler and more predictable than is actually the case.
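To make the first of these factors concrete, here is a minimal Python sketch (the flip counts and thresholds are illustrative choices, not drawn from the skill itself) showing how readily pure randomness produces streaks that feel meaningful:

```python
import random

random.seed(42)  # fixed seed so the sketch is reproducible

def longest_streak(flips):
    """Length of the longest run of identical consecutive outcomes."""
    best = run = 1
    for prev, cur in zip(flips, flips[1:]):
        run = run + 1 if cur == prev else 1
        best = max(best, run)
    return best

# 10,000 experiments: flip a fair coin 100 times, record the longest streak.
streaks = [longest_streak([random.choice("HT") for _ in range(100)])
           for _ in range(10_000)]

print("Average longest streak in 100 fair flips:",
      round(sum(streaks) / len(streaks), 1))   # typically around 7
print("Share of trials containing a streak of 8 or more:",
      round(sum(s >= 8 for s in streaks) / len(streaks), 2))
```

Long runs of identical outcomes arise routinely in pure noise; treating one as a meaningful pattern is precisely the kind of undue significance this skill warns against.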

We’ll explore what it means to employ constructive doubt, and how this differs from both extreme relativism (declaring that all perspectives are equally valid) and cynicism (declaring that different claims and beliefs are made primarily for self-interested and thus unreliable reasons). We’ll look at the habits that can support constructive doubt – and reflect upon what it might mean to put these into practice.

The skill then looks in more depth at particular, predictable species of error embodied in some research methods. The law of small numbers describes how apparently noteworthy results can arise from nothing more than small sample sizes, while reversion to the mean describes how apparently significant changes can reflect nothing more than the tendency of an extreme event to be followed by a less extreme one. For example, the worst-performing team in a league is likely to do at least slightly better next season: its dismal record almost certainly reflects bad luck as well as poor play, and luck that bad is unlikely to strike twice.
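Both effects are easy to demonstrate in simulation. In the sketch below – a hypothetical setup in which twenty equally skilled teams play short seasons of coin-flip games – the "worst" team improves the following season almost every time, purely through reversion to the mean:

```python
import random

random.seed(0)  # fixed seed for reproducibility

N_TEAMS, GAMES, RUNS = 20, 10, 10_000

def season():
    """Win totals for N_TEAMS equally skilled teams playing GAMES coin-flip games."""
    return [sum(random.random() < 0.5 for _ in range(GAMES))
            for _ in range(N_TEAMS)]

improved = 0
for _ in range(RUNS):
    year_one = season()
    worst = min(range(N_TEAMS), key=year_one.__getitem__)  # bottom of the league
    year_two = season()
    if year_two[worst] > year_one[worst]:
        improved += 1

# The 'worst' team's record mostly reflects bad luck in a tiny sample,
# so it usually improves next season even though no team has any real edge.
print(f"Worst team improved the following season in {improved / RUNS:.0%} of runs")
```

Because each season here is only ten games, the bottom of the table is dominated by chance rather than skill – the law of small numbers and reversion to the mean working together.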

Building on this, we’ll look at some of the general tendencies that underpin these and other analytical distortions. In particular, we’ll look at fundamental attribution errors – in which people seek simple, single causes or individuals to blame for events, while failing to give sufficient weight to context – and various forms of outcome and survivorship bias, in which failures and ‘alternative histories’ that might have plausibly happened are ignored, creating a false degree of confidence in the significance of how things happened to work out on one occasion.
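Survivorship bias, too, can be simulated directly. In the sketch below (the fund counts, return distribution, and closure threshold are all illustrative assumptions), funds with no skill whatsoever appear to deliver positive returns once the failures have quietly dropped out of the sample:

```python
import random
from statistics import mean

random.seed(1)  # fixed seed for reproducibility

N_FUNDS, YEARS = 1_000, 10

all_returns, survivor_returns = [], []
for _ in range(N_FUNDS):
    value, returns, closed = 1.0, [], False
    for _ in range(YEARS):
        r = random.gauss(0.0, 0.2)  # zero-mean annual return: no skill anywhere
        returns.append(r)
        value *= 1 + r
        if value < 0.5:             # the fund quietly closes after heavy losses
            closed = True
            break
    all_returns.extend(returns)
    if not closed:
        survivor_returns.extend(returns)

print(f"Average annual return, every fund:     {mean(all_returns):+.1%}")
print(f"Average annual return, survivors only: {mean(survivor_returns):+.1%}")
```

Judging performance from the survivors alone ignores the 'alternative histories' of the funds that failed – exactly the distortion described above.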

This connects to another, broader point about hindsight bias and the dangers of overestimating the regularity and predictability of events – and thus of failing to make allowances for the degree of uncertainty and variability that many things show over time. In all of these cases, we’ll look at what it means to guard against this kind of flawed analysis – and what it might mean to put at the heart of your work a greater comfort with discomfort, an awareness of complexity, and an attentiveness to the broader contexts and longer timescales that can reveal the limitations of shorter-term frames of reference.
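A small simulation can likewise illustrate the value of longer timescales. In the sketch below (the shock frequency and window length are illustrative assumptions), a series with rare large shocks looks deceptively calm when judged from most short windows of recent data:

```python
import random
from statistics import stdev

random.seed(7)  # fixed seed for reproducibility

def daily_change():
    """Mostly small moves, with rare large shocks (a crude heavy-tailed mix)."""
    return random.gauss(0, 10) if random.random() < 0.01 else random.gauss(0, 1)

history = [daily_change() for _ in range(2_000)]
full_volatility = stdev(history)

# How often does a short 60-day window make the world look calmer
# than the full record shows?
windows = [history[i:i + 60] for i in range(len(history) - 60)]
calm = sum(stdev(w) < full_volatility for w in windows) / len(windows)

print(f"Volatility over the full history: {full_volatility:.2f}")
print(f"Share of 60-day windows that look calmer than that: {calm:.0%}")
```

Most short windows happen to miss the rare shocks entirely, which is why narrow frames of reference tend to understate how much variability the world really contains.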

Suggested Readings

  • Nassim Nicholas Taleb’s Fooled by Randomness (Random House, 2001) remains a classic, brilliant guide to our self-deceptions – and how to overcome them.
  • As the name suggests, the Overcoming Bias blog is a fascinating free resource devoted to pushing back against all kinds of bias: https://www.overcomingbias.com/
  • Tim Harford’s How to Make the World Add Up (Bridge Street Press, 2021) offers a wise and highly entertaining guide to not being fooled by statistics.
  • Julia Galef’s podcast Rationally Speaking engagingly explores what it has meant for different thinkers to grapple with their own uncertainties and limitations: http://rationallyspeakingpodcast.org/
  • My own 2021 Guide to Clear Thinking for Psyche magazine sets out a five-part process for surfacing under-considered assumptions: https://psyche.co/guides/how-to-think-clearly-to-improve-understanding-and-communication