Critical Thinking Checklist

This cheat sheet is condensed from "How Thinking Goes Wrong: Twenty-five Fallacies That Lead Us to Believe Weird Things," chapter 3 of Michael Shermer's excellent 1997 book, "Why People Believe Weird Things: Pseudoscience, Superstition, and Other Confusions of Our Time."

It's a quick reference to consult whenever someone insists that they've achieved fusion by squeezing an ice cube, or that filling your pillow with daffodils will ward off evil spirits. When presented with such an extraordinary claim, it's best to start by remembering the following...

Hume's maxim

When someone insists that something miraculous is true, ask yourself whether it's more likely that the miracle has happened, or that the person has been deceived, is mistaken, or is lying. Go with the more probable explanation.
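
One way to make the maxim concrete is the standard Bayesian reading (a gloss of mine, not notation from Shermer's chapter): believe the testimony only if it would be even harder to explain as deception, mistake, or lying than as a report of a real miracle.

```latex
% Bayesian gloss on Hume's maxim (illustrative notation, not Shermer's).
% M = "the miracle occurred"; T = "the testimony you were given".
\[
  P(M \mid T) > P(\neg M \mid T)
  \iff
  P(T \mid M)\,P(M) > P(T \mid \neg M)\,P(\neg M)
\]
% Since the prior P(M) of a genuine miracle is minuscule, the testimony
% must be far more probable under the miracle than under deception,
% mistake, or lying before belief in M is warranted.
```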

To help you evaluate the likelihood that the extraordinary claim might be true, run it through the following checklist of possible errors in thinking.

Problems in scientific thinking

  1. Theory influences observation – Werner Heisenberg: "What we observe is not nature itself, but nature exposed to our method of questioning." Columbus was certain he'd found Asian plants in the West Indies because he believed he'd arrived in Asia.
  2. The observer changes the observed – people often behave differently when they know they're being studied, which is why you need double-blind studies.
  3. Equipment constructs results – you can't conclude that the sea has no fish shorter than 2 inches when you are using a net with 2-inch-wide holes.

Problems in pseudoscientific thinking

  1. Anecdotes do not make a science – you need studies with statistically significant numbers of observations, controls, and confirmation by other researchers.
  2. Scientific language does not make a science – talk about vibrations and energy levels doesn't mean anything unless the terms are precisely defined and represent measurable quantities.
  3. Bold statements do not make claims true – claiming that you've achieved cold fusion or can predict the future doesn't make it so – confirmation requires solid evidence.
  4. Heresy does not equal correctness – history forgets most of the scientists with outlandish claims because those claims turned out to be false; the rare instances that were correct are remembered, and they give rise to this misperception.
  5. Burden of proof falls on the extraordinary claim – proponents of radical new hypotheses must prove that they are right, not demand that the establishment prove that they are wrong.
  6. Rumors do not equal reality – real evidence must be documented and independently verifiable. "I read somewhere that..." is not a reliable source.
  7. Unexplained is not inexplicable – just because you can't figure something out doesn't mean that it can't be explained. It's more reasonable to say "I don't know" and leave it at that than to begin hypothesizing unknown forces or supernatural intervention.
  8. Failures should not be rationalized away – negative results are important because that's how we correct scientific errors. Ignoring signals that your hypothesis is wrong removes the error-correction features of science and ensures that your results will confirm the hypothesis, no matter how wrong it is.
  9. After-the-fact reasoning – correlation does not imply causation. If the president scratches his nose and the stock market falls, that does not mean that presidential nose-scratching determines market results.
  10. Coincidence happens – probability inevitably leads to occasional unlikely events. You remember the rare time when you think of a friend just before they call you, and forget all the times you've thought of them and they didn't call (see the sketch after this list).
  11. Representativeness – keep the larger context in mind; don't remember only the hits while ignoring the misses. The Bermuda Triangle actually has a lower accident rate than surrounding waters once you account for the far heavier shipping traffic that passes through it.
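
Items 10 and 11 are easy to check numerically. Here's a minimal Monte Carlo sketch (the probabilities are invented for illustration, not data) showing how often two completely independent events coincide, and how lopsided the hits-to-misses ratio is once you count both:

```python
# Simulate "I thought of my friend and then they called!" as two
# independent daily events, then count hits and forgotten misses.
import random

random.seed(42)
DAYS = 10_000        # days simulated
P_THINK = 0.05       # chance you think of a friend on a given day
P_CALL = 0.01        # chance that friend happens to call that day

hits = misses = 0
for _ in range(DAYS):
    thought_of_friend = random.random() < P_THINK
    friend_called = random.random() < P_CALL  # independent of your thoughts
    if thought_of_friend and friend_called:
        hits += 1     # the "spooky" coincidence you remember
    elif thought_of_friend:
        misses += 1   # the thought that led nowhere, soon forgotten

print(f"coincidences: {hits}, forgotten non-events: {misses}")
# Expect about DAYS * P_THINK * P_CALL = 5 coincidences by chance alone,
# against roughly 495 forgotten misses. Counting only the hits is exactly
# the base-rate mistake behind the Bermuda Triangle legend.
```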

Logical problems in thinking

  1. Emotive words and false analogies – charged language, analogies, and metaphors do not constitute proof.
  2. Ad ignorantiam – inability to disprove a claim does not mean that it's true.
  3. Ad hominem and tu quoque – unpleasant people can still present a legitimate argument.
  4. Hasty generalization – a couple of observations does not establish a trend.
  5. Overreliance on authorities – experts are not always right, and amateurs are not always wrong.
  6. Either-or – distrust any argument that insists the answer must be either a or b – ask yourself if there might be a "neither" or "other" possibility.
  7. Circular reasoning – x is true because of y, y is true because of z, and z is true because of x.
  8. Reductio ad absurdum and the slippery slope – refuting an argument by carrying it to an extreme and absurd logical end, or constructing a scenario where one thing leads to another, with an end so extreme that it seems the first step should not be taken.

Psychological problems in thinking

  1. Effort inadequacies and the need for certainty, control, and simplicity – beware of neat, simple explanations, because things are usually complicated, messy, and uncertain.
  2. Problem-solving inadequacies – as when people:
    • Immediately form a hypothesis and look only for examples and evidence to confirm it
    • Don't change the hypothesis even when it is obviously wrong
    • Ignore complexity and form simplistic hypotheses or strategies
    • Form hypotheses about coincidental relationships between random observations
  3. Ideological immunity, or the Planck problem – well-educated people with well-founded theories can be very resistant to new ideas.