12 August 2023

Scientific thinking as the antidote to intuition

As I work on a book that will claim that evolution is easy, I have a parallel task of exploring the reasons we sense that it is hard or even impossible. Some of those influences are the result of efforts by religions to maintain dependence on supernaturalism or to defend ancient sacred writings. Some are the result of antipathy to science itself, framed in terms of culture war. But others are less clearly related—at least directly—to religions or tribes. Our brains are wondrous indeed but are known to be prone to various kinds of error. To be brutally frank: there are things that can seem obvious to us but that are false.

Daniel Kahneman's 2011 book Thinking, Fast and Slow was a life-changer for me. As soon as I read it in 2013, I urged colleagues to do the same, even convening a book club at work. (The job of a journal editor is fundamentally about making decisions and judgments, and that's what the book is about.) One of the key messages of the book is that our fast thinking system (Kahneman calls it System 1) is both speedy and essential for survival. It's not about reflexes—it's still a kind of thought. But it's quick and dirty, often making guesses or approximations, and is prone to error. "Intuition" is a function of System 1.

Now let's back up, because it's not adequate to merely say that System 1 is prone to error. You might get the impression that it simply has some overall error rate (which it does), so that it sometimes misses threats or misidentifies things. But the problem is bigger and more dangerous than that. The system is prone to specific kinds of error: systematic mistakes that are built into how it works in the first place. Thinking, Fast and Slow explores these system-wide problems, and I found them very sobering. Look up "framing" for an interesting example: people judge a treatment described as having a "90% survival rate" more favorably than one described as having "10% mortality," even though the two descriptions are identical.

Now consider "intuition" in the context of scientific questions. We know that intuition renders quick judgments, and we also know it is vulnerable to repeated and systematic error. We can run experiments on ourselves to see this. I suggest this one: revisit the time when you were learning that the Earth is a ball. I can remember this pretty clearly. What I remember was a struggle between "knowing" that it must be true that the Earth is a spinning ball, and simultaneously "knowing" that it was flat and didn't feel like it was moving at all. I "knew" that the people on the other side of the ball didn't fall off or feel like they were upside down, but I also knew what it was like to be upside down and my brain had no system to jump in and help me understand how people on the opposite side of the Earth didn't know they were upside down. To this day, I find it unintuitive to consider trees and water and people on the opposite side of our terrestrial ball. My intuition is a permanent hindrance.

What can we do about this? It's an urgent question in a world awash in misinformation, much of it crafted to exploit the intuitions and cognitive biases that characterize System 1. I suggest that science is the answer. That sounds trite, but hear me out.

Here's my claim: thinking scientifically is a discipline that seeks to intentionally negate intuition.

Some things I'm not claiming:

  • That science and intuition always disagree;
  • That intuition never tracks with scientific reasoning;
  • That scientific thinking always works;
  • That scientific thinking is always the best approach to every problem.

But I think that when we consider the depth and reach of science denialism and its roots in human cognitive structures and habits, one major influence we confront is something that is often explicitly described as "intuition." It travels under other names, most notably "common sense" and often "instinct." We see it when we hear earnest talk about "other ways of knowing," and we are in its mystical presence when someone hears from gods or spirits.

But intuition is a crappy tool for understanding the world. System 1 wasn't built for that purpose. It was built for speed. Intuition can't understand a terrestrial ball. It doesn't expect a caterpillar to turn into a butterfly. It can't help you understand how ice can fall from the sky an hour after the temperature peaked (in Tucson) at 105 F (41 C). It has never seen continents move.

In parallel, Kahneman explains the second system, System 2:

System 2 allocates attention to the effortful mental activities that demand it, including complex computations. The operations of System 2 are often associated with the subjective experience of agency, choice, and concentration.

Thinking, Fast and Slow, p. 21

System 2 kicks in when it's time to stop and think. When it's time to deliberate. But something has to throw the switch. We have to realize that it's time to stop and think. We have to recognize a problem or situation that System 1 can't resolve. More crucially, we sometimes have to exert effort to switch to System 2. The prompts might come from someone else ("Wait, let's think about this") or from our hard-earned wisdom that installs a tracker looking for the kinds of hasty judgments and errors that System 1 cranks out all day long.

And that's where I think scientific thinking, as a discipline, often enters the chat. Scientific thinking is the work of System 2. It is systematic, deliberative, and intolerant of bullshit. (I'm using this term in its technical sense, à la Harry Frankfurt.) At its best, it is intentionally and explicitly opposed to intuition. In scientific thought, intuition is barely better than background noise. It can generate hypotheses galore, and it does this regularly in science, but those hypotheses are just fodder for the deliberative work of System 2. My favorite coffee mug says "Science Doesn't Give a F*** What You Believe," and the message is not about faith or believers but about the beliefs and intuitions that scientific thought seeks to ignore.

Scientific thinking feels like that switch to me. I can feel the move in my mind from the constant, important work of System 1 to the deliberative work of System 2. Knowing how often my intuition has hindered my understanding of the world, I welcome the feeling of going to the place where the adult is in charge. Sometimes my intuitions are vindicated. Sometimes they're just irrelevant chaff. Sometimes they are exposed as errors: slapdash guesses, biases, outright prejudice.

Evolution is a great place to observe the folly of human intuition. The next post will use Dan Dennett's 2009 piece Darwin's “strange inversion of reasoning” to explore how evolution challenges some big ancient human intuitions about our world.
