With the passing of Daniel Kahneman on March 27, the world lost a distinguished intellect. A Nobel prizewinner, he had an enduring and profound impact on economics and the social sciences more broadly. Case in point: his ideas underlie and prompt much of the advice behavioral scientists give meteorologists seeking to improve societal response to weather warnings. For decades I’ve been “in the room,” in a variety of settings, as sociologists, psychologists, and risk communicators have patiently explained to meteorologists the rudiments of anchoring, cognitive bias, framing, heuristics, optimism bias, and much, much more. Meteorologists have been a tough sell, but over time our community has come to acknowledge that communication of weather risk must take human behavior into account every bit as much as the actual or forecast weather itself – that is, if we truly wish to achieve the desired individual and societal uptake and outcomes: reduced fatalities, injuries, property loss, business disruption, and all the rest.
Kahneman was an extraordinary scholar and a prolific writer, but in that constellation of work one star stands out: Thinking, Fast and Slow. An excerpt from the Wikipedia article:
Thinking, Fast and Slow is a 2011 popular science book by psychologist Daniel Kahneman. The book’s main thesis is a differentiation between two modes of thought: “System 1” is fast, instinctive and emotional; “System 2” is slower, more deliberative, and more logical.
The book delineates rational and non-rational motivations or triggers associated with each type of thinking process, and how they complement each other, starting with Kahneman’s own research on loss aversion. From framing choices to people’s tendency to replace a difficult question with one which is easy to answer, the book summarizes several decades of research to suggest that people have too much confidence in human judgment.
The book, a 400+ page tome, sold over one million copies. It consists of several sections. To quote Wikipedia further:
In the book’s first section, Kahneman describes two different ways the brain forms thoughts.
…The second section offers explanations for why humans struggle to think statistically. It begins by documenting a variety of situations in which we either arrive at binary decisions or fail to associate precisely reasonable probabilities with outcomes. Kahneman explains this phenomenon using the theory of heuristics. Kahneman and Tversky originally discussed this topic in their 1974 article titled Judgment Under Uncertainty: Heuristics and Biases.
(examples of each follow)
Wikipedia goes on:
…Kahneman describes a number of experiments which purport to examine the differences between these two thought systems and how they arrive at different results even given the same inputs. Terms and concepts include coherence, attention, laziness, association, jumping to conclusions, WYSIATI (What you see is all there is), and how one forms judgments. The System 1 vs. System 2 debate includes the reasoning or lack thereof for human decision making, with big implications for many areas including law and market research.
The second section deals with heuristics and biases. Wikipedia notes:
Kahneman uses heuristics to assert that System 1 thinking involves associating new information with existing patterns, or thoughts, rather than creating new patterns for each new experience. [emphasis added; see below] For example, a child who has only seen shapes with straight edges might perceive an octagon when first viewing a circle. As a legal metaphor, a judge limited to heuristic thinking would only be able to think of similar historical cases when presented with a new dispute, rather than considering the unique aspects of that case. In addition to offering an explanation for the statistical problem, the theory also offers an explanation for human biases.
The shortcomings of human thought? These include: Anchoring. Availability. Conjunction fallacy. Optimism and loss aversion. Framing. Sunk cost. In the book, Kahneman discusses each at length. The result is an impressive indictment of human ability to think rationally.
Being merely human myself, in the face of this intimidating list of rational shortcomings I want to fall back on System 1 thinking and (as noted above) associate this new information with existing patterns, or thoughts, rather than create new patterns for each new experience.
Aha! I do indeed have such a pre-existing thought/memory to lean on.
It’s the 1960s. I’m an undergraduate physics major at Swarthmore College, and I subscribe to the American Journal of Physics. In my August 1963 issue I encounter an article entitled Batting the Ball, by Paul Kirkpatrick. The abstract reads:
The velocity vector of a ball struck by a bat is a stated function of the ball and bat velocities, bat orientation, and certain constants. In the light of the equations of the collision, the operation and the consequences of swinging the bat are analyzed, and the role of the constants is discussed.
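Kirkpatrick’s full equations aren’t reproduced here, but the flavor of the physics can be sketched with the standard one-dimensional collision model found in introductory mechanics: conservation of momentum plus a coefficient of restitution. The parameter values below (ball mass, an assumed effective bat mass at the impact point, and an assumed restitution coefficient) are illustrative textbook-style numbers, not Kirkpatrick’s.

```python
def ball_exit_velocity(v_ball, v_bat, m_ball=0.145, m_bat_eff=0.90, e=0.50):
    """Post-impact ball velocity for a 1-D bat-ball collision.

    From conservation of momentum and the coefficient of restitution e:
        m*v1' + M*v2' = m*v1 + M*v2
        v1' - v2'     = -e * (v1 - v2)
    solving for the ball's velocity v1' after impact.
    Masses in kg, velocities in m/s (sign gives direction).
    """
    return (m_ball * v_ball + m_bat_eff * v_bat
            - e * m_bat_eff * (v_ball - v_bat)) / (m_ball + m_bat_eff)

# A ~90 mph pitch (about -40 m/s, toward the batter) met by a 30 m/s swing:
exit_v = ball_exit_velocity(v_ball=-40.0, v_bat=30.0)
print(f"{exit_v:.1f} m/s")  # roughly 50 m/s off the bat
```

Even this toy version makes the point of the article – and of my father’s quip: the exit velocity depends sensitively on several quantities no batter consciously computes in the half-second available.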
Utterly fascinating! Physics rules! I eagerly shared these insights with my mathematician/statistician father, who had played quite a bit of baseball in his younger days, and was a lifelong, ardent fan. This would be our chance to bond!
“Dad, look at this article, which lists 90+ parameters and actions a batter has to adjust to hit a baseball!”
My father’s response was swift – and dismissive.
“If I were a pitcher, I’d be sure to send copies of this article to the opposing team.”
Hmm. Not quite the response I had expected. Or hoped for. But so spot on. In the 60 years since, I’ve thought of this incident again and again.
Which brings us back to Daniel Kahneman. His body of work is extraordinarily useful for critiquing the decision-making under uncertainty that confronts all of us daily as we take in weather information, especially when that information matters most. But what about the implications of this work for anyone contemplating writing a research paper on any subject, or a thought piece of any type – whether an article or book, a blogpost or podcast? Facing so many pitfalls of thinking to avoid, how do we actually move forward? What misplaced pride possibly allows us to imagine we’ll be making a positive contribution, instead of adding to the world’s already stupefying stock of faulty thinking and misinformation?
Whew! What’s a thinker to do? I’ll have more in the next LOTRW post, but in the meantime,
Your (fast and slow) thoughts, please.
You can either be a centipede unable to start because you can’t figure out which foot to move first, or a bee going from flower to flower, trailing pollen behind. As Bob Dylan said, “Don’t think twice, it’s all right.”