Behavioral New World
April 1, 2023
On Christmas Eve 1954, a group called The Seekers gathered in Oak Park, Illinois to sing Christmas carols. Nothing unusual about that, except that they had not gathered for holiday cheer. Instead, they were expecting spaceships full of aliens called “The Guardians” to swoop down out of the skies and whisk them away to safety from the catastrophic global flood that was to follow.
It will not surprise you that they weren’t in fact whisked away (and the flood didn’t happen either). But what followed might surprise you: Rather than reconsider whether such an event was likely or even possible, most members of the group doubled down. That is, they increased their psychological commitment to the notion that the Guardians would still arrive and retrieve them, rather than decrease or abandon that belief. One explanation they latched onto: The alien no-show was a test of the Seekers’ strength of belief, “just a drill,” as one of them put it.[1]
This phenomenon is called “escalation of commitment” or “commitment bias.” It is usually discussed in terms of its negative consequences. And if we consider our own lives and those around us, it’s not hard to quickly run into instances of this psychological doubling down. A small-scale example might be the escalation of commitment to a personal relationship that is not working and almost certainly never will work (maybe that’s not so small?).
In the realm of finance, one can easily imagine escalation of commitment to the stocks of companies that are fatally flawed and only become more fatally so over time. Or a company escalating its commitment to a business line that objectively seems doomed to fail.
(Historical examples of escalation of commitment dating back as far as 1533 can be found in Chapter 1 of the book When Prophecy Fails by Leon Festinger, Henry W. Riecken, and Stanley Schachter, University of Minnesota Press, 1956. Recent research has used the escalation framework to investigate the behavior of QAnon members.[2])
There are several ways, not mutually exclusive, to think about this phenomenon. The Hidden Brain podcast episode “When You Need It To Be True”[3] describes it in terms of cognitive dissonance, the tension felt when someone holds two opposing ideas in their head. To resolve the tension, one idea must be discredited so that it can be discarded, ignored, or downplayed.
If you’re a disappointed Seeker, then why not choose the obvious idea? That is, why not conclude that the leader of the Seekers is a charlatan and get on with your life? One answer: It makes us uncomfortable to admit we’re wrong. You can think of this as a type of psychological loss aversion (see my July 2020 and October 2022 newsletters)—finding out that we’re wrong can hurt more than the satisfaction of finding out that we’re right.
Another explanation relates to the “sunk cost fallacy” (April 2022 newsletter): People often give weight to costs that have already been incurred and cannot be recovered. These sunk costs are not just financial; they are psychological as well. For example, one of the Seekers sold his property to pay his debts, spent his Thanksgiving break winding up his affairs, and said goodbye to his parents and friends ahead of The Guardians’ impending arrival. Can you imagine his embarrassment in front of friends, family, and the world if he had to admit he was completely wrong? As they say at the poker table, he was “all in.” But sunk costs are irrelevant to current decision-making; the dangerous fallacy is giving them any weight at all.
Here’s another description of the phenomenon:
Once the mind has accepted a plausible explanation for something, it becomes a framework for all the information that is perceived after it. We’re drawn, subconsciously, to fit and contort all the subsequent knowledge we receive into our framework, whether it fits or not. Psychologists call this “cognitive rigidity”. The facts that built an original premise are gone, but the conclusion remains—the general feeling of our opinion floats over the collapsed foundation that established it.[4]
This narrative describes “confirmation bias” (May 2020 and May 2022 newsletters)—our all-too-human tendency to close our minds to information that conflicts with what we already believe. Such information makes us uncomfortable, creates cognitive dissonance, and so we deliberately downplay the contrary information.
What can you as an individual do to resist escalation of commitment? The first step, as always, is an awareness that you might be vulnerable to it. Which you are, by the way. Second, open-minded periodic introspection never hurts (easier said than done).
Third, having a trusted friend or two can help, as long as they feel they can give you “bad news.” Friend: “You’re going to be picked up by some aliens? Really? That’s nuts!” Now that’s a true friend. Some people even formalize this idea with a Personal Board of Directors, whom they consult regularly.
Organizations can combat escalation of commitment by using a “devil’s advocate,” someone whose job it is to argue against the prevailing thinking—see Chapter 5 in my book The Foolish Corner (Amazon).
Now you’ve got material for the next cocktail party. Unless of course you grab a ride with The Guardians first.
[1] I’ve simplified the story—there were several “disconfirmation” events prior to December 25, that is, predictions—not borne out—that the Guardians would rescue the believers.
[2] Vyse, Stuart (2021-02-15). "When QAnon Prophecy Fails". Skeptical Inquirer. Committee for Skeptical Inquiry. 45 (3): 21–27.
[3] https://hiddenbrain.org/podcast/when-you-need-it-to-be-true/. Highly recommended because it includes a case of escalation of commitment by an individual as well as a group.
[4] Ryan Holiday, Trust Me, I’m Lying: Confessions of a Media Manipulator, 2012, p. 184.