
My grandmother had a “joke” (really more of a parable) about a guy who sees a pie cooling in the window, and steals it. Unfortunately, he leaves a perfect handprint on the sill, so he sneaks into the house to wash off his handprint. But then it’s obvious that the sill has been washed, since it’s so much cleaner than the wall. So he washes that wall. It’s still obvious that something has happened, because that one wall is so much cleaner than the others. By the time the police come, he’s repainting the attic.
You can tell this as a shaggy dog joke, with more of the steps between the sill and the attic. And, in a way, that’s how this situation often plays out, at least in regard to bad choices. Rather than admit the initial mistake, we get deeper and deeper into a situation; the more energy we expend to deflect the consequences of that first mistake, the more committed we are to making that expenditure worthwhile. So we’re now in the realm of the “sunk cost” fallacy (or cognitive bias). Making decisions on the basis of trying to retrieve sunk costs (also known as “throwing good money after bad”) enables us to deny that we made a bad decision.
In the wonderful book Mistakes Were Made (But Not by Me), Carol Tavris and Elliot Aronson call this process “the pyramid of choice.” It’s usefully summarized here:
“The Analogy of the Pyramid (Tavris and Aronson, 2015). An initial choice – which is often triggered by the first “taking the temperature” vote – amounts to a step off on one side of the pyramid. This first decision sets in motion a cycle of self-justification which leads to further action (e.g., taking a public stance during the group discussion) and further self-justification. The deeper down participants go, the more they can become convinced and the more the need arises to convince others of the correctness of their position.”
The example used by Tavris and Aronson is of two students who are faced with the choice of cheating or getting a bad grade on an exam. They are, initially, both in the same situation. One decides to cheat, and one decides to take the bad grade. But, after some time, each will not only find ways of justifying their decision but will also be “convinced that they have always felt that way” (33).
In the equally wonderful book Denial, Jared Del Rosso describes a similar process for habituating a person to behaviors they would previously have condemned (such as engaging in torture). A prison guard or police officer is first invited to do something a little bit wrong; that small bad act is normalized, and, once they’ve done it, it becomes easier to get them to do something a little worse (Chapter 4). Christopher Browning describes a similar dynamic among the ordinary German reserve policemen who participated in genocide; Hannah Arendt makes that argument about Adolf Eichmann; Robert Gellately makes it about Germans’ support for Hitler.
It’s like an upside-down pyramid: the one little bad act enables and requires more and worse ones, since refusing to continue doing harm would require admitting to oneself and others that the first act was bad. It means saying, “I did this bad (or stupid) thing,” and that’s hard. It’s particularly hard for people who equate identity and action, and who believe that only bad people do bad things and only stupid people do stupid things; that is, people who believe in a stark binary of identity.
This way of thinking also causes people to “double down” on mistakes. In late 1942, about 250,000 Nazi soldiers in and around Stalingrad were in danger of being encircled by Soviet troops. Hitler refused to allow a retreat, opting instead for Goering’s plan of airlifting in supplies. David Glantz and Jonathan House argue that Hitler was “trapped” by his previous decisions: to acknowledge the implausibility of Goering’s proposal (and it was extremely implausible) would amount to Hitler admitting that various decisions he had made were wrong, and that his generals had been right. Glantz and House don’t mean he was actually trapped; other decisions could have been made, but not by Hitler. He was trapped by his own inability to admit that he had been wrong. Rather than do so, he proceeded to make even worse decisions. That’s the pyramid of harm.
The more walls the thief washes, the harder it is to say that the theft of the pie was a one-time mistake.