As someone who has been teaching argumentation for a long time, I’ve found a lot of the ways people approach and think about argument puzzling. One of them is the tendency to radicalize the opposition’s argument: taking an argument that has hedging and modifiers (often, sometimes, rarely, frequently, occasionally, infrequently, tends) and recharacterizing it as an extreme claim (“sometimes” becomes “always” and “infrequently” becomes “never”). So, if Chester claims, “The squirrels tend to try to get the red ball when it’s easy,” Hubert says, “Chester believes that the squirrels never do anything but try to get to the red ball.”
Notice two things about that recharacterization: Hubert has framed the issue as a question of Chester’s beliefs, not his argument, and he’s radicalized Chester’s argument.
At first, I thought it was because I was a grad student teaching in the Rhetoric Department at Berkeley. That department attracted a lot of aspiring lawyers, and many (most?) of them had had debate experience. I thought students so often radicalized opposition arguments because radicalizing your opponent’s argument is debate-weeny move 101 (and one any good judge or opposition team would catch).
But then I moved to colleges where debate training was rare, and I noticed that the shift from a modified claim to an extreme one was still common. I caught myself doing it (especially when angry or frightened), and I caught colleagues doing it (colleagues in rhetoric, who should know better), along with pundits, editorial writers, and people complaining about spouses, partners, and room-mates.
Perhaps because of my training, I had always thought of it as a deliberate misrepresentation of the opposition, a conscious use of the straw man fallacy.
But then I ran across relationship advice that said, essentially, if you hear yourself saying (or thinking), “You never…” or “You always…,” you aren’t in the realm of talking to the person in front of you. It’s pretty unlikely that the person in front of you—spouse, partner, room-mate—has literally never done the dishes, or helped around the house, or taken out the trash. They probably washed a glass here and there, or wiped off a spill, or took out one piece of trash. It’s unlikely that they always interrupt you, leave dirty dishes in the sink, or talk on the phone. There are hours in the day when they aren’t, at that moment, interrupting you.
Because those accusations aren’t true, a person who treats relationship arguments in bad faith (they’re just trying to get their way and not solve the problem) can dismiss your claim by pointing out that they once did dishes, or are not, at this moment, interrupting you. A person who treats relationship arguments in good faith has a really hard time figuring out how to respond to such hyperbolic claims. That’s really good relationship advice—listen to yourself when you’ve radicalized their behavior.
It doesn’t work for people who see relationships as zero-sum battles between the two people, and who like it that way; it takes the fun out for them. They like the big blow-up arguments that are all about throwing hyperbolic accusations at one another (and sometimes physical objects) and the makeup sex afterwards. YKINMKBYKIOK
But I found it to be good advice for me—to pay attention to when I was radicalizing someone else’s argument. And then I realized it’s really good advice for policy deliberation. I don’t mean just national politics: I noticed that intra-departmental policy arguments (what should we do about the photocopier?) often triggered all-or-nothing thinking in some people. In faculty meetings, a person would say, “I’m concerned because I think this policy might lead to [this outcome] under these circumstances,” and someone would respond with, “So you’re saying [this outcome] would always happen,” and then launch into a long speech about how silly it was that their opposition would think it would always happen. Smart people, people trained in close reading, radicalized the claims of people with whom they disagreed. And they hadn’t been trained in debate.
And, working individually with students, I found that they could read a nuanced argument with nuance if it was in-group or confirmed their beliefs, but if it disagreed with them, they radicalized it.
The tendency to radicalize the opposition’s arguments, in my experience, is pretty rarely a strategic and conscious rhetorical choice. I’ve come to think it’s entirely sincere. I’m not going to say that “both sides” do it, because I think the whole notion that our nuanced, vexed, and rich array of political options can be reduced to two sides (or a continuum) is not only empirically false, but proto-demagogic.
I will say that many people all over the political spectrum, and in the realm of non-partisan policy issues (such as what policy should we have in our house about doing dishes), radicalize the beliefs of anyone who disagrees with them, and all of us often radicalize the beliefs of people who disagree with us, especially under certain circumstances.
My crank theory is that there are various conditions that make people prone to radicalize the opposition:
1) That’s how some people think. Honestly, this is, I think, the most common explanation. There are people who can’t think in nuanced terms, or understand probability. They think in extreme terms. They’re the kind of people who, if the weather forecasters say, “There is a 90% chance of rain,” and it doesn’t rain, conclude the forecasters were wrong.
Lots of people in our muckled public/private realm engage in hyperbole, and so do these people. If something is bad, it’s the worst thing ever; if something is good, it’s the best thing ever. But these people talk that way because that’s really how they think—their in-group is entirely good, and made up of good people who all agree as to what is good, and anyone who doesn’t agree with them is entirely bad. You are either in-group (double plus good) or out-group (double plus ungood).
They have a lot of trouble admitting that in-group people are deeply flawed or out-group people have any virtues at all. Because they think everyone thinks in such all-or-nothing terms, they project that way of thinking onto everyone else. They read “often” as “always” because that’s what it really means to them.
2) That’s how all of us think in situations when we have been effectively inoculated against the opposition. Inoculation (https://www.patriciarobertsmiller.com//2019/07/28/democracy-and-inoculation/) works by giving us a weak version of “the” opposition argument. It’s generally paired with training us to associate certain terms or positions with the opposition (so, if all feminists are lesbians who want to convert all women to lesbianism, then this woman who says women are unfairly treated must be a feminist lesbian missionary).
3) It’s also common if we’re naïve realists—if we believe that our position is the only possible reasonable position, then we are prone to reframe all opposition arguments as unreasonable ones. That is, we radicalize them.
4) If we believe that there are only two positions on every political issue, then we’re going to throw all unreasonable positions into the “other” side. We all tend to think of our in-group as nuanced, heterogeneous, and diverse, but the out-group as essentially all the same. So, if I believe that vaccines are great, and someone says they’re not wild about the HPV vaccine, I’m likely to assume that they’re opposed to all vaccinations.
I think the unconscious (and sometimes deliberate, when it’s part of inoculation) radicalizing of “the” opposition is one of the major contributors to our culture of demagoguery.
It’s common now to say that we’re in a bad situation because civics is no longer taught, and there certainly seem to be an awful lot of people (all over the political spectrum) who don’t understand some of the basic features of our governmental system, but I think what matters more is that we don’t teach logic.
I don’t mean formal logic, but the very straightforward, and yet very challenging, work of teaching students to recognize various fallacies, like the straw man, not when other people engage in them, but when we do.
Aaron Beck, who founded cognitive therapy, and his school have outlined common cognitive distortions; you may want to look them up. He applies these distortions mainly in a therapeutic setting, but they might apply to rhetoric too.
The example you use is an example of all-or-nothing thinking, emotional reasoning, and discounting the positive.
People have to be motivated to change their distress in order to change their habits of thought.
Beck studied just what you’re studying, but in a different context.