Sometime in the 1980s, my father said that he had always been opposed to the Vietnam War. My brother asked, appropriately enough, “Then who the hell was that man in our house in the 60s?”
That story is a little gem of how persuasion happens, and how people deny it.
I have a friend who was raised in a fundagelical world, who has changed zir mind on the question of religion, and who cites various studies to say that people aren’t persuaded by studies. That’s interesting.
For reasons I can’t explain, far too much research about persuasion involves giving people who are strongly committed to a point of view new information and then concluding that they’re idiots for not changing their minds. They would be idiots for changing their mind because they’re given new information while in a lab. They would be idiots for changing their mind because they get one source that tells them that they’re wrong.
We change our minds, but, at least on big issues, it happens slowly, due to a lot of factors, and we often don’t notice because we forget what we once believed.
Many years ago, I started asking students about times they had changed their minds. Slightly fewer many years ago, I stopped asking because I got the same answers over and over. And what my students told me was much like what books such as Leaving the Fold (books by and about people who have left cults or changed their minds about Hell or creationism) and various friends said. They rarely described an instance when they changed their mind on an important issue because they were given one fact or one argument. Often, they dug in under those circumstances—temporarily.
But we do change our minds, and there are lots of ways that happens, and the best of them are about a long, slow process of recognition that a belief is unsustainable.[1] Rob Schenck’s Costly Grace reads much like memoirs of people who left cults, or who changed their minds about evolution or Hell. They heard the counterarguments for years, and dismissed them for years, but, at some point, maintaining faith in creationism, the cult, the leader of the cult, just took too much work.
But why that moment? I think that people change their minds in different ways partially because our commitments come from different passions.
In another post I wrote about how some people are Followers. They want to be part of a group that is winning all the time (or, paradoxically, that is victimized). They will stop being part of that group when it fails to satisfy that need for totalized belonging, or when they can no longer maintain the narrative that their group is pounding on Goliath. At that point, they’ll suddenly forget that they were ever part of the group (or claim that, in their hearts, they always dissented, something Arendt noted about many Germans after Hitler was defeated).
Some people are passionate about their ideology, and are relentless at proving everyone else wrong by showing, deductively, that those people are wrong. They do so by arguing from their own premises and then cherry-picking data to support that ideology. They deflect (generally through various attempts at stasis shift) if you point out that their beliefs are non-falsifiable. These are the people that Philip Tetlock described as hedgehogs. Not only are hedgehogs wrong a lot—they don’t do better than a monkey throwing darts—but they don’t remember being wrong because they misremember their original predictions. The consequence is that they can’t learn from their mistakes.
Some people have created a career or public identity around advocating for a particular faction, ideology, or product, and are passionate about defending every step into charlatanism they take in the course of defending that cult, faction, or ideology. Interestingly enough, it’s often these people who do end up changing their minds, and what they describe is a kind of “straw that breaks the camel’s back” situation. People who leave cults often describe a sudden moment when they say, “I just can’t do this.” And then they see all the things that led up to that moment. A collection of memoirs of people who abandoned creationism has several that specifically mention discovering the large overlap in DNA between humans and primates as the data that pushed them over the edge. But, again, that data was the final push–it wasn’t the only one.
Some people are passionate about politics, and about various political goals (theocracy, democratic socialism, libertarianism, neoliberalism, anarchy, third-way neoliberalism, originalism) and are willing to compromise to achieve the goals of their political ideology. In my experience, people like this are relatively open to new information about means, and so they look as though they’re much more open to persuasion, but even they won’t abandon a long-time commitment because of one argument or one piece of data—they too shift position only after a lot of data.
At this point, I think that supporting Trump is in the first and third category. There is plenty of evidence that he is mentally unstable, thin-skinned, corrupt, unethical, vindictive, racist, authoritarian, dishonest, and even dangerous. There really isn’t a deductive argument to make for him, since he doesn’t have a consistent commitment to (or expression of) any economic, political, or judicial theory, and he certainly doesn’t have a principled commitment to any particular religious view. It’s all about what helps him in the moment, in terms of his ego and wealth. That’s why defenders of his keep getting their defenses entangled, and end up engaging in kettle logic. (I never borrowed your kettle, it had a hole in it when I borrowed it, and it was fine when I returned it.)
The consequence of Trump’s pure narcissism (and mental instability) and lack of principled commitment to any consistent ideology is that Trump regularly contradicts himself (as well as the talking points his supporters have been loyally repeating), abandons policies they’ve been passionately advocating on his behalf, and leaves them defending statements that are nearly indefensible. What a lot of Trump critics might not realize is that Trump keeps leaving his loyal supporters looking stupid, fanatical, gullible, or some combination of all three. He isn’t even giving them good talking points, and many of the defenses and deflections are embarrassing.
For a long time, I was hesitant to shame them, since an important part of the pro-GOP rhetoric is that “libruls” look down on regular people like them. I was worried that expressing contempt for the embarrassingly bad (internally contradictory, incoherent, counterfactual, revisionist) talking points would reinforce that talking point. And I think that’s a judgment that people have to make on an individual basis, to the extent that they are talking about Trump with people they know well—should they avoid coming across as contemptuous?
But for strangers, I think that shaming can work because it brings to the forefront that Trump is setting his followers up to be embarrassed. That means he is, if not actually failing, at least not fully succeeding at what a leader is supposed to do for his followers. The whole point in being a loyal follower is that the leader rewards that loyalty. The follower gets honor and success by proxy, by being a member of a group that is crushing it. That success by proxy comes from Trump’s continual success, his stigginit to the libs, and his giving them rhetorical tactics that will make “libs” look dumb. Instead, he’s making them look dumb. So, pointing out that their loyal repetition of pro-Trump talking points is making them look foolish is putting more straw on that camel’s back.
Supporting Trump, I’m saying, is at this point largely a question of loyalty. Pointing out that their loyalty is neither returned nor rewarded is the strategy that I think will eventually work. But it will take a lot of repetition.
[1] Conversions to cults, otoh, involve a sudden embrace of this cult’s narrative, one that erases all ambiguity and uncertainty.
Trump and the long con
One of the paradoxes of con artists is that cons always depend on appealing to the mark’s desire for a quick and easy solution, but the most profitable cons last a long time. How do you keep people engaged in the scam if you’re siphoning off their money?
There are several ways, but one of the most common is to ensure that they’re getting a quick outcome that they like. They’ll often wine and dine their marks, thereby coming across as too successful to need the mark’s money, and also increasing the mark’s confidence (and attachment). They might be supporting that high living through bad checks, but more often with credit cards and money from previous marks, or by getting the mark to pay for the high living without knowing. One serial confidence artist who specialized in picking up divorced middle-aged women on the Internet was particularly adept at stealing a rarely-used credit card from the women while they were showering. He then simply hid the bills when they arrived.
Because he seemed to have so much money, the women assumed he wouldn’t be scamming them, and would then hand over their life’s savings for him to invest.
They do this despite there being all sorts of good signs that the guy is a con artist–his life story seems a little odd, he doesn’t seem to have a lot of friends who’ve known him very long, there’s always some reason he can’t write checks (or own a home or sign a loan). There are three reasons that the con works, and that people ignore the counter-evidence.
First, cons flatter their marks, arguing that the marks deserve so much more than they’re getting, and persuade the marks to have confidence in them. They will tell the marks that those people (the ones who are pointing to the disconfirming data) look down on them, think they’re stupid, and think they know better. The con thereby gets the mark’s ego associated with his being a good person and not a con artist—admitting that he is a con means the mark will have to admit that those people were right.[1] The con artist will spin the evidence in ways that show he’s willing to admit to some minor flaws, ones that make the mark feel that she can really see through him. She knows him.[2]
Second, the con works because we don’t like ambiguity, and we tend to privilege direct experience and our own perception. The reasons to wonder about whether a man really is that wealthy are ambiguous, and it’s second order thinking (thinking about what isn’t there, about the absence of friends, family, connections, bank statements). That ambiguous data will seem less vivid, less salient, less compelling than the direct experience we have of his buying us expensive gifts. The family thing is vague and complicated; the jewelry is something we can touch.
Third, people who dislike complexity, who believe that most things have simple solutions, and who believe that they are good at seeing those simple solutions are easy marks, because those are precisely the beliefs to which cons appeal. Admitting that the guy is a con artist means admitting that the mark’s whole view of life is wrong: that the world has simple solutions, that people are what they seem to be, that you can trust your gut about whether someone is good or bad, that things you can touch (like jewelry) matter.
And it works because the marks don’t realize that they are the ones who’ve actually paid for that jewelry.
There are all the signs of his being a con artist—all the lawsuits, all the lies, the lack of transparency about his actual wealth, the reports that show a long history of dodgy (if not actively criminal) tax practices, the evidence that shows his wealth was inherited and not earned—but those are complicated to think about. Trump tells people that he cares about them; he (and his supportive media) tell their marks that all the substantive criticism is made by libruls who look down on them, who think they know better. The media admits to a few flaws, and spins them as minor.
Trump is a con artist, and his election was part of a con game about improving his brand. But, once he won the election, he had to shift to a different con game, one that involved getting as much money for him and his corporations as possible, reducing accountability for con artists, holding off investigations into his financial and campaign dealings, and skimming.
And Trump gives his marks jewelry. If you have Trump supporters in your informational world, then you know that they respond to any criticism of Trump with, “I don’t care about collusion; I care about my lower taxes.” (Or “I care about the economy” or “I care that someone is finally doing something about illegal immigrants.”) They have been primed to frame concerns about Trump as complicated, ambiguous, and more or less personal opinion, but the benefit of Trump (to them) as clear, unambiguous, and tangible.
They can touch the jewelry.
And they don’t realize that he isn’t paying for it; he never paid for it, and he never will. They’re paying for it. They bought themselves that jewelry.
There are, loosely, three ways to try to get people to see the con. First, I think it’s useful not to come across as saying that people are stupid for falling for Trump’s cons (although it can be useful to point out that current defenses of Trump are that he’s too stupid to have violated the law). It can be helpful to say that you understand why he and his policies would seem so attractive, but point out that he’s greatly increased the deficit (that his kind of tax cuts always increase the deficit). It’s helpful to have on hand the data about how much “entitlement” programs cost. Point out that they will be paying for his tax cuts for a long, long time.
Another strategy is to refuse to engage and just keep piling on the evidence. People get persuaded that they’ve been taken in by a con artist incident by incident. It isn’t any particular one that does it, but that there are so many, even though they reject each one as it comes along. So, I think that sharing story after story about how corrupt Trump is, how bad his policies are, and what damage he is doing—even if (especially if) people complain about your doing so—is effective in the long run.
Third, when people object or defend Trump, ask them if they’re getting their information from sources that would tell them if Trump were a con artist. They’ll respond with, “Oh, so I should watch MSNBC” (or something along those lines) and the answer is: “Yes, you should watch that too.” Or, “No, you shouldn’t get your news from TV.” Or a variety of other answers, but the point is that you aren’t telling them to switch to “librul” sources so much as to get more varied information.
Con artists create a bond with their marks—their stock in trade is creating confidence. They lose power when their marks lose confidence, and that happens bit by bit. And sometimes it happens when people notice the jewelry is pretty shitty, actually.
[1] This is why it’s so common for marks to start covering for the con when the con gets exposed. They fear the “I told you so” more than the consequences of getting conned.
[2] In other words, con artists try to separate people from the sources of information that would undermine the confidence the mark has in the con.
Rough draft of the intro for the Hitler and Rhetoric book
[Much of this is elsewhere on this blog. I’m curious if I’m still having the problem of being too heady and academic.]
Martin Niemoller was a Lutheran pastor who spent 1938-1945 in concentration camps as the personal prisoner of Adolf Hitler. Yet Niemoller had once been a vocal supporter of Hitler, believing that Hitler would best enact the conservative nationalist politics the two of them shared. Niemoller was a little worried about whether Hitler would support the churches as much as Niemoller wanted (under the Social Democrats, the power of the Lutheran and Catholic churches had been weakened, since the party believed in a separation of church and state), but Niemoller thought he could outwit Hitler, get the conservative social agenda he wanted, and disempower the socialists, all without harm coming to the church. He was wrong.
After the war, Niemoller famously said about his experience:
First they came for the Socialists, and I did not speak out—
Because I was not a Socialist.
Then they came for the Trade Unionists, and I did not speak out—
Because I was not a Trade Unionist.
Then they came for the Jews, and I did not speak out—
Because I was not a Jew.
Then they came for me—and there was no one left to speak for me.[1]
Niemoller was persuaded that Hitler would be a good leader, or, at least, better than the Socialists. After the war, Niemoller was persuaded that his support for Hitler had been a mistake. What persuaded him either time?
Christopher Browning studied the Reserve Police Battalion 101 and its role in Nazi genocide, narrating how a group of ordinary men could move from being appalled at the killing of unarmed noncombatants to doing so effectively, calculatedly, and enthusiastically. German generals held captive by the British were wiretapped, and they often talked about how and why they supported Hitler; many of them had once been opposed to him. In 1950, Milton Mayer went to visit the small German town from which his family had emigrated and talked to the people living there, writing a book about his conversations with ten of them, all of whom to some degree justified not only their actions during the Nazi regime, but the regime itself—even those who had at points or in some ways resisted it. Melita Maschmann’s autobiographical Account Rendered, published in 1963, describes how she reconciled her Hitler Youth activities, which included confiscating property and helping to send people to camps, with her sense that National Socialism was idealistic and good. Robert Citino’s The Wehrmacht Retreats, David Stone’s Shattered Genius, and Ian Kershaw’s The End all describe how so many members of the German military elite not only reconciled themselves to working for Hitler, but to following orders that they believed (often correctly) meant disaster and defeat. Benny Morris’ Roots of Appeasement gives a depressing number of examples of major figures and media outlets that persuaded others and were persuaded themselves that Hitler was a rational, reasonable, peace-loving political figure whose intermittent eliminationist and expansionist rhetoric could be dismissed. Andrew Nagorski’s Hitlerland similarly describes American figures who were persuaded that Hitler wouldn’t start another war; accounts of the 1936 Olympic Games, hosted by the Nazis, emphasize that Nazi efforts were successful, and most visitors went away believing that accounts of anti-Jewish violence and discrimination were overstated. Biographers of Hitler all have discussions of his great rhetorical successes at various moments, enthusiastic crowds, listeners converted to followers, and individuals who walked out of meetings with him completely won over. Soldiers freezing to death in a Russian winter wrote home about how they still had faith in Hitler’s ability to save them; pastors and priests who believed that they were fighting to prevent the extermination of Christianity from Germany still preached faith in Hitler, blaming his bad advisors; ordinary Germans facing the corruption and sadism of the Nazi government and the life-threatening consequences of Hitler’s policies similarly protected their commitment to Hitler and bemoaned the “little Hitlers” below him who were, they said, the source of the problems. The atrocities of Nazism required active participation, support, and at least acquiescence on the part of the majority of Germans—the people shooting, arresting, boycotting, humiliating, and betraying victims of Nazism were not some tiny portion of the population, and those actions required that large numbers walk by. Some people were persuaded to do those things, and some people were persuaded to walk past.
After the war, what stunned the world was that Germans had been persuaded to acts of irrationality and cruelty previously unimaginable. Understanding what happened in Germany requires understanding persuasion. And understanding persuasion means not thinking of it as a speaker who casts a spell over an audience and immediately persuades them to be entirely different. Rhetoric, which Aristotle defined as the art of finding the available means of persuasion, isn’t just about what a rhetor (a speaker or author) consciously decides to do to manipulate a passive audience. What the case of Hitler shows very clearly is that we are persuaded by many things, not all of them words spoken by a person consciously trying to change our beliefs. Rhetoric helps us understand our own experience, and the most powerful kind of persuasion is self-persuasion. What a rhetor like Hitler does is give us what scholars of rhetoric call “topoi” (essentially talking points) and strategies such that we feel comfortable and perhaps deeply convinced that a course of action is or was the right one. Rhetoric is about justification as much as motivation. That isn’t how people normally think about persuasion and rhetoric, and, paradoxically, that’s why we don’t see when we’ve been persuaded of a bad argument—because we’re wrong about how persuasion works.
This book is about Hitler, and yet not about Hitler. It’s really about persuasion, and why we shouldn’t imagine persuasion as a magically-gifted speaker who seduces people into new beliefs and actions they will regret in the morning. It’s never just one speaker, it’s never just speech, it’s never even just discourse, the beliefs and actions aren’t necessarily very new, and people don’t always really regret them in the morning.
[1] There are various versions. This one is from here: https://www.ushmm.org/wlc/en/article.php?ModuleId=10007392
Conditions that make persuasion difficult
A lot of people cite studies that show that people can’t be persuaded. As though that should persuade people not to try to persuade others.
That isn’t even the biggest problem with those studies. The studies are often badly designed (no one should be persuaded to change an important belief by being told by one person in a psych experiment that they’re wrong). And the studies generally aren’t designed to account for what the research on persuasion does show–that some conditions make it more difficult to persuade people.
I was going to put together a short handout for students about why the paper they’re writing is so hard (an ethical intervention in one of several possible situations, ranging from arguing against the Sicilian Expedition to arguing for retreating from Stalingrad), and ended up writing up a list of the biggest obstacles.
An oppositional audience (i.e., one that has already come to a decision) that:
- Has taken the stance in public (especially if s/he has taken credit for it being a good idea or otherwise explicitly attached her/his ego/worth to the position);
- Has suffered for the position, had a loved one suffer, or caused others to suffer (e.g., voted for a policy that caused anyone to be injured);
- Has equated the idea/position with core beliefs of his/her culture, religion, political party, or ideology (since disagreement necessarily becomes disloyalty);
- Has been persuaded to adopt the position out of fear (especially fear for the existence of the ingroup) or hatred for an outgroup;
- Is committed to authoritarianism and/or naïve realism (equates changing one’s mind with weakness, illness, sin, or impaired masculinity; is actively frightened/angered by assertions of uncertainty or situations that require complex cognitive processes);
- Does not value argumentative “fairness” (insists upon a rhetorical “state of exception” or “entitlement”—aka “double standard”—for his/her ingroup);
- Has a logically closed system (cannot articulate the conditions under which s/he would change her/his mind).
A culture that
- Demonizes or pathologizes disagreement (an “irenic” culture);
- Is an honor culture (what matters is what people say about you, not what is actually true, so you aren’t “wrong” till you admit it);
- Equates refusing to change your mind with privileged values (being “strong,” “knowing your mind,” masculinity) and “changing your mind” with marginalized values (being “weak,” “indecisive,” or impaired masculinity);
- Enhances some group’s claim to rhetorical entitlement (doesn’t insist that the rules of argumentation be applied the same across groups or individuals);
- Has standards of “expertise” that are themselves not up for argument;
- Promotes a fear of change;
- Equates anger with a privileged epistemological stance.
A topic
- That results from disagreement over deep premises;
- About which there is not agreement over standards of evidence;
- That makes people frightened (especially about threats from an outgroup);
- That is complicated and ambiguous;
- That is polarized or controversial, such that people will assume (or incorrectly infer) your affirmative position purely on the basis of any negative case you make (e.g., if you disagree with the proposition that “Big dogs make great pets because they require no training” on the grounds that they do require training, your interlocutor will incorrectly assume that you think [and are arguing] that big dogs do not make great pets);
- That is easily framed as a binary choice between option A (short-term rewards [even if higher long-term costs] or delayed costs [even if much higher]) and option B (delayed rewards [even if much higher] or short-term costs [even if much lower than the long-term costs of option A]).