Why Some People Defend The Indefensible
On why facts and evidence are not enough to change minds.
Hello, curious souls. I hope everyone is safe and doing alright. Welcome back to another edition of Think About It.
As the US-Israeli war on Iran rages on, the political debates have turned fierce. It usually happens during wars: there are people who criticise the leaders and their decisions, and there are people who passionately defend them and their decisions.
Now, for a long time, I wondered why some people defend political parties and leaders so passionately, even when some of these leaders are linked with heinous crimes.
So, I finally decided to do some digging. Psychology and philosophy provided some answers, and it’s really interesting…
The behaviour of defending leaders despite their wrongdoings usually isn't about the facts of the crime; it has more to do with how our brains are wired to handle belonging and safety.
People decide what they want to believe first, and then find the "facts" to support it later. Psychologists call this Identity Motivated Reasoning.
If someone identifies strongly as a member of Party A, their goal isn't to find the truth; it's to protect their identity as a Party A person.
If, like me, you follow a team sport, you'll understand this perfectly.
When your favourite team plays dirty, you don't usually say "they're wrong." Instead, you say "the referee is biased" or "the other team started it." This is because their loss feels like your loss. Their shame feels like your shame. Political identity works the same way, but emotional stakes are much higher.
There’s another ‘defence shield’ the brain holds up, one that psychologists call Identity Protective Cognition.
It happens when learning a new fact would force people to lose their status in their group or change how they see themselves.
So, if someone admits their leader committed a crime, they have to admit that they were wrong for supporting them. To many people, admitting they were wrong feels like a "personal dishonour" or a loss of face. So, to avoid that pain, their mind dismisses the evidence. By protecting their leader, they are actually protecting their identity (who they believe they are).
Because of Identity Motivated Reasoning, the leader’s reputation and the follower’s reputation become the same thing.
If the leader is "evil," then the follower feels like they are a person who supports evil. To avoid feeling like a bad person, the follower uses Identity Protective Cognition to convince themselves that the leader is actually innocent or there’s some agenda against them.
It’s especially infuriating when we see intellectuals defending politicians linked with crimes. It’s tempting to think, "How can they be so blind?"
Dan Kahan, Professor of Law at Yale Law School, puts it quite bluntly:
"People use their intelligence to defend identity, not to seek accuracy."
Kahan’s research shows that smart people aren't blind; they are just using their brainpower as a weapon.
Usually, we think of intelligence like a lens that helps us see the world more clearly and get closer to the truth. Kahan, however, argues that when people’s identity is at stake, intelligence becomes a shield.
Smart people can find complex reasons to explain away evidence, dismiss experts, or flip the narrative so that the leader looks like the victim. In doing that, they aren't seeking "accuracy" (the truth); they are seeking "justification" (a reason to stay on the team).
Long before modern psychology, many of history’s greatest thinkers noticed that humans prefer "belonging" to "being right." They saw that people often treat leaders like extensions of themselves.
Writing over 2,000 years ago about the wars in Ancient Greece, Thucydides observed that when people get caught up in political "factions" (groups), words actually change meaning.
He noticed that reckless aggression was praised as "loyal courage," while someone who tried to see both sides was called a "coward."
In the 1600s, philosopher Francis Bacon identified "glitches" in the human mind that prevent people from seeing the truth. He called them Idols.
He proposed that we all live in our own personal “cave” (our background, our environment, the people we surround ourselves with, etc.) and see the world through the lens of that cave. So, if the leader belongs to that “cave”, people will defend him.
In his book The Crowd, the 19th-century polymath Gustave Le Bon argued that when individuals join a crowd or a movement, their personal judgment dissolves into the group's emotional current.
He called this "mental contagion." The crowd becomes its own organism, and it is characterised by impulsivity, irritability, inability to reason, and the contagious spread of emotions. He also said that the crowd is often driven by instinct rather than intellect.
Le Bon’s work later influenced figures like Hitler and Mussolini, who read him carefully. It’s disturbing to read because what Le Bon said is still largely accurate.
So, history shows us that this isn’t something new. But framing it purely as a psychological flaw of other people is limiting. The uncomfortable truth is that everyone does this to some degree, including those who criticise political loyalists.
Political loyalty, at its extreme, is rarely about politics.
Identity, belonging, fear of the alternative, media ecosystems, historical trauma, and the deep human need for certainty in an uncertain world all feed into it.
That’s all for today. In the next edition, I will dive deep into Nietzsche’s ideas on truth and why humans don’t naturally pursue it.
Until then
Stay curious
Zaid
PS: Everything I write is free. And I intend to keep it that way. But what I do takes time and effort. If something I wrote made you think differently or gave you a moment of clarity, or you just found it valuable, then please consider supporting my work.
buymeacoffee.com/za1d
There’s no pressure or obligation. Thanks for being a subscriber and a reader.