Conspiracy Theories, Part 2

[ This is post #2 in the series, “Finding reality in a post truth world.” ]

In Part 1, I looked at some of the more prominent conspiracy theories, their prevalence, and the dangers they pose. In Part 2, I’ll examine the underpinnings of conspiracy theories, showing how they rest on cognitive fallacies that we all share in common. In Part 3, I’ll go into some of the psychology behind conspiracy thinking and the research on the best methods of combatting it.

Cartoon by Osmani Simanca

It’s tempting to ridicule conspiracy theorists and dismiss them as people with brains that aren’t wired quite right. While these theories are deserving of ridicule, we need to look at the building blocks of conspiracy theories (CT’s), and once we do, we’ll see that these elements are present in everyone’s thinking.

A CT, especially a well-developed and popular one, is part of a continuum.

There are six cognitive fallacies that typically commingle in the formation of a CT. Once a CT gains adherents, it commonly becomes the foundation of a cult, although this is not inevitable. Nor is it an absolute rule that every cult has a conspiracy theory at its base. However, the pairing of the two is quite common.

I realize the word “cult” is a loaded term. Traditionally, it involves religious practices and beliefs, but I’m using it in a more modern sense, as described in this article in The Guardian.

The three main characteristics Ross points out in the article are:

  • A charismatic leader
  • A process of coercive persuasion or thought reform
  • Exploitation of group members by the leader and the ruling coterie.

Some of the primary attributes of cults are especially noteworthy:

  • Absolute authoritarianism without meaningful accountability.
  • No tolerance for questions or critical inquiry.
  • Unreasonable fear about the outside world
  • No legitimate reason to leave; former followers are always wrong for leaving, negative, or even evil.
  • Former members often relate the same stories of abuse and reflect a similar pattern of grievances.
  • The group/leader is always right.
  • The group/leader is the exclusive means of knowing “truth” or receiving validation; no other process of discovery is really acceptable or credible.

Put all this together, and it’s clear why cults often use CT’s as the “cement” that binds their membership to the leaders. This is why the cult of Nazism found the Protocols of the Elders of Zion so useful. It was a CT that formed the foundation of everything the Nazis did, the way they did it, and who did it.

QAnon, which is nothing more than a rebranding of the Protocols CT, performs the same function in the quickly emerging cult of Trumpism. The power of the CT within the cult is that it forms the basis of identity, and once it does that, cult members have an especially difficult time disassociating from the group. It’s one thing to change your mind; it’s quite another to give up your identity.

The graphic above also demonstrates a progression of difficulty in disassociation. Rejection of confirmation bias alone can be fairly simple. Once confirmation bias is merged with the other cognitive fallacies, abandoning a conspiracy theory becomes far more arduous. Still, at that stage the CT is only a mental construct. Once the CT has become the foundation of a cult, however, it is exceedingly difficult to separate oneself, since the CT, along with the practices of the cult, has replaced the member’s former identity.

That means that the understanding of a CT, and how to combat it, must begin with the building blocks of CT. As we examine each one of these, we’ll see that they are not matters of stupidity vs. intelligence, left vs. right, or any other binary explanation. Indeed, each logical fallacy is built into our brain as a feature, not a bug.

The fundamental attribution error

As I look back over the last seven decades of my life, I’m amazed at some of the things I used to believe. When I was in my 20’s, I was pretty sure there was no way humans could have constructed the pyramids in Egypt or managed to perform the incredible stonework of the Incas. I actually thought aliens had come to visit Earth, and these were the signs of their visitations.

All of us have at one time or another had a belief that embarrasses us today. Generally, as we analyze why we held that belief, we don’t ascribe it to stupidity or laziness. Instead, we’re more inclined to put a positive spin on it. Maybe we were under stress. Perhaps someone fed us the wrong information. In other words, we forgive ourselves.

We aren’t so likely to apply this generosity to others. Instead, we use a mental shortcut, something like this: I used to have that idea. I now know it’s stupid. Why doesn’t that person know it’s stupid? They must be stupid.

What’s behind this is incomplete knowledge. We know our own history inside and out. That makes it easy to explain why we held a certain belief. When it comes to others, we have only incomplete knowledge. The inborn pattern-recognition software in our brains jumps into action and categorizes that person. The trouble is, the categorization is generally based on our own history, not theirs, so often it’s just plain wrong.

Motivated reasoning

The human mind dislikes chaos and pure chance. I’ve been privileged to watch my grandchildren from the time they were born. It’s very apparent that one of the greatest forces imprinting their neural networks is a sense of control. This starts at a very primitive level, with a baby touching objects and seeing them move. They do this over and over and over again, until a neural network of cause and effect is burned into their brain.

As adults, we often still reject the ubiquity of pure chance, the idea that luck is probably the dominant aspect of our lives. It’s much more comforting to think that we “can be anything we want to be” or that if we want something, we can just “manifest it.” We easily discard the greatest random act in our lives, the luckiest or unluckiest thing we’ll ever go through – our birth. That I was born as a white male set me on a lucky course, and one that I had absolutely nothing to do with.

The more we ascribe the entirety of life to our own making, the more shocked we are that life doesn’t care. This sets us on a course to discover why. Typically, we discard the explanation that Occam’s Razor would dictate: pure chance. Instead, we start looking for the reason, driven by the conviction that there must be a logical one. In other words, we employ motivated reasoning.

The more our sensibilities are tied into our very identity, the more motivated our reasoning becomes. This describes the difference between a mother who is simply concerned about the health of her children vs. a full-fledged anti-vaxxer. They both start with the same question: what is best for my child? But the first mother hasn’t tied the answer to her identity. She researches based on scientific consensus. The second mother has absorbed the anti-vax conspiracy theory into her very identity. For her, the consensus of the scientific community is not the end goal. Instead, she spends her time looking for outliers, people who are “bucking the trend” or not “beholden to big pharma.”

This is one of the reasons it’s wrong to ascribe CT’s to stupidity. Many of the people who subscribe to them are thorough researchers. Some even know how to read a raw science article. The problem is that motivated reasoning has them researching the wrong things. And because every scientific theory has outliers, they can always find a scientist who rewards them by validating their identity.

Special pleading

As our CT novice progresses, he inevitably discovers things that might invalidate the CT. If he’s a climate change denier, he might point to an era millions of years ago when the climate was much warmer as a way of denying human-induced change — conveniently ignoring the fact that humans didn’t exist then.

The mark of special pleading is that it’s generally introduced into an argument suddenly, with no evidence to back it up. The QAnon fanatic, upon seeing the gun-toting Pizzagate investigator terrorize innocent civilians, argues that the person simply had the wrong coordinates, and that this in no way delegitimizes the claim that certain elites are using pizza joints for child sex trafficking rings.

Ad hominem attacks

Now the CT advocate is nearing the end stage of their edifice. How to get rid of the doubters? Attack them for being stooges of the “deep state” or “in bed with big pharma” or just plain stupid. The ad hominem attack dispenses with the chore of addressing an argument on its merits, discussing the evidence or lack of it, and instead focuses on the messenger.

Here’s what one person wrote to me in response to Part 1 of Conspiracy Theories:

Facts are facts are facts are facts. You’re obviously intelligent, just the type of intelligence that leads to narcissism & arrogance = a FOOL. Someone that leans into his own understanding and confirms your own bias vs actually staring blankly and seeing what facts are able to be grasped and forming a non-biased opinion.

This person ended with a QAnon initialism: WWG1WGA (Where we go one, we go all). Note that the commenter didn’t bother discussing anything about CT’s, their prevalence, or their danger. Instead, the focus was an attack on me. It’s no accident that this tirade included an unironic reference to their own problem, i.e. bias. This happens all the time with people who resort to ad hominem attacks. Remember “Puppet? You’re the puppet! I’m no puppet!”

The QAnon CT, along with the Protocols of Zion CT from which it derives, has a built-in bogeyman designed for use in ad hominem attacks. In the Protocols CT, it’s the secret cadre of Jews who are controlling the world; QAnon has simply modified this into the “deep state.” This makes easy work of ad hominem dismissal of doubters, with the bonus that it reinforces the base CT.

Don’t make the mistake of assuming this is just an attribute of CT fanatics, though. It happens all the time. It probably stems from a very basic need to protect our “tribe” and distinguish our members from others. A long time ago, there was probably an evolutionary advantage to accepting the beliefs and customs of our own tribe and rejecting those of an outside tribe without even examining them. After all, when resources are scarce, it’s not surprising that “outsiders” would be much more interested in your resources than in you. So an ad hominem approach toward dealing with outsiders is a mental shortcut that keeps you from suffering from naivete.

Confirmation bias

We’re now getting to the cherries on top of the CT cake, the things that trigger that dopamine rush that gratifies not just your logical consciousness but your feelings as well.

In 2000 we bought one of the first hybrids on the market, a Honda Insight. Immediately after that, I started seeing them all over the place. It made me feel good. It confirmed my choice. And the more of them I saw, the better I felt.

In fact, Honda sold fewer than 4,000 of these cars throughout the U.S. Mine was number 763. I probably saw one or two a month, but each sighting stood out far more than a VW or Mercedes or Ford.

All of us have this experience. We buy something, and then we’re amazed how many other people have bought the same thing. Social media engineers know this, so they present us with advertising either related to the product or exactly the same product. I’ve often wondered why they would bother to show me an ad for a product I’ve just bought, but the Netflix documentary, The Social Dilemma, answered this for me. It’s a way of reinforcing the “wisdom” of my purchase by triggering my confirmation bias. Each visual instance of my decision reinforces that decision, and thus convinces me that I’ve made the right choice. It’s a way of developing brand loyalty.

Confirmation bias is a fundamental part of the pattern-recognition software in our brains. Imagine how difficult life would be without it. Essentially, we would be unable to learn. One of the reasons that computers have a difficult time with pattern recognition is that they don’t have confirmation bias. That’s why, for example, a CAPTCHA asking you to pick out the squares containing a traffic light, or part of one, is effective. A typical bot isn’t going to use a supercomputer network to solve the CAPTCHA. We, on the other hand, do it instantaneously.

Here’s the danger, though. Using our motivated reasoning, special pleading, and ad hominem attacks, we are left with a very limited field of ideas to explore. We’ve applied a filter to our world, and what we’re left with confirms our bias.

The Dunning-Kruger effect

Sorry losers and haters, but my I.Q. is one of the highest -and you all know it! Please don’t feel so stupid or insecure, it’s not your fault.

— Donald Trump, Twitter, May 8, 2018

I used to think confirmation bias was one of the most dangerous cognitive biases we face, but the last four years make me think that this prize actually should go to the Dunning-Kruger effect.

This term originated with a 1999 study by psychologist David Dunning and his grad student Justin Kruger. They discovered that, as the illustration by Kenneth Lim above shows, the less expertise someone has, the more confidence they have in their knowledge.

An ignorant mind is precisely not a spotless, empty vessel, but one that’s filled with the clutter of irrelevant or misleading life experiences, theories, facts, intuitions, strategies, algorithms, heuristics, metaphors, and hunches that regrettably have the look and feel of accurate knowledge.

— Dunning, 2014, quoted in The Skeptics’ Guide to the Universe, p. 47

Ordinarily, it’s easy to dismiss unqualified people who think they’re an expert at this or that. But when the President suffers from an extreme case of Dunning-Kruger, that can be incredibly dangerous. That’s a person who’s described perfectly in the quote above, someone who dismisses the ideas of true experts because they’re convinced that only they can fix the problem, whatever it is.

On a smaller scale, long-time CT proponents typically fall into the Dunning-Kruger abyss as well. Their motivated reasoning and confirmation bias have convinced them that they are indeed “experts” on a subject. This is what has evidently happened to Robert F. Kennedy Jr., the darling of the anti-vaxxers. He dazzles his audiences with his supposed knowledge of vaccines and their ingredients, but in fact he knows less than a typical pre-med student. His “knowledge” is cherry-picked; it’s based not on the consensus of medical expertise, but on the outliers.

Now that we’ve explored the underpinnings of conspiracy theories, the next step is to go into the psychology of why some people latch onto them more than others. Remember, all this starts with the building blocks, the cognitive fallacies that form the foundation of CT, and we all suffer from those logical gaps — but some of us more than others, and that’s the question at hand. The second question I’ll deal with in the next installment is a critical one: how do we combat CT’s? Do we ridicule them? Do we ignore them? Or do we dissuade CT proponents with cold hard logic?

Please feel free to add your comments and observations. There may be a delay before you see them on this site, because each comment is moderated. If you’re a CT proponent yourself, save your fingers, because your attempt to convince me that Trump is the savior we all need will just end up in the trash can.
