Information Networks

[ This is post #8 in the series, “Finding reality in a post truth world.” ]

Two definitions:

dilettante (plural dilettanti or dilettantes)

  1. An amateur, someone who dabbles in a field out of casual interest rather than as a profession or serious interest.
  2. (sometimes offensive) A person with a general but superficial interest in any art or a branch of knowledge.

amateur (plural amateurs)

  1. (now rare) A lover of something. quotations 
  2. A person attached to a particular pursuit, study, or science as to music or painting; especially one who cultivates any study or art, from taste or attachment, without pursuing it professionally.
  3. Someone who is unqualified or insufficiently skillful.

The line between amateur and dilettante is precariously thin. In today’s parlance, dilettante = “poser”. In terms of cognitive fallacies, the Dunning-Kruger effect comes to mind.

The only way to avoid this dilemma is to become an expert.

An expert is somebody who has a broad and deep competence in terms of knowledge, skill and experience through practice and education in a particular field. Informally, an expert is someone widely recognized as a reliable source of technique or skill whose faculty for judging or deciding rightly, justly, or wisely is accorded authority and status by peers or the public in a specific well-distinguished domain. An expert, more generally, is a person with extensive knowledge or ability based on research, experience, or occupation and in a particular area of study.

https://en.wikipedia.org/wiki/Expert

At best, given the two million peer-reviewed articles published each year, you can be an expert in one or two narrow fields. The idea that expertise in one subject makes you totally competent in another is, unfortunately, nonsense. This is why radiologists like Dr. Scott Atlas should not pretend to be epidemiologists, and why Nobel Prize-winning chemists like Linus Pauling can fail so miserably when it comes to medical cures.

So most of us, in most areas, are at best amateurs. Amateurs, by definition, are not experts; in other words, we don’t know the “truth” about the subjects we’re interested in. Our training in another area doesn’t necessarily give us a leg up. Yes, it may help us be more rigorous in pursuing our interest, but in the end, we’re all in the same amateur boat.

This poses a huge dilemma for us in an increasingly complex world. Five hundred years ago, the average peasant knew how to recreate most of the things they depended upon. They grew their own crops, knew how to repair their farming tools, and so on. Today, the average citizen might have a screwdriver or two, a hammer, and a measuring tape, but that’s about it. We don’t know very much at all about how our house was built, how our phone works, how the internet functions, or what makes the car dashboard pop up a message when servicing is due. We rely on experts.

“Trust your gut,” that ubiquitous cliche from TV and movies, doesn’t serve us very well here. It typically results in a severe case of Dunning-Kruger, and makes us look like pompous fools. Yet millions of people do just that in virtually every aspect of their lives, and all of us do it in some aspect of our lives.

How do we avoid that trap? Part of it is learning to spot the most common cognitive fallacies and taking countermeasures on our own. That’s the focus of the penultimate part of this series. Part of it is learning to rely on the scientific consensus in a given area, the focus of the final part of the series. The first leg of this three-legged stool is our “knowledge network,” the circle of resources we turn to when we’re not sure about something in our environment.

This is the subject of the book The Misinformation Age: How False Beliefs Spread, by Cailin O’Connor and James Owen Weatherall. The book relies heavily on Bayesian statistics, but their main point is understandable without any competence in that arena:

We live in an age of misinformation – an age of spin, marketing, and downright lies. Of course, lying is hardly new, but the deliberate propagation of false or misleading information has exploded in the past century, driven by both new technologies for disseminating information – radio, television, the internet – and by the increased sophistication of those who would mislead us.

After Trump’s inauguration, we were treated to a mass demonstration of how social pressure can influence our sensory judgements.

photo from CNN

Trump insisted that his inauguration crowd, on the right, dwarfed Obama’s, on the left. Clearly, that is absurd, but it was amazing how many of his supporters echoed his lie.

In 1951, Solomon Asch at Swarthmore College devised an experiment demonstrating this. He showed groups of eight participants a card like the one below.

The Misinformation Age, p. 81

Seven of the eight participants were confederates, assigned in advance to choose the wrong line – either “b” or “c”. The one genuine subject, who was not in on the “conspiracy,” had to give their answer last. They could either choose the correct answer, “a”, or go with the consensus of the group. One third of the subjects went along with the group, not wanting to disagree with everyone around them, or perhaps not trusting their own judgement.

O’Connor and Weatherall delve into various types of informational networks and Bayesian probabilities. The average reader will very likely have trouble following the math, as I do, so I’ll try to summarize their main points.

First, we generally don’t come to conclusions about anything on our own. Let’s say we’re trying to figure out if a certain weed killer is safe. Unless we’ve already studied the subject thoroughly, we will generally turn to our informational network, which is probably made up of some people we trust completely and others we’re not so sure about. We give them varying levels of credence, based largely on our past experience with them.

For example, we might have a very opinionated uncle who has lots of information but tends to put a conspiratorial slant on everything. That person might have a “credence percentage” of 20%. Another friend might be a scientist who isn’t an expert in the subject we’re looking at, but who is extremely objective. Subconsciously, we might give that person a credence percentage of 80%. As time goes on, we adjust these percentages based on how accurate each person is on various issues. In that way, we eventually build information networks that we believe will give us reliable answers.
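
To make that concrete, here is a toy sketch of how those credence percentages might be used and adjusted. This is my own illustration, not the authors’ model – the uncle at 20% and the scientist at 80% are just the hypothetical figures from the paragraph above – but it shows the two moving parts: weighting each source’s answer by how much we trust them, and nudging that trust up or down as their track record develops.

    # A toy illustration of a credence-weighted information network.
    # Not O'Connor and Weatherall's actual Bayesian model -- just the
    # weighting-and-updating idea described above.

    sources = {
        "opinionated uncle": 0.20,    # lots of information, conspiratorial slant
        "objective scientist": 0.80,  # not a domain expert, but extremely objective
    }

    def weighted_belief(claims):
        """Combine yes/no judgements, weighted by how much we trust each source.

        `claims` maps a source's name to True ("the weed killer is safe")
        or False ("it is not safe").
        """
        total = sum(sources.values())
        support = sum(weight for name, weight in sources.items() if claims[name])
        return support / total  # rough degree of belief that the claim is true

    def update_credence(credence, was_accurate, step=0.05):
        """Nudge a source's credence up or down based on their track record."""
        new = credence + step if was_accurate else credence - step
        return min(max(new, 0.0), 1.0)  # keep the weight between 0 and 1

    # The uncle says "unsafe", the scientist says "safe".
    belief = weighted_belief({"opinionated uncle": False, "objective scientist": True})
    print(f"Degree of belief that the weed killer is safe: {belief:.2f}")  # 0.80

    # Later the uncle turns out to be wrong about something we can check,
    # so his weight drops a little before the next question comes along.
    sources["opinionated uncle"] = update_credence(sources["opinionated uncle"], was_accurate=False)

Over many questions, the sources with good track records come to dominate the answers we accept – which is exactly the adjustment of percentages described above.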

The question is, what do we consider to be the “right” answer? If we are consumed with confirmation bias, it will be the one that most closely aligns with the opinion we already have. The presence of others who are not influenced by confirmation bias may, in time, mitigate our reliance on biased members of our network.

O’Connor and Weatherall demonstrate how destructive a “propagandist” is in one’s informational network. Unlike other members, the propagandist is not influenced by others; their role in the network is a one-way street. And they are typically well versed in methods of persuasion, so that eventually every member of the network adopts their views. This is why a propaganda network like OANN can end up getting millions of people to believe in conspiracy theories like QAnon.
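
The dynamic is easy to see in a small simulation. The sketch below is my own simplified stand-in for the book’s network models, not their actual math: ten ordinary members keep averaging their views with what they hear, while one propagandist broadcasts a fixed position and never listens. Because influence flows only one way, the whole group drifts toward the propagandist.

    import random

    # A simplified sketch of a network containing a propagandist (my own toy
    # model, not the book's Bayesian networks). Ordinary members update toward
    # the average of what they hear; the propagandist broadcasts and never listens.

    random.seed(0)
    PROPAGANDIST_VIEW = 1.0                          # the position being pushed
    beliefs = [random.random() for _ in range(10)]   # ordinary members start with mixed views
    print(f"Average belief at the start: {sum(beliefs) / len(beliefs):.2f}")

    for round_number in range(200):
        heard = beliefs + [PROPAGANDIST_VIEW]        # everyone hears everyone, plus the propagandist
        average_heard = sum(heard) / len(heard)
        # Ordinary members drift part of the way toward what they hear;
        # the propagandist's view never moves.
        beliefs = [b + 0.2 * (average_heard - b) for b in beliefs]

    print(f"Average belief after 200 rounds: {sum(beliefs) / len(beliefs):.2f}")
    # The one fixed voice keeps pulling the average toward 1.0, so the
    # network ends up close to the propagandist's position.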

The bottom line is this: scientific skepticism requires that you do the following with your informational networks:

  • Examine each member in detail and determine whether you rely on them because they affirm your existing opinions or because they present you with science-based alternative approaches.
  • Look at their track record over time, and adjust your credence percentage based on their performance.
  • Eliminate, or severely limit the influence of, propagandists who are in your network solely to influence you to adopt a position for which they receive remuneration.

I’ll conclude with an example using myself.

Before the networks called the election for Joe Biden, I was listening to Morning Joe on MSNBC. Joe was clearly impatient with the Decision Desk’s reluctance to call the race, when the majority of provisional ballots from Pennsylvania were Democratic and Biden’s lead kept building. He suggested what was in essence a conspiracy theory, i.e., that Trump had put pressure on the networks not to call it.

For a minute, I bought into that. It made perfect sense to me. Then I wandered over to fivethirtyeight.com and read some commentary by Nate Silver on that topic. He called it utter nonsense and explained why in a very factual way.

I shook my head at myself. Fortunately, my informational network on this subject included someone who was an expert. Also, the credence factor I assign to Nate Silver is around 90%, while it’s less than 50% for Joe Scarborough. I quickly discarded that theory.

It was a good lesson. Here I was, in the middle of writing this series on cognitive fallacies, and I almost subscribed to a conspiracy theory. Denigrating conspiracists may make you feel good, but it doesn’t describe reality. The truth is that constant vigilance, a good information network, and knowledge of which cognitive fallacies are lurking about are imperative for all of us.

It’s impossible, though, to develop a good information network without having at least a passing familiarity with the most common logical fallacies. That’s the subject of the next few posts, based on the work by Dr. Steven Novella – The Skeptics’ Guide to the Universe.
