Wednesday, December 4, 2024

Thinking Errors (What is Real? 16)


These posts make more sense when read in order.

Please click here for the first article in this series to enter the rabbit hole.


[Image: Alisdair, CC BY 2.0]


Logic! Good gracious! What rubbish! How can I tell what I think till I see what I say?

—an elderly lady after her nieces accused her of being illogical[1]


It’s unfortunate that schools don’t teach children how to think. As a result, many, if not most, adults don’t know how to use logic or properly analyze arguments. Here’s an example from psychologist Deanna Kuhn of Columbia University in New York City, who asked high school and college students: which is the stronger statement?

A. Why do teenagers start smoking? Smith says it’s because they see ads that make smoking look attractive. A good-looking guy in neat clothes with a cigarette in his mouth is someone you would like to be like.

B. Why do teenagers start smoking? Jones says it’s because they see ads that make smoking look attractive. When cigarette ads were banned from TV, smoking went down.

Most couldn’t tell the difference, failing to recognize that the first rests on one person’s opinion, while the second is supported by evidence.

Lacking skepticism, many people don’t know how to challenge reasoning, examine assumptions, or uncover biases, which prevents them from judging how reliable a source is. Some can’t even spot obviously false statements. And they fail to recognize the techniques advertisers use to say one thing while implying something else entirely, or to notice when something is being covered up.

Here’s one key thing people should do when evaluating a source: examine the source’s motivations. If they want you to join or remain in a group, what they’re telling you is probably biased to move you in that direction. If they’re trying to sell you something, they are probably avoiding the negative aspects of what they are selling, and might even be lying to you about the item in order to get your money.

I recall someone telling me that RT was a very good source of information, based on some RT articles they’d read. I was skeptical, particularly since RT reminded me of Russia Today magazine, so I looked it up. This was during the 2020 U.S. presidential election, when Russia was targeting both conservatives and liberals with false information in order to divide the country.

Sure enough, RT is an official Russian government news source that often contains propaganda and outright lies. As a researcher, I always try to check my sources first. Still, that doesn’t prevent me from getting taken in once in a while. And in one survey, nearly nine out of ten people admitted that they had been duped by false news.[2]

To protect yourself from political lies, propaganda, and manipulative opinions disguised as news, remember that many individuals, groups, organizations, political parties, countries, and companies want to influence you and get you to believe and do things that are often against your own best interests. They want you to act in their interests, not your own. Some will spend millions of dollars to do it, and they have no problem flat-out lying to you to get what they want.

One mistake people often make is to confuse correlation with causation. For example, a survey in England found that the heaviest drinkers were four times as likely to smoke. There’s a correlation here, but that doesn’t mean, as some people might assume, that heavy drinking causes smoking. It could be that smoking causes heavy drinking, that something else causes both, or that the correlation is merely a coincidence. More research is needed to find out which. There are experiments, such as randomized controlled trials, that can confirm causation, but they are difficult to conduct.

Still, even the media jumps to conclusions of causality from research like this that says no such thing. The media also tends to take research conducted on mice and extrapolate it to humans. We can learn a lot from mice, and there are many similarities between us and them, but experimental results often don’t transfer from one species to the other.
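One reason correlation and causation are so easy to conflate is that a hidden common cause can manufacture a correlation out of nothing. Here’s a minimal simulation sketch in Python; everything in it is invented for illustration, assuming a hypothetical “stress” factor that drives both heavy drinking and smoking. Neither behavior influences the other, yet the two come out strongly correlated.

```python
import random

random.seed(42)

def correlation(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var_x = sum((x - mx) ** 2 for x in xs)
    var_y = sum((y - my) ** 2 for y in ys)
    return cov / (var_x * var_y) ** 0.5

drinking, smoking = [], []
for _ in range(10_000):
    # Hypothetical hidden common cause: "stress" drives both behaviors.
    stress = random.random()
    drinking.append(stress + random.gauss(0, 0.2))  # depends only on stress
    smoking.append(stress + random.gauss(0, 0.2))   # depends only on stress

print(f"correlation: {correlation(drinking, smoking):.2f}")
# Prints a strong correlation (about 0.7) even though neither variable
# affects the other -- it comes entirely from the shared "stress" factor.
```

A randomized experiment breaks this illusion by setting one variable independently of everything else, which is why controlled trials can confirm causation where surveys can’t.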

Another thing to watch out for is when someone accuses or disparages somebody. Before you consider whether the charge actually applies to the accused, consider whether it applies to the accuser. Very often you’ll find that it does. There’s a psychological concept called “projection”, where people subconsciously project their own feelings and guilt onto others. Often they don’t even realize they’re describing themselves when they accuse others. Politicians do this all the time, and I’ve also seen them do it knowingly to deflect suspicion away from themselves, even when they’re obviously guilty.

Now whenever I hear an unsubstantiated accusation, I usually find it applies to the accuser rather than the accused, even as the accuser acts self-righteous and morally offended. But by the time they’re revealed to be the guilty party, the damage is done: people still believe the original smears against the innocent party.

On a more subtle level, there’s a cognitive error called framing: presenting information in a way that makes it more appealing. It’s why milk cartons say “95 percent fat free” instead of “contains 5 percent fat”. We can be easily swayed by how information is presented to us.

Another thinking error is called scenario fulfillment. This is what happened in 1988 when the United States warship USS Vincennes shot down an Iranian Airbus, killing all 290 passengers and crew aboard. The radar operators believed the plane, which had just taken off from an airport, was diving to attack the ship, but the radar recordings showed the airliner was climbing the entire time. In this thinking error, the evidence is subconsciously made to fit one’s expectations or fears.

This is related to motivated reasoning, which is where someone seeks out information that supports their views. The Iraq War is an example of this. The stated justification for the U.S. invasion of Iraq was that the country had weapons of mass destruction; it was later discovered that it had none. There was also the perception that Iraq had something to do with the 9/11 attacks on New York’s Twin Towers, which was obviously false to anyone familiar with the situation. The attacks were carried out by religious fanatics who were the polar opposite of Iraq’s secular dictatorship. The problem was that the Bush administration had preconceived ideas about what was happening, and it promoted intelligence that supported those ideas while suppressing evidence that contradicted them.

While this is an extreme example, motivated reasoning is not unusual and can be found in our daily lives, though those doing it are usually unaware of it.

In another deceptive trick, we can be influenced by something and believe we weren’t. In the year 2000, 76% of the new drugs approved by the FDA offered only minimal to moderate improvement over existing drugs, yet the pharmaceutical companies were charging twice as much for them. Part of the reason was that the companies were spending $8,000 per doctor to promote their drugs. In a survey, doctors said they thought 84% of their colleagues were influenced by the freebies, but only 16% thought they themselves were.[3]

Advertisers rely on this. They know that even if you ignore their ads, you can still be influenced subconsciously, without ever being aware of it.

Next is the sunk cost fallacy. When you’ve invested a lot in an endeavor or situation, it’s very difficult to leave it behind and extricate yourself. If you’ve spent years building a career but find yourself hating it, it’s hard to quit and move on to something new. If you’ve invested yourself in a set of beliefs and belong to a community that shares them, it’s very difficult to walk away. If you’re an investor who has plowed a lot of money into shares that have tanked, it’s tough to sell at a loss, since they might eventually rise again. When we’ve put time and effort into a losing proposition, we often keep investing in the hope that things will turn around and we can recoup our losses.

It’s because of this fallacy that you watch a bad film to the end, even though you knew it was bad halfway through; or stay in a bad relationship, hoping things will improve; or pour more money into a failing car. It’s why a company keeps investing in a nuclear reactor despite cost overruns, sometimes until it goes bankrupt. And it’s why governments push failed programs long past their expiration dates, such as Prohibition and the war on drugs, which not only failed to curb drug and alcohol use but fueled the rise of organized crime that still plagues society and filled the prisons with users and minor dealers.

Because of sunk costs, we tend to press on even when it’s futile, rather than writing off our investment as a loss. But, as they say, “don’t throw good money after bad.” We all suffer from this one. And don’t feel bad: monkeys are susceptible to it too.[4]
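To make the fallacy concrete, here’s a toy expected-value sketch in Python; the figures are made up purely for illustration. The money already spent appears in both possible outcomes, so it cancels out of the comparison: only future costs and future benefits should drive the decision.

```python
# All figures are hypothetical, purely for illustration.
sunk = 50_000             # already spent; unrecoverable either way
future_cost = 30_000      # what finishing would still cost
expected_payoff = 20_000  # what finishing is expected to return

# Net outcome of each choice, including the sunk cost:
finish = -sunk - future_cost + expected_payoff  # -60,000
abandon = -sunk                                 # -50,000

# The sunk cost appears in both outcomes, so it cannot change which
# option is better. The rational rule looks only at the future:
should_finish = expected_payoff > future_cost   # False here

print(f"finish: {finish:,}  abandon: {abandon:,}  worth finishing? {should_finish}")
# Abandoning is the smaller loss, no matter how much was already spent.
```

The fallacy is the feeling that the 50,000 already spent makes finishing “worth it”, when that figure is identical under either choice and so can’t rationally favor one over the other.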

There are many ways in which people lie to themselves. Some people who lie to others end up convincing themselves of their own lies.[5] Another form is confabulation, where our minds fill in gaps in memory or knowledge. It’s common in all of us, though it’s far more obvious in amnesiacs, people with dementia, and split-brain patients, who have had the connections between the two hemispheres of their brain severed in order to reduce severe epilepsy.

Experiments in this area have found that when our subconscious makes a decision and we don’t know why, we fabricate a story to explain it. The area of the brain where we analyze data and create explanations for it is very good at generating scenarios, but neurophysiologists have found it is usually wrong, even in our justifications for why we did something. This analytical center often produces false explanations when we’re called upon to explain ourselves after the fact. The narratives may be logical and make perfect sense, but they’re usually false.

Our brains are better at winning arguments, both with others and with ourselves, than they are at analysis and applying logic. It’s part of our rationalization process: how we connect the dots when information is missing and how we piece together narratives. We’ve evolved to try to make sense of our world, but these attempts often bring us to the wrong conclusions.[6] It’s like the just-so stories of ancient cultures that explained natural phenomena with myths.

Recent research suggests that we don’t really know why we’re depressed or even why we’re angry. We often don’t know why we do something, believe something, or like or dislike something, so our brains make up reasons.

How much does all of this affect us? Psychologist Dan McAdams of Northwestern University in Evanston, Illinois, who has studied this for decades, doubts that we really know why we do anything; instead, he argues, we create personal myths that we tell ourselves and others.[7]


If you like this, please subscribe below to receive an email the next time I post something wondrous. It's free.

Click here for the next article in this series:

Mind Craft



[1] E. M. Forster, Aspects of the Novel, New York: Harcourt, Brace & Co., 1927, p. 152.

[2] “Fake News: A Global Epidemic Vast Majority (86%) of Online Global Citizens Have Been Exposed to it”, Ipsos, June 11, 2019, https://www.ipsos.com/en-us/news-polls/cigi-fake-news-global-epidemic.

[3] Joseph T. Hallinan, Why We Make Mistakes, New York: Broadway Books, 2009, pp. 70-71.

[4] Georgia State University, “Monkeys, like humans, persist at tasks they’ve already invested in”, ScienceDaily, December 18, 2020, https://www.sciencedaily.com/releases/2020/12/201218112531.htm, citing Julia Watzek and Sarah F. Brosnan, “Capuchin and rhesus monkeys show sunk cost effects in a psychomotor task”, Scientific Reports, 2020; 10 (1), https://doi.org/10.1038/s41598-020-77301-w.

[5] Ed Yong, “People don’t know when they’re lying to themselves”, Discover Magazine, March 7, 2011, https://www.discovermagazine.com/mind/people-dont-know-when-theyre-lying-to-themselves, and at National Geographic, March 7, 2011, https://www.nationalgeographic.com/science/article/people-dont-know-when-theyre-lying-to-themselves, citing Chance, Norton, Gino, and Ariely, "Temporal view of the costs and benefits of self-deception", PNAS, 2011, https://doi.org/10.1073/pnas.1010658108.

[6] Check out the work of Michael Gazzaniga of the University of California, Santa Barbara, such as Michael S. Gazzaniga, Who’s in Charge? Free Will and the Science of the Brain, New York: Ecco, 2011.

[7] Dan P. McAdams, Faculty Profile, Northwestern University, https://sites.northwestern.edu/danmcadams/.
