Yaohnanen tribesmen of Vanuatu hold pictures from Prince Philip, Duke of Edinburgh's 2007 visit. Christopher Hogue Thompson, CC BY-SA 3.0, (adjusted).
We all know of the connection between hallucinations and mental illness, but I’d like to point out that there’s a big difference between hallucinations and delusions. Delusions are a much stronger sign of mental illness. The American Psychiatric Association defines delusions as “fixed false beliefs that are not widely shared by members of the individual’s culture or subculture”.[1] Delusions can range from the mild—“a vague sense that the world is out to get you”—to the major “my neighbor’s dog just told me to cut you into little pieces.”
This definition is one of several indicators psychiatrists use when diagnosing mental illness. To avoid significant controversy, it sidesteps the issue of false beliefs that are widespread in cultures. There are two Bolivian villages where the residents worship and pray to the Marxist atheist revolutionary Che Guevara. They admired him for helping the poor and ended up turning him into a saint.[2] There are also the so-called “cargo cults” of the South Pacific that prophesy the arrival of airplanes that will deliver Western goods to them if they perform certain rituals. They are also convinced that Britain’s Prince Philip or an American serviceman named John Frum is a god.
These beliefs obviously make sense to those who hold them, given their experiences, what they’ve been told, and their limited knowledge of Western culture. To us they are obviously false, yet they’re not much different from many other false beliefs that are rife throughout all cultures. The line between false beliefs and delusions is a vague one.
Of course, the number of people who hold a particular belief has nothing to do with its validity. Facts cannot be determined by votes. No doubt the majority of people in the world believe that vision gives us an accurate view of our surroundings, but the previous chapters should have caused you to question that belief.
British philosopher, logician, and mathematician Bertrand Russell wrote in 1925, “We all have a tendency to think that the world must conform to our prejudices. The opposite view involves some effort of thought, and most people would die sooner than think—in fact they do so”.[3]
The conclusions and judgements that come to us almost automatically—intuition and gut feelings—are often wrong and sometimes very wide of the mark. The same applies to many of the things we’ve been taught, yet we tend to cling to them. We originally accepted them as real, and it takes a lot of thought and consideration to dispel them. Sometimes it takes active research. But usually we don’t have time for that.
In On Being Certain, Robert Burton—former head of the Department of Neurosciences at the University of California San Francisco Mount Zion Hospital—presents current research showing that our feelings of certainty are sensations generated by our minds and are largely beyond our control. While we think certainty is a product of knowledge and reason, studies are finding that it is actually generated by ancient brain areas uninfluenced by logic, reason, or conscious thought. The feeling is so strong that we’re convinced we’re right...even when we’re wrong. Burton wrote that “the feelings of knowing, correctness, conviction, and certainty aren’t deliberate conclusions and conscious choices. They are mental sensations that happen to us.”[4]
People often have difficulty understanding this, but it’s something scientists constantly have to deal with. Many people just accept their convictions at face value. Scientists need to question them. They know that any idea can be proved wrong, but nothing can be proved right. There’s always wiggle-room for doubt.
Of course, some things, such as the second law of thermodynamics, much of quantum mechanics, and evolution, are supported by such a huge body of evidence that there’s essentially no doubt they are right. Yet there’s a nearly nonexistent chance that a discovery could alter that view, and it’s impossible to rule that out. But it would take something really drastic, such as discovering our entire universe is a computer simulation.
Other findings are on shakier ground, so scientists tend to question and reexamine their convictions. Science adjusts to fit the evidence. This is strange for many people, making them feel like they’re standing on shifting sands. That’s just an illusion. Scientists are actually standing on firmer ground than those people are. Those who clutch at false convictions are actually sinking in quicksand without realizing it.
It’s very difficult to question our convictions since our brains reward us for having them. Burton explained that “the brain has provided us with a wide variety of subjective feelings of reward ranging from hunches, gut feelings, intuitions, suspicions that we are on the right track to a profound sense of certainty and utter conviction. And yes, these feelings are qualitatively as powerful as those involved in sex and gambling. One need only look at the self-satisfied smugness of a ‘know it all’ to suspect that the feeling of certainty can approach the power of addiction.[...]
“At bottom, we are pattern recognizers who seek escape from ambiguity and indecision. If a major brain function is to maintain mental homeostasis, it is understandable how stances of certainty can counteract anxiety and apprehension.”[5]
The Effects of Radiation in Outer Space
One day a schizophrenic patient came up to me and very seriously whispered in confidence, “I want to tell you a secret. Radiation in outer space can make you live forever.” The following day I asked him what would happen if you were exposed to radiation in outer space and he instantly exclaimed, “It’ll fry you!” He seemed completely convinced of both statements when he made them, as schizophrenics often are.
While our beliefs rarely change from day to day, they do change over time, and sometimes they even conflict with one another. As a society we treat our captive pets with loving kindness, and yet we eat massive quantities of meat. So much so that there are factory farms to keep up with demand, and fish are rapidly being depleted from the oceans. Or someone may support free speech but want to censor hate speech. Or be against abortion or physician-assisted suicide, but in favor of the death penalty or military actions that result in civilian deaths. And then there’s the belief that wars will lead to peace.
In 2013 a study found that most farmers surveyed in Mississippi, North Carolina, Texas, and Wisconsin didn’t believe in climate change, yet the majority believed that climate change would put some farmers out of business and the remainder would be forced to change their business practices.[6] Even beliefs that are formed with careful consideration sometimes fail to include related questions.
We are all very busy, with many things on our minds, so we’re used to relying on gut feelings, intuition, and what others have told us, sometimes to our detriment. We have no choice; there’s no time to check into everything. But some people are more likely than others to rely on these shortcuts in reasoning. Researchers found that those who jump to conclusions are more likely to choose a bet with a low chance of winning over one with a higher chance. Comparing the impulsive with those who paused for consideration, they found both were influenced by automatic responses that were often contaminated by biases and other mental flaws, but the non-impulsive could overcome these mental contaminants with analysis. For the impulsive, skipping this step was “connected to their problematic beliefs and faulty reasoning.”[7]
Now check out this simple question: “A bat and a ball cost $1.10. The bat costs $1.00 more than the ball. How much does the ball cost?” What would you say is the answer?
The intuitive, gut reaction answer is 10 cents, but you shouldn’t bet on it. If you take the time to think about it, you’ll realize the correct answer is five cents. That bat has to cost a dollar and five cents to cost a dollar more than a five-cent ball. This is one of the questions researchers use to help determine impulsivity.
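The arithmetic behind the correct answer is easy to verify. A minimal sketch (the variable names are mine, used only for illustration):

```python
# Bat-and-ball check: together they cost $1.10, and the bat
# costs exactly $1.00 more than the ball.
ball = 0.05              # the correct answer: five cents
bat = ball + 1.00        # a dollar more than the ball: $1.05
total = bat + ball       # $1.05 + $0.05 = $1.10, as required
assert abs(total - 1.10) < 1e-9

# The intuitive answer fails the same check:
wrong_ball = 0.10
wrong_bat = wrong_ball + 1.00   # $1.10
wrong_total = wrong_bat + wrong_ball
assert abs(wrong_total - 1.20) < 1e-9  # $1.20, ten cents too much
```

With a ten-cent ball, the bat alone would already cost $1.10, pushing the total to $1.20; only a five-cent ball makes the numbers close.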
Other mental shortcuts we use to save time and effort include stereotypes and prejudices, both of which can be harmful to ourselves and others. While mental shortcuts are quick and easy, they make us susceptible to fake news, lies, propaganda, and conspiracy theories. But examining opposing views and researching outside of one’s chosen information sources can be revealing. If anything, it helps to better understand the other side’s point of view.
While relying on quick impressions can cause trouble, there are instances where gut reactions serve us well. Some situations are so complex that they invite overthinking. In these cases, if a fast decision is needed, you’re best off leaving it to your subconscious and going with your gut instinct.[8]
Researchers have found that our opinions, beliefs, and actions are often determined by subtle influences we’re not aware of. For example, first impressions are much more powerful than we think.
In one experiment people were shown photos of politicians they didn’t know and were asked their first impressions. They were then told about the politicians’ competence and positions. The people learned about the politicians the way they normally would during a campaign. While this diluted their initial impression somewhat, their final votes generally matched their first impressions. Of course they weren’t aware of this and thought they had made rational decisions.[9] In addition, psychologist Alexander Todorov of Princeton University has found that our first impressions are usually wrong.[10]
Other studies have shown that most of us are primarily influenced by a politician’s appearance. First impressions or snap judgments also have a long-term impact on our opinion of someone. In one study Todorov and his colleagues were able to accurately predict election results about 70% of the time based on appearance alone.[11] Another study used photographs of fictional candidates and swapped their names, political party, experience, and qualifications. They found people voted based solely on appearance 59% of the time.[12] In the 1960 election, people who heard the presidential debate on the radio judged Richard Nixon the winner, but those who watched it on television thought John F. Kennedy was the winner. Radio listeners heard Kennedy’s high-pitched voice with its New England accent and preferred Nixon’s low voice, while TV viewers saw Kennedy’s robust, tan image and Nixon’s haggard, pasty image. Some think it was these superficial differences that decided the election.
We tell ourselves our votes are based on the candidate’s qualifications and policies, but that’s just our conscious mind trying to justify its subconscious decisions. Since this is all subconscious, there’s no way for us to know it’s happening. We can make a decision without even knowing we’ve made one.
This fits with the idea that we’ve evolved to make rapid decisions with minimal information. For early humans in a dangerous environment, quick decisions were sometimes required and one wrong one could end their lives. This seems to be the reason that immediate decision making—not analytic reasoning—has an oversized influence.
Our feelings can also be deceptive. Our brains try to make sense of things by comparing input with similar experiences from our past, then interpreting it using best guesses and estimations. Our feelings are thus based on a stream of sometimes incomplete information and guesses. This works fairly well most of the time, but not always.
[1] American Psychiatric Association, Diagnostic and Statistical Manual of Mental Disorders, Fifth Edition (DSM-5). Washington, DC: American Psychiatric Association, 2013.
[2] Harry C. Triandis, “Self-deception: An Introduction”, Acta de Investigación Psicológica, vol. 3, no. 2, 2013, pp. 1069-1078, https://www.sciencedirect.com/science/article/pii/S2007471913709529, https://doi.org/10.1016/S2007-4719(13)70952-9.
[3] Bertrand Russell, The ABC of Relativity, New York: Harper & Brothers, 1925, p. 166.
[4] Robert A. Burton, On Being Certain, New York City: Macmillan Publishers/St. Martin’s Press, 2008.
[5] Jonah Lehrer, “The Certainty Bias: A Potentially Dangerous Mental Flaw”, Scientific American, October 9, 2008, http://www.scientificamerican.com/article.cfm?id=the-certainty-bias.
[6] Dan Jones, “Seeing reason”, New Scientist, no. 3102, December 3, 2016, pp. 28-32, https://www.newscientist.com/article/mg23231020-500-changing-minds-how-to-trump-delusion-and-restore-the-power-of-facts/.
[7] Carmen Sanchez and David Dunning, “Leaps of Confusion”, Scientific American, vol. 326, no. 2, February 2022, pp. 68-71, and as “People Who Jump to Conclusions Show Other Kinds of Thinking Errors”, https://www.scientificamerican.com/article/people-who-jump-to-conclusions-show-other-kinds-of-thinking-errors/.
[8] Madeleine Finlay, “A bat and a ball together cost $1.10. The bat costs $1 more than the ball. How much does the ball cost?”, New Scientist, no. 3204, November 17, 2018, pp. 38-41, and as “We’ve got thinking all wrong. This is how your mind really works”, November 14, 2018, https://www.newscientist.com/article/mg24032040-300-weve-got-thinking-all-wrong-this-is-how-your-mind-really-works/.
[9] Joseph T. Hallinan, Why We Make Mistakes, New York: Broadway Books, 2009, p. 69.
[10] Kate Douglas, “Facing up to first impressions”, New Scientist, no. 3138, August 12, 2017, and as “The enigma of the face in forming first impressions”, August 9, 2017, https://www.newscientist.com/article/mg23531380-800-the-enigma-of-the-face-in-forming-first-impressions/.
[11] Alexander Todorov, Anesu N. Mandisodza, Amir Goren, and Crystal C. Hall, “Inferences of Competence from Faces Predict Election Outcomes”, Science, no. 5728, June 10, 2005, pp. 1623-26, https://www.science.org/doi/10.1126/science.1110589, https://doi.org/10.1126/science.1110589.
[12] Shawn W. Rosenberg, Lisa Bohan, Patrick McCafferty, and Kevin Harris, “The Image and the Vote: The Effect of Candidate Presentation on Voter Preference”, American Journal of Political Science, vol. 30, no. 1, February 1986, pp. 108-27.