These posts make more sense when read in order.
Please click here for the first article in this series to enter the rabbit hole.
How do we know what we believe is accurate? Some things are easier to test than others. If you see a large rock alongside a road and wonder whether it’s really there, you can try kicking it really hard. But when it comes to things like patriotic beliefs, chances are much of what we believe is influenced by what we were taught as children, or by what our friends and people we admire falsely believe.
For example, if you live in the United States, chances are you believe the Declaration of Independence was signed on the Fourth of July in 1776, which is why the anniversary of that day is a significant holiday here. In fact, the Second Continental Congress voted for independence on July 2, and most of the delegates didn't sign the Declaration until August 2; July 4 was merely the day the final wording of the document was adopted. Celebrating the Fourth also ignores the roughly ninety earlier state and local “declarations of independence”. I’m sure other countries have their myths too.
Some of the things we believe are actually myths, legends, propaganda, old wives’ tales, and superstitions. Lies travel faster and farther, and stick around longer, than the truth. We all have things we are convinced are real, but have never had a chance to confirm by researching the facts. We don’t usually examine issues deeply, weighing evidence from both sides. That can take too much effort.
[Image: AI degradation from training on AI-generated data. M. Boháček & H. Farid/arXiv 2311.12202v1]
In an interesting study, researchers from the United Kingdom and Canada trained an artificial intelligence (AI) model on AI-generated text and images. Without access to authentic data, its output became more and more distorted until it was producing nonsense. This is concerning because AIs are often trained on the Internet, which is beginning to fill with AI-generated text. By the ninth-generation trial, what was supposed to be an article on England’s church towers had turned into a list of the various colors of jackrabbit tails.[1]
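As a loose analogy only (this is not the cited study's setup, which used full language models), the feedback loop can be sketched with a toy model that is repeatedly re-fitted to its own output. Here each "generation" fits a Gaussian to a small sample drawn from the previous generation's fit; estimation errors compound, and the fitted distribution drifts away from the original data:

```python
# Toy sketch of "model collapse": a model trained only on the output of
# the previous model. Estimation error accumulates across generations.
import random
import statistics

random.seed(0)

def fit(samples):
    # "Train" a model: estimate the mean and standard deviation from data.
    return statistics.mean(samples), statistics.stdev(samples)

# Generation 0: fit to real data (a standard normal distribution).
real_data = [random.gauss(0.0, 1.0) for _ in range(50)]
mu, sigma = fit(real_data)
history = [(mu, sigma)]

for generation in range(1, 10):
    # Later generations never see real data, only the previous model's output.
    synthetic = [random.gauss(mu, sigma) for _ in range(50)]
    mu, sigma = fit(synthetic)
    history.append((mu, sigma))

for g, (m, s) in enumerate(history):
    print(f"generation {g}: mean={m:+.3f} stdev={s:.3f}")
```

Because each generation estimates its parameters from a finite sample of the last, the fitted mean and spread wander randomly rather than staying anchored to the original data, a small-scale echo of how the study's ninth-generation output bore no resemblance to the training text.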
Like us, AI systems take the information they have, collate it into a summary of what they’re going to produce, then fill in the blanks with assumptions and educated guesses. AIs can even develop prejudices that people had unknowingly allowed to enter the data.[2] Programmers have a very difficult time removing such biases from AI programs, just as we have a hard time removing them from our own thoughts. Sometimes the best we can do is to be aware that we are all influenced by biases and prejudices and to be on the lookout for our own.
What struck me about this experiment is how similar it is to the way our brains work when they create our impressions of reality. If we don’t actively seek out accurate information, our beliefs, prejudices, and thought processes can easily run amok. Perhaps this is how some people become so thoroughly wrapped up in strange religions, conspiracy theories, and ideas with no basis in reality, without even realizing it. Convinced by feedback loops instead of evidence, they drift off into a world of lies, propaganda, and fantasy.
Things are not always as they appear to be. Neither are they usually the way we’d like them to be, nor are they the way they should be, nor even the way we think they are.
There are numerous discoveries that go against what we’ve been taught. For example, researchers found that salt doesn’t make you thirsty, as we’ve been told all of our lives. It actually makes you retain water, and it makes you hungry.[3] But it’s still not good to drink a lot of salt water: your kidneys need more water to flush out the excess salt than the salt water provides, so it dehydrates you. Commonly accepted notions and our preconceived beliefs often turn out to be wrong. Findings like these challenge our assumptions and what we’ve been taught, but they’re key to gaining a more accurate understanding of the world.
Think Again
How many of these things do you know?
Tomatoes are not vegetables, coconuts are not nuts, pineapples have nothing to do with pines or apples, and sugar doesn’t make kids hyper. Watermelons and eggplants (aubergines) are berries; almonds, pecans, and cashews are seeds; and peanuts are legumes, like beans. Banana plants are herbs that are related to ginger, although the bananas themselves are berries.
Eating before swimming won’t give you cramps. Diet aids don’t detoxify you; your liver and kidneys do, and the aids themselves could cause you harm.[4] Drinking alcohol won’t warm you. It makes your body temperature fall, although you may feel warmer. And it doesn’t kill brain cells either. Waking up sleepwalkers won’t hurt them and might save them from harm.
Toads can’t cause warts and their bumps aren’t warts. It’s papillomaviruses that cause warts in humans. Bats aren’t blind. They see just fine. Ostriches don’t stick their heads in sand. And scientists do understand how bees fly; the aerodynamics are complicated, but they have been modeled and reproduced.
Lightning can strike twice in the same place. Ivy helps protect buildings. The direction water swirls down a drain has nothing to do with which hemisphere you’re in. (The Coriolis effect is far too weak to affect water in drains.) And people knew the earth was round long before Columbus ever set sail: ancient Greek philosophers, beginning with Pythagoras in the 6th century b.c., knew it, as did medieval Europeans and Columbus’s own contemporaries.
The sun does not move across the sky—it just appears that way from our perspective. If you drop a light and a heavy object—say your cell phone and your car—at the same time, they will hit the ground simultaneously (air resistance aside).
Humans did not live at the same time as dinosaurs. The non-avian dinosaurs went extinct about 66 million years ago and the first humans didn’t appear until around 2.4 million years ago, leaving a gap of more than 60 million years in between.
Left or right brained? No. Normal people use both sides equally, although areas on one side or the other do specialize in certain tasks. The vacuum of space won’t make you explode. And not everything dies. But you do have to pay taxes, unless you’re a billionaire.
Throughout our lives we’re told by authorities—politicians, teachers, parents, celebrities, newscasters, supposed experts, and others—a lot of things that are questionable. As kids and teens we’re told about the Tooth Fairy, the Easter Bunny, Santa Claus, Krampus, Nessie, Bigfoot, Moth Man, Slender Man, killer clowns, vampires, poltergeists, chupacabras, and the monster under your bed.
Television and social media also bring us a mix of fact and fiction. Most of the fiction is obviously that, but some is disguised as fact. There are even fake documentaries on mermaids, megalodon, extraterrestrials, and various cryptids. Many of the ghost hunter reality shows border on being deceptive. They certainly pretend to be scientific, when they’re not even close. We’re also told about things that are real, but that we’ll never experience ourselves.
There are many people out there who pass on information that they know is false. Perhaps they think it’s funny or just enjoy pulling the wool over someone’s eyes. Others get a feeling of superiority from knowing they fooled someone. And there are those who do it to support something they believe in, figuring it’s for a good cause, even though it’s a lie.
When President Richard Nixon’s lies were exposed, the public was outraged. Now, fifty years later, politicians routinely spew bald-faced lies and their supporters accept them as if they were the truth because it’s what they want to hear, or else they just brush them aside for what they feel is the greater good, ignoring the damage it causes.
During the 4th century b.c. in ancient Greece, Demosthenes said, “What a man desires, he also imagines to be true.” More than one-and-a-half millennia later, in 13th-century Italy, the Catholic priest St. Thomas Aquinas echoed this, saying, “The light of faith makes us see what we believe.” Not quite a millennium after that, it’s still true today. Most people believe what they want to believe. Evidence be damned. It’s why there are still so many fools in the world spouting idiotic ideas. It’s not hard to find some. Just look at politics. As Will Rogers put it, “It isn’t what we don’t know that gives us trouble. It’s what we know that ain’t so.”
University of California, Irvine psychologist Peter Ditto says, “People don’t think like scientists; they think like lawyers. They hold the belief they want to believe and then they recruit anything they can to support it.[...] We almost never think about things without some preference in mind or some emotional inclination to want one thing or another. That’s the norm.”[5]
In one study, experimenters gave people accurate statistics on immigration along the southern border of the United States. They found that the subjects misremembered the figures, molding them to fit their own opinions. As the information is passed along, it moves farther away from the truth. These people unknowingly created false information.[6] Another study found that some people will spread fake news even after they’re told it’s fake.[7]
Perhaps we don’t have as much choice in what we believe as we think we do. Experimenters have found that whether someone is conservative or liberal is about 40% determined by genetics, with the remainder attributed to environment.[8]
Most people have set beliefs, and when presented with contradictory facts, they will move their arguments into non-factual territory, such as morality, political belief, or simple principle.[9] Some will argue even more strongly for their beliefs.[10]
Eryn Newman, a cognitive psychologist at the University of Southern California in Los Angeles, said, “I can say broadly when people have strong beliefs about something, it’s difficult to unwind those beliefs, regardless of how strong the evidence is.”[11]
We tend to think our understanding of people, events, and social situations is objective truth, when it’s really our interpretation. This is called “naive realism”.
“We tend to have irrational confidence in our own experiences of the world, and to see others as misinformed, lazy, unreasonable or biased when they fail to see the world the way we do,” explains Matthew Lieberman, a psychology professor at the University of California, Los Angeles. “When others see the world differently than we do, it can serve as an existential threat to our own contact with reality and often leads to anger and suspicion about the others.[...] We believe we have merely witnessed things as they are, which makes it more difficult to appreciate, or even consider, other perspectives.”[12]
Still, beliefs can change. Racists have become anti-racist, liberals have become conservatives, and cult members have escaped their cults. And it’s not always because evidence convinces us we were wrong. A study of debate teams found that participants came to believe the side of the argument they were assigned was actually right, even though the side was chosen for them at random and often went against their own beliefs. This was in spite of being given incentives for accuracy, and their new-found positions persisted even after exposure to opposing views.[13] In other words, they persuaded themselves into believing a position that was chosen for them at random. Apparently there are times when other concerns are more important to us than the truth—sometimes just fitting in with friends, or maintaining one’s group identity, such as belonging to a club or a particular political party.
English novelist George Orwell insightfully wrote that “we are all capable of believing things which we know to be untrue, and then, when we are finally proved wrong, impudently twisting the facts so as to show that we were right. Intellectually, it is possible to carry on this process for an indefinite time: the only check on it is that sooner or later a false belief bumps up against solid reality, usually on a battlefield.”[14]
Believing Impossible Things
“I ca’n’t believe that!” said Alice.
“Ca’n’t you?” the Queen said in a pitying tone. “Try again: draw a long breath, and shut your eyes.”
Alice laughed. “There’s no use trying,” she said: “one ca’n’t believe impossible things.”
“I daresay you haven’t had much practice,” said the Queen. “When I was your age, I always did it for half-an-hour a day. Why, sometimes I’ve believed as many as six impossible things before breakfast.”
—Alice and the White Queen in Through the Looking-Glass, and What Alice Found There, 1871
Viral fake news has been described as “digital catnip” for people. Often its primary purpose is simply to get you to click on it, no matter whether it’s exaggerated or untrue. With so much click-bait, manipulation, and lying from so many sources, you can only treat “facts” as provisionally true until you can confirm them from a reliable source. But then the problem arises as to what counts as a reliable source.
Many people have serious trouble with that last step. The sources they think are reliable are anything but, and they are quickly led into an alternate fantasy “reality” by people and corporations with ulterior motives. In addition, psychologists and advertisers know that people accept information, judgments, and opinions more readily from close friends than from any other source.
It’s fair to say we all have false beliefs, we just have no idea which of our beliefs are false.
The Anglo-Irish satirist Jonathan Swift wrote in 1710, “Falsehood flies, and the truth comes limping after it.” Lies spread quickly while the truth lags behind, and I’ve found that most people prefer the lies. Lies are simple and black and white; they are created or molded to resonate with their audience and to fit a narrative. The truth is usually complex and nuanced, full of gray areas, and because it’s complicated, it spreads more slowly and can be harder to understand. There have been times when I’ve presented people with undeniable facts, and yet they simply disregarded them—pretending they don’t exist. I don’t understand it, perhaps because I place a high value on the truth.
I am constantly casting aside beliefs or making adjustments to them as new information comes to light. I see that as a good thing. It makes my beliefs and ideas more accurate. Many people don’t seem to be able to do that. They seem to find it personally threatening, as if changing a belief is a sign of weakness. Actually, it’s holding onto beliefs in spite of the facts that’s a sign of weakness. As economist John Maynard Keynes is quoted as saying, “When I find new information I change my mind; what do you do?”[15]
We all prefer information that is definite, unequivocal, and unambiguous. We don’t want to know that sugar might cause obesity. We want to know for sure. This is what evolutionary biologist and author Richard Dawkins refers to as the tyranny of the discontinuous mind. Things are complicated and obesity has a number of causes. Reality is rarely black and white. Just about everything comes in shades of gray and is usually ambiguous.
Still, there are people who insist on definite answers. For them, I recommend mathematics. Outside of math, you don’t often find that in the real world.
It’s important to examine things from a skeptical and critical perspective. Initially this takes some effort, but over time it becomes almost second nature. It doesn’t mean you’ll never be fooled again, but you are less likely to be led astray—as long as you can admit when you were wrong and make the proper adjustments to your thinking and opinions. We also need to actively evaluate our sources to make sure they are accurate and are not just feeding us opinions.
This is all part of the scientific method. Yet this doesn’t make you immune. Law and psychology professor Dan Kahan at Yale University found that the better you are at handling scientific information, the easier it is to confirm your own biases and dismiss inconvenient truths that contradict your beliefs. This is known as motivated reasoning and is actually more like a lawyer arguing a case than a scientist trying to discern the truth.
Applying scientific principles objectively tends to bring conservatives and liberals closer together, converging on the facts. Kahan and his associates also found that the scientifically curious were open to reading views that were opposed to their own.[16]
It’s very strange that the creator of Sherlock Holmes, Sir Arthur Conan Doyle, wrote so convincingly of rational reasoning in his fiction, yet had such strong beliefs in the supernatural. He was convinced until the end of his life that the Cottingley fairy photographs showed real fairies captured on film. Many years later the two girls who took the pictures revealed how they did it, using hand-drawn cutouts stuck on hat pins.
Even Nobel-prize winners sometimes end up promoting irrational ideas. Sometimes referred to as the Nobel Disease, this shows that intelligence and scientific brilliance do not protect you from being wrong.[17] Nor can intelligence protect you from becoming the victim of unsophisticated cons.[18] And while we may be able to clearly see flaws in the reasoning of others, our bias blind spot prevents us from seeing them in ourselves. That blind spot may even be more prevalent in the intelligent.[19] In addition, intelligence is not always associated with reasoning ability.
The famous magician and exposer of fraud, James Randi—also known as The Amazing Randi—wrote, “Don’t be too sure of yourself. No matter how smart or well educated you are, you can be deceived.”[20] This doesn’t just apply to magic...it applies to everything—even your field of expertise.
All of this should give us a little humility, which is something many scientists develop. As I mentioned earlier, the more you learn, the more you realize how much you don’t know.
As people get older they become more confident in what they know, but it is the younger ones who are still learning that notice things and say, “Wait a minute! That’s not right!” It’s the younger scientists who make most of the discoveries. One of the reasons it’s older scientists who receive Nobels is that many years pass between the discoveries and their awards. Theoretical physicist John Wheeler once said of his first PhD student, Richard Feynman, “The reason universities have students is so they can teach the professors”.
Or, as Carl Sagan put it, “In a lot of scientists, the ratio of wonder to skepticism declines in time. That may be connected with the fact that in some fields—mathematics, physics, some others—the great discoveries are almost entirely made by youngsters.”[21]
Click here for the next article in this series:
[1] Elizabeth Gibney, “AI models fed AI-generated data quickly spew nonsense”, Nature, July 24, 2024, https://www.nature.com/articles/d41586-024-02420-7, citing Ilia Shumailov, Zakhar Shumaylov, Yiren Zhao, Nicolas Papernot, Ross Anderson, and Yarin Gal, “AI models collapse when trained on recursively generated data”, Nature, vol. 631, 2024, pp. 755–759, https://doi.org/10.1038/s41586-024-07566-y.
[2] James Zou and Londa Schiebinger, “AI can be sexist and racist — it’s time to make it fair”, Nature, July 18, 2018, https://www.nature.com/articles/d41586-018-05707-8.
[3] Max Delbrück Center for Molecular Medicine in the Helmholtz Association, “Mission control: Salty diet makes you hungry, not thirsty: New studies show that salty food diminishes thirst while increasing hunger, due to a higher need for energy”, ScienceDaily, April 17, 2017, https://www.sciencedaily.com/releases/2017/04/170417182920.htm.
[4] “ ‘Detoxes’ and ‘Cleanses’: What You Need To Know”, National Institutes of Health, https://www.nccih.nih.gov/health/detoxes-and-cleanses-what-you-need-to-know.
[5] Christie Aschwanden, “A User's Guide to Rational Thinking”, Discover Magazine, vol. 36, no. 6, May 27, 2015, pp. 44-49, https://www.discovermagazine.com/mind/a-users-guide-to-rational-thinking.
[6] Ohio State University, “You create your own false information, study finds: People misremember numerical facts to fit their biases”, ScienceDaily, December 9, 2019, www.sciencedaily.com/releases/2019/12/191209080457.htm, citing Jason C. Coronel, Shannon Poulsen, and Matthew D. Sweitzer, “Investigating the generation and spread of numerical misinformation: A combined eye movement monitoring and social transmission approach”, Human Communication Research, 2019, https://doi.org/10.1093/hcr/hqz012.
[7] Asher Lawson and Hemant Kakkar, “Personality Type, as well as Politics, Predicts Who Shares Fake News”, Scientific American, November 12, 2021, https://www.scientificamerican.com/article/personality-type-as-well-as-politics-predicts-who-shares-fake-news/.
[8] Marta Zaraska, “The Genes of Left and Right”, Scientific American Mind, May 1, 2016, https://www.scientificamerican.com/article/the-genes-of-left-and-right/.
[9] Troy Campbell and Justin Friesen, “Why People ‘Fly from Facts’ ”, Scientific American, March 3, 2015, https://www.scientificamerican.com/article/why-people-fly-from-facts/.
[10] Ed Yong, “When in doubt, shout—why shaking someone’s beliefs turns them into stronger advocates”, National Geographic, October 19, 2010, https://www.nationalgeographic.com/science/article/when-in-doubt-shout-why-shaking-someones-beliefs-turns-them-into-stronger-advocates.
[11] Bethany Brookshire, “Sometimes busting myths can backfire”, Science News, February 14, 2016, www.sciencenews.org/blog/scicurious/sometimes-busting-myths-can-backfire.
[12] University of California - Los Angeles, “Why people don't view the world the same way others do”, ScienceDaily, June 9, 2022, https://www.sciencedaily.com/releases/2022/06/220609132011.htm, citing Matthew Lieberman, “Seeing Minds, Matter, and Meaning: The CEEing Model of Pre-Reflective Subjective Construal”, Psychological Review, vol. 129, no. 4, July 2022, pp. 830–872, https://doi.org/10.1037/rev0000362.
[13] Carnegie Mellon University. “Polarized speech: A function of self-persuasion”, ScienceDaily, April 1, 2022, https://www.sciencedaily.com/releases/2022/04/220401122233.htm, citing Peter Schwardmann, Egon Tripodi, and Joël J. van der Weele, “Self-Persuasion: Evidence from Field Experiments at International Debating Competitions”, American Economic Review, vol. 112, no. 4, 2022, pp. 1118-46, https://doi.org/10.1257/aer.20200372.
[14] George Orwell, “In Front of Your Nose,” London Tribune, March 22, 1946, https://www.orwellfoundation.com/the-orwell-foundation/orwell/essays-and-other-works/in-front-of-your-nose/.
[15] James S. Earley, review of The Collected Writings of John Maynard Keynes in Journal of Economic Literature, vol. 17, no. 2, June 1979, p. 540.
See also Quote Investigator, “When the Facts Change, I Change My Mind. What Do You Do, Sir?”, https://quoteinvestigator.com/2011/07/22/keynes-change-mind/.
[16] Dan Jones, “Seeing reason”, New Scientist, no. 3102, December 3, 2016, pp. 28-32, https://www.newscientist.com/article/mg23231020-500-changing-minds-how-to-trump-delusion-and-restore-the-power-of-facts/.
[17] Candice Basterfield, Scott O. Lilienfeld, Shauna M. Bowes, and Thomas H. Costello, “The Nobel Disease: When Intelligence Fails to Protect against Irrationality”, Skeptical Inquirer, vol. 44, no. 3, May/June 2020, https://skepticalinquirer.org/2020/05/the-nobel-disease-when-intelligence-fails-to-protect-against-irrationality/.
[18] Maria Konnikova, The Confidence Game, Edinburgh, UK: Canongate Books, 2016.
[19] R.F. West, R.J. Meserve, and K.E. Stanovich, “Cognitive sophistication does not attenuate the bias blind spot”, Journal of Personality and Social Psychology, vol. 103, no. 3, 2012, pp. 506–19, https://doi.org/10.1037/a0028857.
[20] Philip Ball, “James Randi (1928–2020)”, Nature, vol. 587, October 29, 2020, p. 34, https://www.nature.com/articles/d41586-020-03050-5, https://doi.org/10.1038/d41586-020-03050-5.
[21] Psychology Today staff, “Carl Sagan” (interview), Psychology Today, January/February 1996, p. 30, https://www.psychologytoday.com/us/articles/199601/carl-sagan.