Stop for a second and answer these simple questions:
In the Bible, which animal swallowed Jonah?
How many animals of each type did Moses bring into the ark?
If you are like most people, you answered "whale" to the first question and "two" to the second. Very few people realize that, according to the Bible, it was Noah, not Moses, who built the ark.
This phenomenon is known as the "Moses Illusion," and it has profound implications for daily life because it reflects our inability to detect errors in the world around us. Even when we know the correct information, we tend to overlook mistakes.
Blindness to errors
In 1981, two psychologists, Thomas D. Erickson of the University of California and Mark E. Mattson of the State University of New York, found that 80% of people did not notice the error in the questions.
Curiously, participants failed to notice the error even when they were warned that some questions might contain mistakes, or when time pressure was removed so they could think more calmly.
Duke University psychologists took it a step further by replicating the experiment, but highlighting some important data in red that participants needed to evaluate more carefully. The results were disastrous.
Most people still failed to notice the error; worse, in a subsequent test they reproduced the incorrect data in their own answers, indicating that they had incorporated it into their worldview.
The striking part is that the psychologists had assessed the participants' knowledge a few days before the test, and they had answered correctly. This suggests that even when we don't consciously register the wrong details, our minds take note of them and fold them into our knowledge system.
Everything is true until proven otherwise
We all think we are smart enough to spot an error or a piece of false information and refuse to believe it. In reality, anyone can be deceived. The Moses Illusion is rooted in the way we process information.
Spinoza hypothesized that when we are faced with an idea, instead of following a logical evaluation path to accept or reject it, we automatically accept it. Rejection would be a second step that requires more cognitive effort.
Science supports his hypothesis. Researchers at the University of Texas asked a group of people to act as judges and decide the prison sentences for two criminals. The "trap" was that the police reports mixed true and false statements, each printed in a different color.
Although participants were warned that the reports contained false information and were told which statements were false, they recommended nearly twice as many years in prison when the false statements made the crime seem more severe. This shows that we initially accept what we read or hear as true, and only after reflection can we classify it as false.
Why are we biased toward believing?
The truth-default theory
We are all prone to what is known as "truth bias," which occurs regardless of the source of the information or previous knowledge we have.
Based on the Truth-Default Theory (TDT), we always assume that others are honest. We do not think of deception as a possibility in communication until we have clues that make us doubt. In fact, a University of Alabama study indicates that our accuracy in detecting lies is less than 50%.
This initial tendency to regard statements as true is likely a propensity that facilitates communication. After all, it is much easier to assume that the person in front of us is telling the truth than to run everything they say through a "lie detector."
In fact, we do not fall for the Moses Illusion when the information is patently wrong. Northwestern University psychologists found that we place less confidence in implausible inaccuracies than in plausible ones. So if we were asked, "How many animals of each type did Kennedy bring on the ark?", we would notice the error immediately. The problem arises when the false information is plausible.
Is it possible to escape the Moses Illusion?
Having more experience or deeper knowledge of certain topics leaves us better prepared to detect errors, falsehoods and misinformation. A study conducted at Duke University, for example, found that history students detect historical errors better than biology students do, and vice versa. However, prior knowledge is not enough, because we often fail to use it.
An experiment conducted at Vanderbilt University found that the most effective way to reduce the Moses Illusion is to behave as if we were fact-checking: adopt a critical attitude from the start and verify every piece of information.
It is a considerable cognitive effort, but activating our critical thinking is the only way to protect ourselves from manipulation, deception and misinformation.