
You’re Extremely Gullible

2020-04-10 Lisa Fazio

英语世界 (English World), 2020, Issue 3
Keywords: Moses, misleading, Ark


Why humans stink at finding falsehoods.

Here’s a quick quiz for you:

· In the biblical story, what was Jonah swallowed by?

· How many animals of each kind did Moses take on the Ark?

Did you answer “whale” to the first question and “two” to the second? Most people do… even though they’re well aware that it was Noah, not Moses, who built the ark in the biblical story.

Psychologists like me call this phenomenon the Moses Illusion. It’s just one example of how people are very bad at picking up on factual errors in the world around them. Even when people know the correct information, they often fail to notice errors and will even go on to use that incorrect information in other situations.

Research from cognitive psychology shows that people are naturally poor fact-checkers and it is very difficult for us to compare things we read or hear to what we already know about a topic. In what’s been called an era of “fake news,” this reality has important implications for how people consume journalism, social media, and other public information.

Failing to notice what you know is wrong

The Moses Illusion has been studied repeatedly since the 1980s. It occurs with a variety of questions and the key finding is that—even though people know the correct information—they don’t notice the error and proceed to answer the question.

In the original study, 80 percent of the participants failed to notice the error in the question despite later correctly answering the question “Who was it that took the animals on the Ark?”

The Moses Illusion demonstrates what psychologists call knowledge neglect—people have relevant knowledge, but they fail to use it.

One way my colleagues and I have studied this knowledge neglect is by having people read fictional stories that contain true and false information about the world. For example, one story is about a character’s summer job at a planetarium. Some information in the story is correct: “Lucky me, I had to wear some huge old space suit. I don’t know if I was supposed to be anyone in particular—maybe I was supposed to be Neil Armstrong, the first man on the moon.” Other information is incorrect: “First I had to go through all the regular astronomical facts, starting with how our solar system works, that Saturn is the largest planet, etc.”

Later, we give participants a trivia test with some new questions (Which precious gem is red?) and some questions that relate to the information from the story (What is the largest planet in the solar system?). We reliably find positive effects of reading the correct information within the story—participants are more likely to answer “Who was the first person to step foot on the moon?” correctly. We also see negative effects of reading the misinformation—participants are less likely to recall that Jupiter is the largest planet and more likely to answer with Saturn.

These negative effects of reading false information occur even when the incorrect information directly contradicts people’s prior knowledge. In one study, my colleagues and I had people take a trivia test two weeks before reading the stories. Thus, we knew what information each person did and did not know. Participants still learned false information from the stories they later read. In fact, they were equally likely to pick up false information from the stories whether or not it contradicted their prior knowledge.

Can you improve at noticing incorrect info?

So people often fail to notice errors in what they read and will use those errors in later situations. But what can we do to prevent this influence of misinformation?

Expertise or greater knowledge seems to help, but it doesn’t solve the problem. Even biology graduate students will attempt to answer distorted questions such as “Water contains two atoms of helium and how many atoms of oxygen?”

Many of the interventions my colleagues and I have implemented to try to reduce people’s reliance on the misinformation have failed or even backfired. One initial thought was that participants would be more likely to notice the errors if they had more time to process the information. So, we presented the stories in a book-on-tape format and slowed down the presentation rate. But instead of using the extra time to detect and avoid the errors, participants were even more likely to produce the misinformation from the stories on a later trivia test.

Next, we tried highlighting the critical information in a red font. We told readers to pay particular attention to the information presented in red with the hope that paying special attention to the incorrect information would help them notice and avoid the errors. Instead, they paid additional attention to the errors and were thus more likely to repeat them on the later test.

The one thing that does seem to help is to act like a professional fact-checker. When participants are instructed to edit the story and highlight any inaccurate statements, they are less likely to learn misinformation from the story. Similar results occur when participants read the stories sentence by sentence and decide whether each sentence contains an error.

It’s important to note that even these “fact-checking” readers miss many of the errors and still learn false information from the stories. For example, in the sentence-by-sentence detection task participants caught about 30 percent of the errors. But given their prior knowledge they should have been able to detect at least 70 percent.

Quirks of psychology make us miss mistakes

Why are human beings so bad at noticing errors and misinformation? Psychologists believe that there are at least two forces at work.

First, people have a general bias to believe that things are true. (After all, most things that we read or hear are true.) In fact, there’s some evidence that we initially process all statements as true and that it then takes cognitive effort to mentally mark them as false.

Second, people tend to accept information as long as it’s close enough to the correct information. Natural speech often includes errors, pauses, and repeats. (“She was wearing a blue—um, I mean, a black, a black dress.”) One idea is that to maintain conversations we need to go with the flow—accept information that is “good enough” and just move on.

And people don’t fall for these illusions when the incorrect information is obviously wrong. For example, people don’t try to answer the question “How many animals of each kind did Nixon take on the Ark?” and people don’t believe that Pluto is the largest planet after reading it in a fictional story.

Detecting and correcting false information is difficult work and requires fighting against the ways our brains like to process information. Critical thinking alone won’t save us. Our psychological quirks put us at risk of falling for misinformation, disinformation and propaganda. Professional fact-checkers provide an essential service in hunting out incorrect information in the public view. As such, they are one of our best hopes for zeroing in on errors and correcting them, before the rest of us read or hear the false information and incorporate it into what we know of the world.


(Translator’s affiliation: Shanghai University)
