
Why Smart People Outsmart Themselves

Published: 2012-06-13

Here is a simple arithmetic problem: a bat and a ball together cost one dollar and ten cents. The bat costs one dollar more than the ball. How much does the ball cost? Most people answer immediately and confidently that the ball costs ten cents. That answer is obviously wrong. (The correct answer: the ball costs five cents and the bat costs one dollar and five cents.)
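To spell out the arithmetic behind the correct answer, write b for the price of the ball and t for the price of the bat, in dollars (notation introduced here only for illustration); the two stated conditions give a small system of equations:

$$ b + t = 1.10, \qquad t = b + 1.00 \;\Rightarrow\; 2b + 1.00 = 1.10 \;\Rightarrow\; b = 0.05, \quad t = 1.05. $$

The intuitive answer fails the same check: a ten-cent ball and a bat costing a dollar more would total $1.20, not $1.10.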

For more than fifty years, Daniel Kahneman, a Nobel laureate and professor of psychology at Princeton University, has been asking people this same kind of question and analyzing their answers. His seemingly simple experiments have profoundly changed the way we think about thinking. For centuries, philosophers, economists, and social scientists assumed that human beings are rational creatures, and that reason is a gift unique to our species; but through their research Kahneman and his colleague Amos Tversky showed that we are not nearly as rational as we like to believe.

When people face an uncertain situation, they do not carefully analyze the information or look up the relevant statistics. Instead, they rely on whatever mental shortcuts come to mind, and those shortcuts often lead them to the wrong conclusion. These shortcuts are not a faster way of doing the math; they are a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and simply give the answer that first comes to mind.

Although Kahneman is now recognized as one of the most influential psychologists of the twentieth century, for years his work attracted little attention. Kahneman recalls that one eminent American philosopher, on hearing about his research, immediately shook his head and said, “I am not interested in the psychology of stupidity.”

That philosopher, it turns out, had reason to regret those words. A paper by Richard West of James Madison University and Keith Stanovich of the University of Toronto, recently published in the Journal of Personality and Social Psychology, reports that in many cases smarter people are actually more prone to these flawed patterns of thinking. We tend to assume that intelligence guards against bias, which is why people with high S.A.T. scores believe they are unlikely to make these common mistakes, yet the findings say otherwise. West and his colleagues gave 482 undergraduates a questionnaire made up of classic bias problems. Here is one of the sample questions:

In a lake there is a patch of lily pads. Every day the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long does it take to cover half of the lake?

Your first instinct is probably to take a shortcut and simply divide 48 by 2, answering 24 days. But that answer is wrong. The correct answer is 47 days.
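The doubling rule is what defeats the divide-by-two shortcut. As a quick worked check (writing A(d), introduced here only for illustration, for the area of the patch on day d):

$$ A(d) = 2\,A(d-1) \quad\Rightarrow\quad A(47) = \tfrac{1}{2}\,A(48), $$

so the patch covers half the lake exactly one day before it covers the whole lake, on day 47 rather than day 24.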

West also gave the students a puzzle that measures how strongly people are affected by “anchoring bias,” a test Kahneman and Tversky had run back in the nineteen-seventies. Subjects were first asked whether the tallest redwood tree in the world is more than X feet tall, with X ranging from 85 to 1,000 feet. They were then asked to estimate the actual height of the world's tallest redwood. Students given a low anchor, such as 85 feet, guessed on average that the tallest tree was only 118 feet tall. When the anchor was raised to 1,000 feet, their estimates grew roughly seven-fold.

But the point of West and his colleagues' study was not to confirm these known biases; they wanted to understand how the biases relate to intelligence. So they paired the bias problems with various cognitive measures, including S.A.T. scores and the Need for Cognition Scale, which assesses how much an individual engages in and enjoys thinking. The results were deeply unsettling. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” Kahneman is not surprised by this result. In “Thinking, Fast and Slow” he writes that even the landmark research of his own era has done little to improve his own performance: “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”, the tendency to underestimate how long a task will take, “as it was before I made a study of these issues.”

Perhaps the most harmful bias of all is our tendency to assume that other people are more likely than we are to make thinking errors, which is known as the “bias blind spot.” This meta-bias rests on our knack for spotting the systematic errors in other people's decisions (we readily notice our friends' flaws) while failing to see the same flaws in ourselves. The bias blind spot is not a new idea, but West's latest paper shows that it appears in every single bias the team examined. In each case, we take our own thinking to be sound while picking apart the thinking of others.

And there is something else disquieting: intelligence can make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” The pattern held across the various biases: smarter people (at least as measured by their S.A.T. scores) and those more given to deliberation were somewhat more vulnerable to these common mental mistakes. Education does not fix the problem either; as Kahneman and Shane Frederick noted years ago, more than fifty percent of the students at Harvard, Princeton, and M.I.T. answered the bat-and-ball question incorrectly.

How can this result be explained? One provocative hypothesis is that the bias blind spot arises from the difference between how we evaluate others and how we evaluate ourselves. When we consider the irrational choices of a stranger, for instance, we have to rely on behavioral information; we see their biases from the outside, which lets us spot their systematic thinking errors. When we look at our own bad choices, however, we engage in elaborate introspection. We scrutinize our motivations and search for supporting reasons; we lament our mistakes, look for ways to correct them, and ruminate on what we should have done.

The trouble with this introspective approach is that the real drivers of biased thinking, the reasons we act irrationally, are largely unconscious: self-reflection cannot see them, and intelligence cannot overcome them. In fact, introspection can make matters worse, blinding us to the primal processes responsible for many of our everyday failings. We give ourselves convincing reasons, but those reasons miss the point. The harder we try to know ourselves, the less we actually understand.

Research Shows That the Smarter People Are, the More Susceptible They Are to Cognitive Bias

Published: 2012-06-13

Here’s a simple arithmetic question: A bat and ball cost a dollar and ten cents. The bat costs a dollar more than the ball. How much does the ball cost?

The vast majority of people respond quickly and confidently, insisting the ball costs ten cents. This answer is both obvious and wrong. (The correct answer is five cents for the ball and a dollar and five cents for the bat.)

For more than five decades, Daniel Kahneman, a Nobel Laureate and professor of psychology at Princeton, has been asking questions like this and analyzing our answers. His disarmingly simple experiments have profoundly changed the way we think about thinking. While philosophers, economists, and social scientists had assumed for centuries that human beings are rational agents—reason was our Promethean gift—Kahneman and his scientific partner, the late Amos Tversky, demonstrated that we’re not nearly as rational as we like to believe.

When people face an uncertain situation, they don’t carefully evaluate the information or look up relevant statistics. Instead, their decisions depend on a long list of mental shortcuts, which often lead them to make foolish decisions. These shortcuts aren’t a faster way of doing the math; they’re a way of skipping the math altogether. Asked about the bat and the ball, we forget our arithmetic lessons and instead default to the answer that requires the least mental effort.

Although Kahneman is now widely recognized as one of the most influential psychologists of the twentieth century, his work was dismissed for years. Kahneman recounts how one eminent American philosopher, after hearing about his research, quickly turned away, saying, “I am not interested in the psychology of stupidity.”

The philosopher, it turns out, got it backward. A new study in the Journal of Personality and Social Psychology led by Richard West at James Madison University and Keith Stanovich at the University of Toronto suggests that, in many instances, smarter people are more vulnerable to these thinking errors. Although we assume that intelligence is a buffer against bias—that’s why those with higher S.A.T. scores think they are less prone to these universal thinking mistakes—it can actually be a subtle curse.

West and his colleagues began by giving four hundred and eighty-two undergraduates a questionnaire featuring a variety of classic bias problems. Here’s an example:

In a lake, there is a patch of lily pads. Every day, the patch doubles in size. If it takes 48 days for the patch to cover the entire lake, how long would it take for the patch to cover half of the lake?

Your first response is probably to take a shortcut, and to divide the final answer by half. That leads you to twenty-four days. But that’s wrong. The correct solution is forty-seven days.

West also gave a puzzle that measured subjects’ vulnerability to something called “anchoring bias,” which Kahneman and Tversky had demonstrated in the nineteen-seventies. Subjects were first asked if the tallest redwood tree in the world was more than X feet, with X ranging from eighty-five to a thousand feet. Then the students were asked to estimate the height of the tallest redwood tree in the world. Students exposed to a small “anchor”—like eighty-five feet—guessed, on average, that the tallest tree in the world was only a hundred and eighteen feet. Given an anchor of a thousand feet, their estimates increased seven-fold.

But West and colleagues weren’t simply interested in reconfirming the known biases of the human mind. Rather, they wanted to understand how these biases correlated with human intelligence. As a result, they interspersed their tests of bias with various cognitive measurements, including the S.A.T. and the Need for Cognition Scale, which measures “the tendency for an individual to engage in and enjoy thinking.”

The results were quite disturbing. For one thing, self-awareness was not particularly useful: as the scientists note, “people who were aware of their own biases were not better able to overcome them.” This finding wouldn’t surprise Kahneman, who admits in “Thinking, Fast and Slow” that his decades of groundbreaking research have failed to significantly improve his own mental performance. “My intuitive thinking is just as prone to overconfidence, extreme predictions, and the planning fallacy”—a tendency to underestimate how long it will take to complete a task—“as it was before I made a study of these issues,” he writes.

Perhaps our most dangerous bias is that we naturally assume that everyone else is more susceptible to thinking errors, a tendency known as the “bias blind spot.” This “meta-bias” is rooted in our ability to spot systematic mistakes in the decisions of others—we excel at noticing the flaws of friends—and inability to spot those same mistakes in ourselves. Although the bias blind spot itself isn’t a new concept, West’s latest paper demonstrates that it applies to every single bias under consideration, from anchoring to so-called “framing effects.” In each instance, we readily forgive our own minds but look harshly upon the minds of other people.

And here’s the upsetting punch line: intelligence seems to make things worse. The scientists gave the students four measures of “cognitive sophistication.” As they report in the paper, all four of the measures showed positive correlations, “indicating that more cognitively sophisticated participants showed larger bias blind spots.” This trend held for many of the specific biases, indicating that smarter people (at least as measured by S.A.T. scores) and those more likely to engage in deliberation were slightly more vulnerable to common mental mistakes. Education also isn’t a savior; as Kahneman and Shane Frederick first noted many years ago, more than fifty per cent of students at Harvard, Princeton, and M.I.T. gave the incorrect answer to the bat-and-ball question.

What explains this result? One provocative hypothesis is that the bias blind spot arises because of a mismatch between how we evaluate others and how we evaluate ourselves. When considering the irrational choices of a stranger, for instance, we are forced to rely on behavioral information; we see their biases from the outside, which allows us to glimpse their systematic thinking errors. However, when assessing our own bad choices, we tend to engage in elaborate introspection. We scrutinize our motivations and search for relevant reasons; we lament our mistakes to therapists and ruminate on the beliefs that led us astray.

The problem with this introspective approach is that the driving forces behind biases—the root causes of our irrationality—are largely unconscious, which means they remain invisible to self-analysis and impermeable to intelligence. In fact, introspection can actually compound the error, blinding us to those primal processes responsible for many of our everyday failings. We spin eloquent stories, but these stories miss the point. The more we attempt to know ourselves, the less we actually understand.