Daniel Kahneman is a Nobel laureate in economics and a psychologist at Princeton University. Dr. Kahneman, in collaboration with others, has been a major force in bringing the discipline of psychology into the realm of science. In this discipline, the underlying, fuzzy-at-the-edges postulate of the pioneers of Western psychology remains fairly intact: human beings are not primarily rational-thinking agents making disinterested observations and carefully weighed, calculated decisions. That is not to say that people don’t make observations, carefully weigh their options, and then calculate the benefits of their decisions; rather, streams of mental activity bubbling up from the unconscious rush above the conscious surface, setting frames of reference that influence our rational thinking in a fundamental way.
Kahneman and his colleagues think that these unconscious mental activities create “blind spots” for us, which, in combination with laziness and the inclination to take mental shortcuts, lead to thinking errors more often than not. A recent New Yorker article expanded on an October 2011 Wall Street Journal review of Kahneman’s most recent book, “Thinking, Fast and Slow,” and discussed a recent study suggesting that people who are “smarter” in particular ways are actually more prone to thinking errors.
Richard West and Keith Stanovich are the lead authors of the study cited in the New Yorker article, which examines the relationship between certain measures of intelligence and human biases and tendencies toward thinking errors. Their subjects were hundreds of undergraduate students, and the measures of intelligence included S.A.T. scores and the Need for Cognition Scale. The New Yorker article highlighted a few interesting mental biases exhibited by the subjects. The first is that simply being aware of one’s own biases does not make one any less vulnerable to them. Another “blind spot” exhibited strongly by the subjects is the assumption that others are more susceptible to thinking errors than oneself. Finally, positive correlations exist between “cognitive sophistication” and cognitive blind spots: the “smarter” one is, the more prone one is to these biases and therefore to thinking errors.
Obviously, using S.A.T. scores as a measure of intelligence is at best incomplete. Students well drilled to succeed on standardized tests are trained to take many mental shortcuts during the tests so as to be fast, a key factor in doing well on such tests. Still, two interesting conclusions can be drawn from the study’s results. The first is that we think others are more susceptible to thinking errors than we are, and we are exceedingly good at spotting flaws and mistakes in others’ thinking. The second is that cognitive introspection, reflection, and deliberation could actually reinforce biases, leading to larger mental blind spots. (Perhaps this offers an alternative, complementary explanation, besides the apparent greed, for how the bankers, arguably some of the smartest people in the world, failed to recognize the overwhelming danger and risk of the financial investments they were engaged in that ultimately took down the economy.)
Vasubandhu, an ancient Indian philosopher whose writings became part of the basis for Yogacara, a school that organized much of the Buddha’s teachings on consciousness, wrote in one of his treatises that the primary afflictions associated with the manas, the self-reifying region of the unconscious, are self-delusion, self-view, self-conceit, and self-love. These primary afflictions, or klesas, react and interact with present conditions to color, frame, and influence our experiences and shackle our reactions. Since they are the most basic mental afflictions, they pervade all of our experiences, all the time. Many if not all Buddhist practices aim to relieve us gradually of these biases, to clear away the ultimate mental blind spots.
In the effort to create modern liberal education programs that leverage both ancient wisdom and new scientific findings, it comes as no coincidence that the first of the three institutional learning outcomes of DRBU is “flexibility of mind”:
A liberally educated person can formulate and practice a modus operandi characterized by flexibility and sound judgement to constantly assess new and evolving conditions, both inside and outside, and accordingly to reconsider, adjust, alter or even abandon one’s course or stance.
Creating the curricula and pedagogies needed to cultivate this flexibility is a challenging endeavor. However, some of the potential pedagogical practices look more sound given the results of West and Stanovich’s study. At colleges that employ a strong seminar style of instruction, such as St. John’s College (Maryland and New Mexico), students are asked not only to form their arguments strictly from readings of primary texts, but also to put their arguments and statements on the table and subject them to the scrutiny of fellow learners in the seminar. Given our tendency to spot fallacies in other people’s thinking more readily than in our own, this pedagogical practice teaches students not only to think critically and logically and to communicate clearly and concisely, but also to cultivate an attitude open to receiving criticism and advice, and to rely on a community as a check on one’s own fallacies.