by Kim Pederson
“The vaunted human capacity for reason may have more to do with winning arguments than with thinking straight.” So goes the picture caption to a New Yorker article by Elizabeth Kolbert titled “Why Facts Don’t Change Our Minds.” She begins her piece by describing a 1970s Stanford University research project that asked undergraduates to distinguish between real and fake suicide notes. Some were told they did really well and some were told they did really poorly. In reality, they all did about the same. The hidden purpose of the study (all psychology experiments thrive on lacksparency [the intentional lack of transparency]) was to see how the subjects responded after the researchers revealed their true purpose. Even after learning that everyone’s performance was about average, the subjects who had been told they excelled still believed they had done really well, and vice versa. “Once formed,” the researchers noted, “impressions are remarkably perseverant.” A later, similar study found the same thing: even after seeing that the evidence “for their beliefs has been totally refuted, people fail to make appropriate revisions in those beliefs.”
Kolbert poses the question, “How did we come to be this way?” She then describes the answer that cognitive scientists Hugo Mercier and Dan Sperber posit in their book The Enigma of Reason. They say, “Reason developed not to enable us to solve abstract, logical problems or even to help us draw conclusions from unfamiliar data; rather, it developed to resolve problems posed by living in collaborative groups.” The biggest advantage humans have over other species, these two assert, is our ability to cooperate, something that might seem extremely doubtful these days.
The same two researchers also note that a large obstacle to our ability to discern truth (this is an all-inclusive “our”) is confirmation bias, our tendency to embrace information that supports our beliefs and reject information that contradicts them. In addition, they point out that while we’re good at spotting weaknesses in other people’s arguments, “almost invariably the positions we’re blind about are our own.” The researchers argue that if this trait were negative and harmful, genetically speaking, it should have been “selected against.” Since it wasn’t, “it must have some adaptive function,” a function they say relates to our “hypersociability.”
This sociability, the tendency to group together, did us good back in the days of woolly mammoths and saber-toothed tigers. Now, not so much. One thing we suffer from in our sociability is the “illusion of explanatory depth,” that is, our thinking we know way more than we actually know. In some areas, this doesn’t hurt us much. For example, we can toggle the handle on a toilet without knowing how it works and the toilet will flush (usually). In other areas, again, not so much. For instance, in another study, the less accurately people could locate Ukraine on a world map, the more they favored US military intervention there.
In similar fashion, the less we know about President Trump’s “immigration ban,” the more likely we are to strongly favor or oppose it. The more others in our circles, whatever they may be, concur with and confirm our opinions, the more we resist or flat-out disbelieve and dismiss anything that counters those opinions. “This is how,” two other scientists quoted by Kolbert observe, “a community of knowledge becomes dangerous.” She then concludes, perhaps oversimplifying, “If we all now dismiss as unconvincing any information that contradicts our opinion, you get, well, the Trump Administration.” Whatever side of that perspective you come down on, what seems incontrovertible is that our reasoning abilities and methods haven’t evolved past the Pleistocene. Somehow, I don’t find that the least bit surprising.
*Time Saving Truth from Falsehood and Envy, François Lemoyne, 1737. Public Domain.
Visit Kim Pederson’s blog RatBlurt: Mostly Random Short-Attention-Span Musings.