How Do I Know If I'm Wrong?
The last post was a hat-tip to humility. An invitation to cultivate the reflex of asking ourselves, at every turn, “What if I’m wrong?” But the truth is that nobody—unless they’re George Costanza—is wrong all the time. So the next step is getting better at detecting when we’re wrong.
A couple of weeks ago, Zeynep Tufekci boiled over in a guest essay in The New York Times. The World Health Organization had just issued an explainer about how COVID-19 mainly spreads. It’s not via short-range “droplets” that arc to the floor within a six-foot danger zone. And it’s not by sticking to surfaces, like your mail or your groceries. The prime culprit is “aerosols”—tiny floating particles that hover in indoor spaces, spreading their payload.
Tufekci, a Turkish sociologist, wasn’t mad because the WHO was broadcasting this news: the news was true. She was mad that it took more than a year for that news to hit the mainstream. After all, a small group of experts had been promoting the aerosol theory as far back as March of last year. Their voices weren’t heard. That’s because influential scientists who thought otherwise—because that’s the model we knew from earlier respiratory diseases—were so invested in their beliefs that they couldn’t see the truth that was… suspended right in front of their faces.
Tufekci likens it to the way the ancients couldn’t admit they were wrong that all celestial bodies revolved around the Earth—even after Copernicus produced pretty good evidence to the contrary. The astronomers just “produced ever-more-complex charts… to fit the heavens to their beliefs.”
Basically, we interpret things through a lens that serves us. And it’s very hard to imagine that that lens might be warped.
“We feel that we see the world as it is, and that what we see is real, what we see is true,” the psychologist Daniel Kahneman told an interviewer recently. “And that makes it very difficult to imagine that someone else looking at the same reality is going to see it differently.”
Social scientists call this cognitive blind spot “naive realism.” Sometimes it takes a chance disruption to help us see past it.
A couple of years ago, a meme roared through cyberspace. It was an audio clip. An American high school student had stumbled upon a strange phenomenon: a recorded voice pronounces a name, and listeners hear it completely differently. Whether you think the voice is saying “Yanny” or “Laurel” depends on a number of factors, including age-related hearing loss and your own expectations.
The takeaway? What you would swear in court is “true” may only be true for you. From my perspective, it’s false. Because you and I are perceiving the world through different filters.
Steel-Manning and Agere Contra
When we’re evaluating new information, very often the only stuff that gets past our filters and even registers is the stuff that confirms our existing beliefs. “We hear and apprehend only what we already half-know,” as Thoreau put it.
To overcome our biases, it sometimes helps to have an accomplice. Like the smartest person you know who disagrees with you.
Tufekci routinely calls up her most formidable critics and invites them to write guest blogs on her newsletter. “I pay people to try to take me down,” she says.
This is sometimes called the “steel man” strategy. (As opposed to the more familiar “straw man” strategy, where you reduce an argument to its flimsiest parts and then blow them away.) Steel-manning amounts to having someone beat the crap out of your own position. If after that you still aren’t tempted to switch sides, that’ll bolster your confidence that your thinking is sound.
But sometimes you don’t need a supersmart adversary to help you decide what square to put your chips on. Sometimes—and this may apply more to matters of the heart than matters of fact or science—all you need is your own mind, and time. Jesuit tradition has a practice called “agere contra”—“to act against.” It starts with learning to be actively suspicious of your own instincts. “Trust your gut” isn’t always the best advice. Sometimes your gut isn’t steering you right (sorry, Malcolm) out of its mighty primitive intelligence; it’s steering you wrong out of fear. Fear of change, fear of disrupting your cozy sinecure. So here’s the experiment: Disobey that first signal. And then hang in there and wait for the perhaps truer insight coming up behind it.
Time often brings clarity. “How would I respond a year from now?” That’s a question Igor Grossmann, a psychologist who runs the Wisdom and Culture Lab at the University of Waterloo, recommends people ask themselves when they’re uncertain or stuck. Studies show that such “temporal distancing” strategies can help you bring more wisdom to bear on the call you have to make right now.
From her research on decision-making, the writer Julia Galef concludes that making accurate predictions, and good judgments, depends a lot on the mindset we’re in.
She devotes a chapter of her new book The Scout Mindset to ways we can make ourselves “more receptive to unpleasant or inconvenient truths—or things that might be true.” Before even trying to evaluate if this odious (to you) idea is true, ask yourself: If it were true, how much would it hurt me?
“I try to stop and say, ‘What if I found out that this critique was actually solid?’” she says. Would the Earth stop turning? Would she have a nervous breakdown? Probably not. She would, she decided, just… adjust. And very likely grow. Once we’ve reached that state of pre-acceptance—allowing that the opposing view may actually be more correct than our own—we’re able to think much more clearly about whether to cling to our position or let it go.
This is all just good “decision hygiene,” to borrow Danny Kahneman’s phrase. Definitely a worthwhile habit to cultivate.
As we wash our hands for twenty seconds multiple times a day, maybe we should also wash our minds.