In a fascinating and somewhat controversial recent preprint, a group of MIT and Wellesley researchers dive deep into COVID anti-mask belief, relaying insights both from a quantitative study of thousands of images and from qualitative “infiltration” of anti-mask Facebook groups. Their core finding is one that everyone working in news, misinformation, and related fields ought to internalize:
Far from ignoring scientific evidence to argue for individual freedom, antimaskers often engage deeply with public datasets and make what we call “counter-visualizations”—visualizations using orthodox methods to make unorthodox arguments—to challenge mainstream narratives that the pandemic is urgent and ongoing. By asking community members to “follow the data,” these groups mobilize data visualizations to support significant local changes.
It’s tempting to think of all conspiracy theorists as uneducated, unintelligent, insane, or bad-faith actors. But we’re increasingly finding that the opposite is often true, particularly in areas like climate change where science and politics intersect.
I think we need to start most of our analyses with a basic amount of empathy. Knowing is hard. The world is an incredibly complex place, made nearly indecipherable by our species’ limited senses, which may not reflect reality in any meaningful way. It’s a wonder we manage to get anything done at all.
Despite, or perhaps because of, these shortcomings, humans are incredibly good at recognizing patterns. In fact, we’re much too good. We routinely see patterns where none exist, spawning false beliefs that range from the potentially self-preserving to those responsible for many millions of deaths over thousands of years.
The scientific method provides useful guardrails against this tendency to find false patterns, which is why using it has yielded so much progress. But it’s far from perfect and notoriously hard to follow, particularly if you come into research with preformed opinions, such as the belief that you shouldn’t need to wear a mask.
The same week the anti-mask preprint was making the rounds, Dartmouth archaeologist Flint Dibble wrote a long and fascinating Twitter thread dissecting the Discovery Channel’s upcoming “Hunting Atlantis” show, based on research by presenter Stel Pavlou.
To keep this post brief, I’m skipping some important meta-level discussion. That Pavlou, a science fiction writer, and his cohost Jess Phoenix, a volcanologist, are being given a show by a major cable network to discuss a topic far outside their expertise is ridiculous on its face. The show should never have been made. But it was, and it gave Dibble the opportunity to write his commentary.
The thread is worth a read on its own. It’s (mostly) a masterclass in constructively responding to conspiracy theories while demonstrating how expertise works in the real world and why it’s important.
Rather than dismissing Pavlou’s claims out of hand, Dibble acknowledges that Pavlou did a lot of clever research. And he takes the paper’s central claim — that a close reading of Plato reveals that Atlantis existed much more recently than we’ve always assumed, which is why we haven’t yet discovered evidence for it — seriously. From that point, Dibble uses his expertise to show why the evidence doesn’t match the paper’s claims, mostly because Pavlou relies on an incomplete understanding of the archaeological record and a poor reading of Plato to make his case.
In other words, Dibble doesn’t just debunk Pavlou by providing an alternative set of facts, he shows readers the errors Pavlou made via a thoughtful, evidence-based narrative that ought to feel aesthetically familiar to a conspiracist. It’s a textbook example of the latest research in effective debunking, which can not only cut through misinformation but, by exposing readers to common fallacies and misinformation patterns, provide inoculation against future misinformation.
I don’t believe Dibble’s approach offers a silver bullet for countering all conspiracy theories. There are larger, more systemic issues with our information environment we need to grapple with. But within this environment, Dibble’s approach offers a refreshing, effective, and even fun (if exhausting) method for understanding and countering conspiracies and the people who spread them. Rather than being dismissive, we should view the similarities between the processes real researchers use and the ones conspiracy theorists grab onto as a bridge to teach them about fallacies and show them the value of expertise. I hope more educators, policymakers, and journalists follow the evidence themselves, and learn from examples like Dibble’s.