The award for the scariest book I’ve read all year goes to David Neiwert’s Red Pill, Blue Pill: How to Counteract the Conspiracy Theories That Are Killing Us. You can be aware of the alt right and the white supremacists and QAnon and the rest of them and be concerned, maybe even scared of encountering their violence in a specific context. But taking a deep, sustained look at the scope and reach of these conspiracy theories is something different.
Since Neiwert is my coworker at Daily Kos, readers would reasonably suspect the fix was in if I tried to write a standard book review of Red Pill, Blue Pill. What I want to do instead is sketch out four key things I learned from the book, information that will expand my understanding of not just far-right extremists but U.S. politics more generally going forward, as well as two big questions I have after reading the book—one of them an urgent question in the context of the upcoming elections.
First, let’s be clear that there’s a difference between believing in a conspiracy—and conspiracies do happen!—and believing in a conspiracy theory. Conspiracy theories, Neiwert explains, “almost universally feature qualities that contrast sharply” with the limits of real-life conspiracies, including that they allege much larger conspiracies than the generally narrow scope of real-life conspiracies and involve far more people than could ever successfully keep such a major secret.
But while conspiracy theories violate the rules of actual conspiracies, they do have their own rules, which relates to maybe the biggest thing I learned from Red Pill, Blue Pill:
Conspiracy theorists “never believe just one conspiracy theory but rather an interconnected web of them,” Neiwert writes. Those interconnected theories “represent a deeper truth about their world while repeatedly reinforcing their long-held prejudices and enable them to ignore the real, factual (and often uncomfortable) nature of the changes the world is undergoing.”
Today, that means a set of theories building on decades of anti-government sentiment, centuries of anti-Semitism, the concept of “cultural Marxism,” and, of course, white supremacist beliefs. You throw together “cultural Marxism” and “political correctness” and the Illuminati and general hatred of women and people of color and Jewish people, stir in a lot of bits and pieces of birtherism and 9/11 trutherism, accusations that Sandy Hook was a false flag and its victims were crisis actors, and conspiracy theories based on the same old hatreds and insecurities become something new. Pizzagate looks outlandish and then chunks of it are lifted into QAnon, which is even more outlandish, but in the new form it really takes off.
The contours of these conspiracy theories, ever shifting but drawing on so many of the same ideas and building on each other, make clear what a big, big problem we’re looking at. Any one such theory may seem fringe (until it doesn’t anymore), but the constant churn of them shouldn’t be underestimated. And understanding the degree to which they’re interconnected shows both the difficulty of breaking their hold and the importance of preventing them from taking root to begin with.
Many, though likely not all, of the cases Neiwert summarizes may ring a bell. But speaking for myself, seeing Timothy McVeigh and Eric Rudolph and Stephen Paddock and Anders Breivik and Elliot Rodger and Dylann Roof and Alek Minassian and Jeremy Christian and Buckey Wolfe and Brenton Tarrant and Lane Davis one after another, with connections drawn between the specifics of what they believed, the conspiracy theory mode of their beliefs, and the crimes they committed, is in itself a powerful case that conspiracy theories are incredibly dangerous. The men on this list have killed their family members, they have killed strangers, they have killed with guns and bombs and vehicles and knives, and they didn’t just do it out of nowhere. Neiwert traces the killers’ embrace of conspiracy theories and their spiraling descents to the point at which they became killers, in many cases mass killers.
Law enforcement investigators refused to connect the dots between the deadliest mass shooting in U.S. history and Stephen Paddock’s obsession with guns, hatred of the government, opposition to taxes, and interest in 9/11 conspiracy theories. “That meant,” Neiwert writes, “that the deadliest mass shooting by an individual in American history was committed for reasons that law enforcement officials couldn’t explain—to the victims, their families, the survivors, or to the public. The conspiracy theories that Paddock believed in, in this calculus, could not count as a motive.”
In other cases, of course, killers left behind manifestos explaining their hatred of the groups they’d targeted for murder, and offering a road map of the conspiracy theories woven into the planning of their crimes.
As you may have read—or observed personally—social media platforms, YouTube in particular, constantly funnel you to content they think you might like, which is to say content you might click on and bring them more revenue. Neiwert quotes sociologist Zeynep Tufekci’s TED talk explaining how it works: “The algorithm has figured out that if you can entice people into thinking that you can show them something more hardcore, they’re more likely to stay on the site watching video after video going down that rabbit hole while Google serves them ads.” When people start with right-wing content, that means in practice that they quickly get funneled into white supremacist and conspiracy theorist content.
But people’s own searches for information can produce similar results. Neiwert looks at Dylann Roof’s famous, radicalizing search for “black on white crime.” Information scientist Safiya Umoja Noble has explained that Roof’s “very framing of the problems of race relations in the U.S. through an inquiry such as ‘black on white crime’ reveals how search results belie any ability to intercede in the framing of a question itself.” That is, if you search for a term that only the far right uses, you’re going to get a lot of white supremacist search results.
Information scientist Michael Caulfield suggested to Neiwert that, in such searches, “students are practicing info-literacy as they’ve learned it”—they search for a term, and then they search for factual confirmation of the first results they find, often researching quite deeply within that one mode. But what they don’t do is step back and question the framing of those results. To help people learn not to embrace false claims and conspiracy theories, Caulfield has developed a model called SIFT: Stop; Investigate the source; Find better coverage; Trace claims, quotes, and media to the original context.
The big question, of course, is how we break the hold of conspiracy theories on individuals and on the nation’s politics. Neiwert makes clear that even at the person-by-person level, it is incredibly difficult. (Though he has a lot of advice for trying that may be highly relevant for friends and family members of QAnon adherents who are facing such a tough situation.) But the first step he offers is prevention: “the most effective way of overcoming the effects of ‘red-pilling’ is immunizing people beforehand.” That’s where SIFT comes in, for one thing.
Psychologist Stephan Lewandowsky also told Neiwert that a colleague “finds that if you expose people to a small dose, just like a vaccine, of the conspiracy theory up front, then it finds less traction when people are actually exposed to it. So you can educate, sort of protect people against conspiracy theories, but of course the crucial is you got to get to them first, because if you try to do it after they’ve already been exposed to it then it’s far less effective.”
Questions I was left with: