INTERVIEW with Sander van der Linden on "FOOLPROOF: Why We Fall for Misinformation"
Issue 67: We learn about "the ultimate handbook for fighting back against the tsunami of misinformation that threatens to drown us in bullshit."
This week, we talked with Sander van der Linden, professor at the University of Cambridge, about his new book, FOOLPROOF. His research looks at how people process (mis)information, how it spreads in online networks, and what behavioral interventions we can design to counter it at scale. He has been dubbed Cambridge’s ‘defence against the dark arts’ teacher, and he created the BAD NEWS GAME, where you can practice manipulating people with outrage-inducing fake news.
Sander’s new book, FOOLPROOF: Why We Fall for Misinformation, describes how to inoculate yourself and others against the spread of misinformation, discern fact from fiction, and push back against methods of mass persuasion. It is grounded in many studies and provides a genuinely state-of-the-art look at the topic.
FOOLPROOF was released in bookstores last week. It will be released in the U.S. on March 23rd, and you can pre-order the book here. Jay read an advance copy of the book last summer and was asked to write a blurb for the book cover. The book is fantastic, and here are the sample blurbs he wrote (we won’t know if any made the cover until it hits bookstores in the U.S.):
"FOOLPROOF is the ultimate handbook for navigating--and fighting back--against the tsunami of misinformation that threatens to drown us in bullshit.”
"Sander van der Linden is one of the world's leading experts on strategies to combat misinformation. From prebunking to debunking, his book provides the best strategies for building your immunity against fake news, propaganda, and conspiracy theories."
"Sander van der Linden explains the dark art of misinformation and gives you the tools to defend yourself. Whether you are facing a sketchy internet rumor or listening to your uncle spin another conspiracy theory, this book will give you the skill set to detect false ideas and debunk them to others."
What does your book teach us about group dynamics?
Group dynamics are absolutely key to understanding the spread of misinformation online. For example, in one study, we looked at millions of posts on both Facebook and Twitter and consistently found that when people use language that "dunks" on the other side, the post receives more engagement on social media (at least in the US).
So if you're a liberal, a negative post about conservatives (the "out-group") will receive much more traction than a positive post about liberals (the "in-group"), and vice versa for conservatives. We refer to this as the "perverse incentives" of social media. The incentive structure of social media matters because it can feed into division and polarization. In fact, the definition of an echo chamber is that (a) like-minded people cluster together around an issue or topic ("birds of a feather flock together") and (b) these clusters polarize away from each other within a given network.
Research finds that echo chambers can facilitate the spread of misinformation: when misinformation resonates with the views of the echo chamber, it can act as a catalyst or bandwagon for viral diffusion. At the same time, echo chambers can also impede the spread of corrections: when fact-checks do not resonate with the opinions of the echo chamber, they won't travel very far, and we basically end up preaching to the choir. So group dynamics matter a lot for what information we are exposed to in society.
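To make the echo-chamber dynamic concrete, here is a minimal toy simulation. It is not taken from the book or the studies above; the cluster sizes, tie probabilities, and share probabilities are purely illustrative assumptions. It models two like-minded clusters with a few cross-cutting ties, where agents reshare a message far more often when it resonates with their group:

```python
import random

random.seed(42)

# Toy "echo chamber" network: two like-minded clusters of 100 agents each.
# Within-cluster ties are common (homophily); cross-cluster ties are rare.
# All sizes and probabilities here are illustrative assumptions.
N, P_IN, P_OUT = 100, 0.10, 0.002
agents = [{"group": g, "neighbors": set()} for g in [0] * N + [1] * N]
for i in range(2 * N):
    for j in range(i + 1, 2 * N):
        p = P_IN if agents[i]["group"] == agents[j]["group"] else P_OUT
        if random.random() < p:
            agents[i]["neighbors"].add(j)
            agents[j]["neighbors"].add(i)

def spread(seed, message_group, p_resonant=0.8, p_other=0.05):
    """Simple reshare cascade: exposed agents pass the message on with high
    probability if it resonates with their group, and rarely otherwise."""
    exposed, frontier = {seed}, [seed]
    while frontier:
        next_frontier = []
        for i in frontier:
            p = p_resonant if agents[i]["group"] == message_group else p_other
            if random.random() < p:  # agent i reshares, exposing neighbors
                for j in agents[i]["neighbors"]:
                    if j not in exposed:
                        exposed.add(j)
                        next_frontier.append(j)
        frontier = next_frontier
    return exposed

reach_in_cluster0 = lambda s: sum(agents[i]["group"] == 0 for i in s)

# Misinformation that resonates with cluster 0, seeded inside cluster 0:
misinfo = spread(seed=0, message_group=0)
# A fact-check that resonates only with cluster 1, seeded inside cluster 1
# ("preaching to the choir"):
correction = spread(seed=N, message_group=1)

print(f"misinformation reached {reach_in_cluster0(misinfo)}/{N} agents in cluster 0")
print(f"correction reached {reach_in_cluster0(correction)}/{N} agents in cluster 0")
```

In a typical run, the resonant misinformation cascades through nearly all of cluster 0, while the correction stalls at the few cross-cutting ties: homophily plus resonance-dependent sharing reproduces both halves of the answer above, viral diffusion inside the chamber and corrections that never arrive.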
What was the most surprising thing you learned as you were writing the book?
That a guy in Seattle actually shoved a sword through his brother's head because he believed his brother was a shapeshifting lizard. Incredibly dark and sad, but it also highlights that conspiracy theories can get people killed.
What is the biggest unanswered question on this topic going forward?
One of the most frequent questions I get is how to change the minds of conspiracy theorists. Unfortunately, conspiracy theorists are notoriously reluctant to participate in research studies (you guessed it, they typically assume it's part of a nefarious plot against them!).
So one of the big issues here is actually studying the effectiveness of interventions to counter misinformation not only (a) in real-world social media settings but also (b) with the audiences who could arguably benefit the most from such interventions.
Do you have any practical advice for people who want to apply these ideas?
So prebunking, or psychological inoculation, follows the biomedical analogy: just as exposing people to an inactivated strain of a virus triggers the production of antibodies that help fight off future infection, it turns out you can do the same with (mis)information.
The premise of psychological inoculation is therefore not necessarily to give people more facts and evidence. Instead, it's based on the assumption that people have no mental defenses against a particular incoming attempt to manipulate their opinion and that they need both the ability and the motivation to start generating mental antibodies. Although facts can play a role in this process, it's really about giving people a weakened dose or a simulation of the type of tactics they might be facing in the future, so they can start building resilience now (rather than debunking a falsehood after it's already spread, which, as I discuss at length in the book, is much more difficult).
I often find that what we call "technique-level" inoculation works well in real-life conversations. In the book, I go through many techniques that are prevalent in the spread of misinformation. One such technique is called "false dichotomies": a particular story might make it seem as if you have only two options when in fact there are more (e.g., consider the following headline or social media post: "we can't be wasting money on immigrants in this country until we take care of our domestic homelessness problem").
In reality, you could easily do both, but the manipulator only wants you to see two options. The way to inoculate people against this sort of thing is not by talking about polarizing issues. Instead, leverage the power of Star Wars! For example, in one of our experiments, the weakened dose consisted of a scenario from Episode III: Revenge of the Sith where Anakin Skywalker says to Obi-Wan, "If you're not with me... you're my enemy!" (a false dichotomy), upon which Obi-Wan replies, "Only a Sith deals in absolutes." When we later tested millions of people on YouTube with the "full dose" of misinformation that makes use of this technique, they were better able to spot the tactic and had thus been inoculated.
Just as your body benefits from encountering many examples of an invading pathogen in order to mount an effective immune response, the same is true of your mind. The more weakened examples of the tactics used to spread misinformation we can prebunk for people (I cover the "Six Degrees of Manipulation" in the book), the better we are all going to become at neutralizing them.
But here's the secret sauce: make it fun, keep it light and non-confrontational, and identify and inoculate against the underlying tactic rather than just arguing over specific facts.