Google is testing “prebunking” strategies aimed at “inoculating people against manipulation” and “misinformation” online, researchers say.
Researchers from Google and the UK’s Cambridge University teamed up to conduct experiments that involved five short videos aimed at “inoculating people against manipulation techniques commonly used in misinformation,” according to a paper published Aug. 24 in the journal Science Advances.
The study, titled “Psychological Inoculation Improves Resilience Against Misinformation on Social Media,” involved nearly 30,000 participants. Other authors included researchers at the UK’s University of Bristol and the University of Western Australia.
The manipulation techniques commonly used in misinformation are “emotionally manipulative language, incoherence, false dichotomies, scapegoating, and ad hominem attacks,” the researchers said.
The participants were shown 90-second videos aimed at familiarizing viewers with techniques such as scapegoating and deliberate incoherence. The videos introduced concepts from the “misinformation playbook,” according to researchers, and explained to viewers in simple terms some of the most common manipulation techniques, using fictional characters rather than real political or media figures.
Researchers then gave people a “micro-dose” of misinformation in the form of relatable examples from film and TV, such as “Family Guy.”
They found that the videos “improved manipulation technique recognition” and boosted viewers’ confidence in spotting these techniques, while also “increasing people’s ability to discern trustworthy from untrustworthy content.”
The videos also improved “the quality of their sharing decisions,” researchers said.
‘Improving Misinformation Resilience’
“These effects are robust across the political spectrum and a wide variety of covariates,” they wrote. “We show that psychological inoculation campaigns on social media are effective at improving misinformation resilience at scale.”
“Online misinformation continues to have adverse consequences for society,” the study states. “Inoculation theory has been put forward as a way to reduce susceptibility to misinformation by informing people about how they might be misinformed, but its scalability has been elusive both at a theoretical level and a practical level.”
Among the “misinformation” cited by researchers in the study is content relating to COVID-19, which has “been linked to reduced willingness to get vaccinated against the disease and lower intentions to comply with public health measures.”
Multiple studies have linked COVID-19 vaccines to two types of heart inflammation, myocarditis and pericarditis, and U.S. authorities have acknowledged a link between the Pfizer and Moderna vaccines and heart inflammation.
However, those authorities state that the benefits of the shots outweigh the risks.
The study’s authors compared the videos to vaccines, stating that giving people a “micro-dose” of misinformation helps prevent them from being susceptible in the future, much as medical inoculations help build resistance against pathogens.
‘Works Like a Vaccine’
The idea is based on what social psychologists call “inoculation theory”—building resistance to persuasion attempts via exposure to persuasive communications that can be easily refuted.
Google is already harnessing the findings and plans to roll out a “prebunking campaign” across several platforms in Poland, Slovakia, and the Czech Republic, in an effort to stem emerging disinformation relating to Ukrainian refugees. The campaign is in partnership with local nongovernmental organizations, fact-checkers, academics, and disinformation experts.
“Our interventions make no claims about what is true or a fact, which is often disputed,” lead author Dr. Jon Roozenbeek from Cambridge’s Social Decision-Making Lab (SDML) said in a statement. “They are effective for anyone who does not appreciate being manipulated.
“The inoculation effect was consistent across liberals and conservatives. It worked for people with different levels of education, and different personality types.”
“YouTube has well over 2 billion active users worldwide,” said professor Sander van der Linden, the head of SDML and a study co-author. “Our videos could easily be embedded within the ad space on YouTube to prebunk misinformation.”