Voters at Miami Beach City Hall, during midterm primaries in Miami Beach, Fla. on Tuesday, Aug. 23, 2022. Researchers at Google, the University of Cambridge and the University of Bristol tested a different approach against online misinformation that tries to undermine it before people see it; they call it “pre-bunking.” (Scott McIntyre/The New York Times)
In the fight against online misinformation, falsehoods have key advantages: They crop up fast and spread at the speed of electrons, and there is a lag period before fact-checkers can debunk them.
So researchers at Google, the University of Cambridge and the University of Bristol tested a different approach that tries to undermine misinformation before people see it. They call it “pre-bunking.”
The researchers found that psychologically “inoculating” internet users against lies and conspiracy theories — by preemptively showing them videos about the tactics behind misinformation — made people more skeptical of falsehoods afterward, according to an academic paper published in the journal Science Advances on Wednesday. But effective educational tools still may not be enough to reach people with hardened political beliefs, the researchers found.
Since Russia spread disinformation on Facebook during the 2016 election, major technology companies have struggled to balance concerns about censorship with fighting online lies and conspiracy theories. Despite an array of attempts by the companies to address the problem, it is still largely up to users to differentiate between fact and fiction.
The strategies and tools being deployed during the midterm vote in the U.S. this year by Facebook, TikTok and other companies often resemble tactics developed to deal with misinformation in past elections: partnerships with fact-checking groups, warning labels, portals with vetted explainers as well as post removal and user bans.
Social media platforms have made attempts to pre-bunk before, though those efforts have done little to slow the spread of false information. Most have also not been as detailed — or as entertaining — as the videos used in the studies by the researchers.
Twitter said this month it would try to “enable healthy civic conversation” during the midterm elections in part by reviving pop-up warnings, which it used during the 2020 election. Warnings, written in multiple languages, will appear as prompts placed atop users’ feeds and in searches for certain topics.
The new paper details seven experiments with almost 30,000 total participants. The researchers bought YouTube ad space to show users in the United States 90-second animated videos aiming to teach them about propaganda tropes and manipulation techniques. One million adults watched one of the ads for 30 seconds or longer.
The users were taught about tactics such as scapegoating and deliberate incoherence, or the use of conflicting explanations to assert that something is true, so that they could spot lies. Researchers tested some participants within 24 hours of seeing a pre-bunk video, and found a 5% increase in their ability to recognize misinformation techniques.
One video opens with a mournful piano tune and a little girl grasping a teddy bear, as a narrator says “what happens next will make you tear up.” Then, the narrator explains that emotional content compels people to pay more attention than they otherwise would, and that fearmongering and appeals to outrage are keys to spreading moral and political ideas on social media.
The video offers examples, such as headlines that describe a “horrific” accident instead of a “serious” one, before reminding viewers that if something they see makes them angry, “someone may be pulling your strings.”
Beth Goldberg, one of the paper’s authors and the head of research and development at Jigsaw, a technology incubator within Google, said in an interview that pre-bunking leans into people’s innate desire to not be duped.
“This is one of the few misinformation interventions that I’ve seen at least that has worked not just across the conspiratorial spectrum, but across the political spectrum,” Goldberg said.
Groups focused on information literacy and fact-checking have employed various pre-bunking strategies, such as a misinformation-identifying curriculum delivered over two weeks of texts, or lists of bullet points with tips such as “identify the author” and “check your biases.” Online games with names like Cranky Uncle, Harmony Square, Troll Factory and Go Viral try to build players’ cognitive resistance to bot armies, emotional manipulation, science denial and vaccine falsehoods.
Tech companies, academics and nongovernmental organizations fighting misinformation have the disadvantage of never knowing what lie will spread next. But professor Stephan Lewandowsky of the University of Bristol, a co-author of Wednesday’s paper, said propaganda and lies were predictable, nearly always created from the same playbook.
“Fact-checkers can only rebut a fraction of the falsehoods circulating online,” Lewandowsky said in a statement. “We need to teach people to recognize the misinformation playbook, so they understand when they are being misled.”
This article originally appeared in The New York Times.