Meta’s Threads to Roll Out 3rd-Party Fact-Checking Ahead of 2024 Election

The Mark Zuckerberg-headed company already uses third-party fact-checkers to moderate content shared on Facebook and Instagram.
The Threads logo is displayed on a cell phone in San Anselmo, Calif., on July 5, 2023. (Justin Sullivan/Getty Images)
Katabella Roberts
12/14/2023
Updated: 12/15/2023

Meta will begin rolling out a fact-checking program for its Threads app, a rival to the X platform, ahead of the U.S. presidential election in 2024 as part of efforts to crack down on “false content.”

The Mark Zuckerberg-led company announced the plan in a blog post published on Dec. 12.

Threads will use third-party fact-checkers to flag and review user-generated content on the social media platform beginning early next year, the company said.

Meta already uses third-party fact-checkers to moderate content shared on Facebook and Instagram.

“Early next year, our third-party fact-checking partners will be able to review and rate false content on Threads,” Meta said in the update.

“Currently, when a fact-checker rates a piece of content as false on Facebook or Instagram, we extend that fact-check rating to near-identical content on Threads, but fact-checkers cannot rate Threads content on its own,” the post added.

Meta also said that it recently began allowing U.S.-based Facebook and Instagram users to select how much sensitive or fact-checked content they see in their feeds on the two social networks.

The firm plans to implement the same settings for U.S.-based Threads users.

“We recently gave Instagram and Facebook users more controls, allowing them to decide how much sensitive or, if they’re in the US, how much fact-checked content they see on each app,” the Menlo Park, California-based company said in the blog post.

“Consistent with that approach, we’re also bringing these controls to Threads to give people in the US the ability to choose whether they want to increase, lower, or maintain the default level of demotions on fact-checked content in their Feed. If they choose to see less sensitive content on Instagram, that setting will also be applied on Threads.”

Instagram head Adam Mosseri also stated in a Dec. 13 post that the social media giant is working to extend its fact-checking program to Threads in 2024.

“We currently match fact-check ratings from Facebook or Instagram to Threads, but our goal is for fact-checking partners to have the ability to review and rate misinformation on the app,” Mr. Mosseri wrote. “More to come soon.”

Under the new fact-checking program, users will have three levels of controls: “Don’t reduce,” “Reduce,” and “Reduce more,” according to The Verge.

Those controls will affect the ranking of posts on the platform that are “found to contain false or partly false information, altered content, or missing context.”

Threads has “just under” 100 million monthly users, according to Mr. Zuckerberg.

Meta Blocks Some Keyword Searches

The new platform was released to the public in July, and Meta quickly rolled out a series of updates, including a search function similar to that on X, but one that blocks searches for keywords including “COVID-19” and “COVID,” as well as terms concerning vaccines and long COVID.

At the time, Meta said the block was temporary and aimed at preventing “potentially sensitive content” from appearing on the platform. However, the move drew criticism from public health experts, who accused the platform of censorship.

In October, Mr. Mosseri said that while Threads isn’t “anti-news,” Meta also has no plans to “amplify news on the platform.”

“To do so would be too risky given the maturity of the platform, the downsides of over-promising, and the stakes,” he wrote on the platform.
Last month, Meta unveiled what it called a “comprehensive approach for elections” on its platforms, and said it will block new political, electoral, and social issue advertisements during the final week of the U.S. election campaign, as it has done in previous years.

Additionally, beginning next year, Meta will require advertisers to disclose when they use artificial intelligence or other digital techniques to “create or alter a political or social issue ad in certain cases.”

The company said it has taken down more than 200 “malicious influence campaigns” involved in what it calls “coordinated inauthentic behavior” and has designated more than 700 hate groups around the world as part of its effort to combat the spread of election misinformation and interference.