EU Investigating Musk’s X Platform Over Handling of Israel–Hamas War Content

Computer monitors and a laptop display the sign-in page of X, formerly known as Twitter, in Belgrade, Serbia, on July 24, 2023. (Darko Vojinovic/AP Photo)
Ryan Morgan
12/18/2023
Updated: 12/18/2023

The European Union is investigating Elon Musk’s online platform X, formerly known as Twitter, over its content moderation practices amid the ongoing Israel–Hamas conflict.

The European Commission said on Dec. 18 that it’s specifically investigating whether X’s content moderation practices violate the EU’s Digital Services Act (DSA). The investigation will focus on whether X allowed content that is illegal under the DSA to be shared “in the context of Hamas’ terrorist attacks against Israel.”

Hamas terrorists launched an unprovoked surprise attack on Israel from Gaza on Oct. 7, killing more than a thousand people and taking hundreds more hostage.

Under existing EU laws, several categories of online content are illegal, including content that incites or otherwise contributes to terrorism, as well as hate speech and incitement to violence. The European Commission didn’t specify exactly what types of illegal content appeared on X following the Hamas attacks, but it’s known that images of the attacks were captured and shared on the platform.

The European Commission said its investigation will also examine whether X’s community notes feature has been an effective tool in combating information manipulation on the platform.

Margrethe Vestager, the executive vice president of the European Commission for A Europe Fit for the Digital Age, said the commission has enough evidence “to formally open a proceeding against X.”

In April, the European Commission named X one of 19 “very large online platforms” (VLOPs) that would need to comply with the new EU law, and compliance requirements went into effect in August.

“The higher the risk large platforms pose to our society, the more specific the requirements of the Digital Services Act are. We take any breach of our rules very seriously,” Ms. Vestager said.

The investigation marks the first time the European Commission has initiated investigative proceedings under the DSA, which was enacted in October 2022.

“Today’s opening of formal proceedings against X makes it clear that, with the DSA, the time of big online platforms behaving like they are ‘too big to care’ has come to an end,” Commissioner Thierry Breton said. “We now have clear rules, ex-ante obligations, strong oversight, speedy enforcement, and deterrent sanctions and we will make full use of our toolbox to protect our citizens and democracies.”

Musk Has Faced Past Warnings

The European Commission’s decision to start the investigation into X comes as the platform’s owner, Mr. Musk, has pushed back on the EU’s content moderation requests.
In May, Mr. Musk withdrew X from the EU’s Code of Practice on Disinformation, a set of voluntary, EU-prescribed content moderation practices. The Code of Practice entails 44 commitments and 128 specific measures, including calls for platforms to share data with disinformation researchers and to demonetize site users accused of spreading disinformation.

Following Mr. Musk’s decision to pull X from the EU disinformation agreement, Mr. Breton posted a warning on X that the platform must still follow EU content moderation rules.

“You can run but you can’t hide,” Mr. Breton’s May 26 warning states. “Beyond voluntary commitments, fighting disinformation will be legal obligation under #DSA as of August 25. Our teams will be ready for enforcement.”

On Oct. 10, Mr. Breton once again called on Mr. Musk to account for his platform’s content moderation policies.

“You need to be very transparent and clear on what content is permitted under your terms and consistently and diligently enforce your own policies,” Mr. Breton’s Oct. 10 letter to Mr. Musk reads. “This is particularly relevant when it comes to violent and terrorist content that appears to circulate on your platform. Your latest changes in public interest policies that occurred overnight left many European users uncertain.”

X CEO Linda Yaccarino responded to Mr. Breton in an Oct. 11 letter, insisting that X has taken actions to stop illegal content from spreading on the platform. As of Oct. 14, Ms. Yaccarino said X had identified and suspended hundreds of Hamas-affiliated accounts, handled more than 80 law enforcement requests to remove content, and applied its community notes fact-checking feature to more than 700 unique posts and thousands of reposts related to the Oct. 7 attacks.

“X is committed to serving the public conversation, especially in critical moments like this, and understands the importance of addressing any illegal content that may be disseminated through the platform,” Ms. Yaccarino wrote in October. “There is no place on X for terrorist organizations or violent extremist groups and we continue to remove such accounts in real time, including proactive efforts.”