Supreme Court Agrees to Hear Cases About YouTube, Twitter Allegedly Facilitating Terrorist Groups

Matthew Vadum

The Supreme Court has agreed to hear two cases about the extent to which social media platforms may be held responsible when terrorist groups use the platforms to promote their cause.

Social media platforms such as YouTube and Twitter say they shouldn’t be held responsible if terrorists use their websites.

Section 230 of the federal Communications Decency Act of 1996 generally prevents internet platforms and internet service providers from being held liable for what users say on them. This created “a broad protection that has allowed innovation and free speech online to flourish,” according to the Electronic Frontier Foundation.

The first case is Gonzalez v. Google LLC, court file 21-1333, an appeal from the U.S. Court of Appeals for the 9th Circuit. The high court agreed to hear the case in an unsigned order dated Oct. 3.

The family of student Nohemi Gonzalez, a U.S. citizen who was killed in an ISIS attack on a bistro in Paris in November 2015, sued, claiming that Google, owner of YouTube, was liable under the Antiterrorism Act for aiding ISIS recruitment efforts by using algorithms to steer users to ISIS videos. The killing was part of a larger series of attacks the terrorist group carried out in Paris at that time. Gonzalez was one of 129 people killed during the terrorist campaign.

The “plaintiffs asserted that Google had knowingly permitted ISIS to post on YouTube hundreds of radicalizing videos inciting violence and recruiting potential supporters to join the ISIS forces then terrorizing a large area of the Middle East, and to conduct terrorist attacks in their home countries,” according to the petition filed with the high court.

Because of the algorithm-based recommendations, the petition said, users “were able to locate other videos and accounts related to ISIS even if they did not know the correct identifier or if the original YouTube account had been replaced.”

The complaint stated that Google’s services “played a uniquely essential role in the development of ISIS’s image, its success in recruiting members from around the world, and its ability to carry out attacks.” It added that “Google officials were well aware that the company’s services were assisting ISIS.”

A divided 9th Circuit panel found that Section 230 protected the viewing recommendations, even if the section “shelters more activity than Congress envisioned it would.”

Google denied liability in a court filing, saying that it is impossible for it to review every video posted to YouTube, to which users upload more than 500 hours of new content every minute.

Justice Clarence Thomas has frequently suggested that the Supreme Court should revisit the reach of Section 230.

The second case is Twitter Inc. v. Taamneh, court file 21-1496, which the Supreme Court also agreed on Oct. 3 to hear.

Twitter asked the high court to review a lower court ruling regarding a lawsuit filed against the micro-blogging website by the family of a Jordanian national killed in an ISIS attack on an Istanbul nightclub.

Twitter argues it shouldn’t be held responsible for acts of international terrorism simply because ISIS used its platform.

Twitter filed what it called “a protective, conditional petition relating to Gonzalez,” urging the court not to take up the Gonzalez case. At the same time, the company said that if the court agrees to hear Gonzalez, it should also hear Twitter’s case.

Twitter is engaged in a separate, extended legal battle with magnate Elon Musk over his $44 billion takeover bid for the company. Conservatives say Twitter discriminates against them and censors their views.