Australia’s online safety watchdog has required social media giants to explain how they are tackling terrorist and violent extremist materials on their platforms.
Those companies will have to answer a series of detailed questions about how they are dealing with the issue.
eSafety said authorities in Australia and other countries were concerned about the role of violent extremist materials in some terror attacks, such as the Christchurch mosque shootings in 2019 and the murder of 10 Black Americans in New York in 2022.
eSafety Commissioner Julie Inman Grant said online users had been reporting that perpetrator-produced material from terror attacks continued to be reshared on mainstream social media apps.
Ms. Inman Grant also noted rising concerns about terrorists and violent extremists exploiting emerging generative AI technology to find new ways to cause harm.
“Earlier this month the U.N.-backed Tech against Terrorism reported that it had identified users of an Islamic State forum comparing the attributes of Google’s Gemini, ChatGPT, and Microsoft’s Copilot,” she said.
“The tech companies that provide these services have a responsibility to ensure that these features and their services cannot be exploited to perpetrate such harm.”
Why the Six Companies Were Chosen
The commissioner also explained the rationale behind her decision to single out the companies. WhatsApp came eighth in the ranking, while there was evidence that Reddit had played a role in the radicalisation of the perpetrator of the New York shootings.
“It’s no coincidence we have chosen these companies to send notices to, as there is evidence that their services are exploited by terrorists and violent extremists. We want to know why this is and what they are doing to tackle the issue,” Ms. Inman Grant said.
“Transparency and accountability are essential for ensuring the online industry is meeting the community’s expectations by protecting their users from these harms.
“Also, understanding proactive steps being taken by platforms to effectively combat terrorist and violent extremist content is in the public and national interest.”
At the same time, Ms. Inman Grant said she was disappointed that none of the social media platforms had provided the watchdog with information on the issue under the existing voluntary framework, forcing eSafety to issue legal notices instead.
eSafety’s announcement was welcomed by opposition communications spokesman David Coleman, who said self-regulation did not work.
“This kind of content is completely abhorrent and the fight against it must continue to be a top priority,” he said.
“The big digital platforms must absolutely be held accountable for the content they publish and profit from.”
The inquiry resulted in X being fined $610,000 (US$400,000) for failing to comply with eSafety’s requirements.