Australia’s eSafety Commissioner Launches Legal Proceedings Against X

It follows a $600,000 fine imposed in September.
A photo illustration of X (Twitter) logo in London, England, on July 24, 2023. (Dan Kitwood/Getty Images)
Alfred Bui
12/21/2023
Updated: 12/21/2023

The Australian eSafety Commissioner (eSafety) has commenced civil penalty proceedings against social media giant X, formerly known as Twitter, for failing to comply with government requirements regarding child sexual exploitation materials.

In February, eSafety issued a transparency notice to several social media companies, requiring them to provide information about how they were addressing child sexual exploitation and abuse material and activity on their platforms.

eSafety alleged that X failed to comply with the notice because it did not prepare a report in the required manner and form.

In addition, eSafety said X either did not respond to some questions in the notice or failed to respond to them truthfully and accurately.

For example, the company did not explain how long it took to respond to reports of child sexual exploitation, what measures it had implemented to detect child sexual exploitation in live streams, or what tools and technologies it used to detect such material.

X also failed to adequately disclose the number of safety and public policy staff remaining after tech billionaire Elon Musk acquired the platform in October 2022 and implemented several rounds of job cuts.

While other social media platforms, including Google, also fell short in responding to the transparency notice, eSafety deemed X’s non-compliance the most serious.

In September, eSafety issued a $610,500 (US$414,400) fine to X and gave the company 28 days to request the withdrawal of the infringement notice or pay the penalty.

X neither paid the fine nor requested a withdrawal, opting instead to seek a judicial review of eSafety’s transparency and infringement notices.

eSafety is now seeking to have the judicial review heard while simultaneously pursuing civil penalty proceedings against X.

Other Social Media Platforms Not Doing Enough

While X is the only tech company to be fined, eSafety found that other social media platforms also fell short in tackling child sexual exploitation material in Australia.

“Our first report featuring Apple, Meta, Microsoft, Skype, Snap, WhatsApp, and Omegle uncovered serious shortfalls in how these companies were tackling this issue,” eSafety Commissioner Julie Inman Grant said in an October statement.

“This latest report also reveals similar gaps in how these five tech companies are dealing with the problem and how they are tackling the rise in sexual extortion, and we need them all to do better.

“What we are talking about here are serious crimes playing out on these platforms committed by predatory adults against innocent children, and the community expects every tech company to be taking meaningful action.”

Among the platforms, Discord took no measures to detect child sexual exploitation in live streams, citing “prohibitively expensive” costs.

The company also did not use any language analysis technology across its services to detect child sexual abuse activity, such as sexual extortion.

Google used such technology only on YouTube, not on Chat, Gmail, Meet, or Messages.

Furthermore, Google and Discord did not block links to known child sexual exploitation material, nor did they use technology to detect grooming, across some or all of their services.

Alfred Bui is an Australian reporter based in Melbourne and focuses on local and business news. He is a former small business owner and has two master’s degrees in business and business law. Contact him at [email protected].