Sweeping new powers are being granted to Australia’s eSafety Commissioner to monitor and take down “seriously harmful content,” including material related to cyberbullying and revenge porn.
However, lawyers, civil rights groups, and tech giant Google have expressed concerns about the new law’s scope and possible infringements on civil liberties.
Introduced into Parliament last week, the Online Safety Bill grants the Commissioner authority to address headline issues such as cyberbullying, toxic online abuse, child sexual abuse, image-based abuse (revenge porn), and any other content deemed harmful. It also establishes minimum online safety guidelines for the industry.
“When people interact in person, they take for granted that the rule of law applies. People should be able to expect the same when they interact online,” Communications Minister Paul Fletcher said in a statement.
The Commissioner will also have the power to compel internet service providers to block access to content deemed harmful.
Further, app stores and search engines will need to remove software or links that enable the proliferation of such content.
Online service providers will initially receive a removal notice from the Commissioner and will have 24 hours to comply or risk penalties.
The Commissioner will also be granted a rapid takedown power for “crisis events,” such as the 2019 Christchurch terrorist attack that was live-streamed on Facebook.
Content blockages by the Commissioner can last up to three months under the law.
Individuals responsible for producing and uploading the content will not be able to hide behind anonymity either. The new law compels companies behind social media, electronic services (messaging apps), and internet providers to hand over end-users’ contact details.
Matt Warren, professor of cybersecurity at the Royal Melbourne Institute of Technology, supports the law, saying it is a “key step” to dealing with serious social issues, such as revenge porn.
“The government has a duty of care to protect Australia and Australian citizens,” he told The Epoch Times.
“At the moment, the Commissioner does not have these powers and is disadvantaged when it comes to protecting Australians,” he added.
The Commissioner will rely on the National Classification Code to determine what content is harmful.
The Code has been used to classify films, computer games, and publications and includes content that deals with sex, “revolting or abhorrent phenomena,” or content unsuitable for minors.
Digital Rights Watch, a non-profit group focused on educating Australians on their digital rights, warned that sex workers or sexual education material could be put at risk.
“Some (internet service) platforms will default to blanket removal of all sexual content to avoid penalty rather than deal with the harder task of determining which content is actually harmful,” according to a submission by the group.
Lawyer Graham Droppert SC warned the Code was “overly broad” and could capture more content than intended.
“In its current form, the Bill centralises power (in the eSafety Commissioner) without due process and opens up the ability for the Commissioner to remove public interest material,” Droppert of the Australian Lawyers Alliance said in a statement.
“Many elements of the Bill address important issues in relation to online safety, but it is critical that the Bill includes an effective, accessible internal review process so that people can challenge removal notices in a timely manner,” he added.
Droppert suggested more stakeholders be involved in reviewing decisions for blocking content.
Google Australia said the 24-hour takedown timeframe should be more flexible.
Further, it said the cross-platform reach of the new law would be difficult to enforce, particularly around cloud services.
“The cloud provider typically does not have visibility into its customers’ content to meet the privacy, security, and regulatory demands of its customers,” Google said in a submission.
“Even if something was flagged by an external observer, it is often impossible for a cloud provider to remove individual pieces of content.”