Internet an ‘Agent of Democratic Erosion’: Call for Ordinary Citizens to Moderate Online Content

The Lowy Institute is concerned the power of online content is in the hands of just a few big tech companies.
Monica O’Shea

The Lowy Institute is recommending that a council of regular citizens and tech experts work together to moderate online content.

In an executive summary, the Lowy Institute raised concerns that most digital platforms in democracies are presided over by a “handful of multi-national corporations.”

The author, researcher Lydia Khalil, suggested that when power is in the hands of just a few, there is minimal accountability to the population, and argued that ordinary citizens should instead have a role in making regulatory decisions.

The report (pdf), published on Feb. 21, was funded by the New South Wales government as part of a Digital Threats to Democracy Project.

“The internet was once considered an open door to democracy and liberty. Today, it is seen as an agent of democratic erosion,” Ms. Khalil wrote.

“Digital challenges to democracy include the scale and spread of disinformation and misinformation, the increase in polarisation and extremism that are facilitated and escalated online, and inadequate regulation.”

The report suggests platform councils made up of average digital users and tech experts could help “achieve more legitimate consensus on the uses and governance of digital platforms.”

This would allow responsibility and risk for content moderation and user access to be shared among technology companies, government, and the population, the report notes.

“It has never been satisfactory that a handful of tech CEOs can set the rules and norms for so much of the world’s communication and expression,” the report states.

“Attempts by governments (heavily influenced by partisan pressures) to define and regulate ‘disinformation’ or extreme or harmful speech through law, which these private companies will then have to reflect, have also been fraught and problematic, as regulation of disinformation in the name of online safety runs into democratic rights such as freedom of expression.

“This is where platform councils could play a role in better reflecting citizens’ views.”

The author notes that a similar process could be used to inform government regulation of AI and other new technologies that could pose a threat to democracy.

“Technocratic solutions and input are not enough. Ordinary citizens must be provided the opportunity to contribute to regulatory decisions. Where piloted, digital deliberative democracy has proven to be legitimate and popular,” she said.

Musk Avoids Paying Australian Government Fine

Amid the release of the Lowy report, the federal government is attempting to crack down on social media companies, including X.

The Australian eSafety Commissioner, Julie Inman Grant, issued a fine to Elon Musk’s X in September 2023 over allegedly failing to crack down on child sexual abuse materials online.

However, X has yet to pay the $610,000 (US$399,500) fine, a Senate estimates hearing on Feb. 13 revealed. The company has instead applied for a judicial review of eSafety’s transparency and infringement notices.

Under questioning from Greens Senator David Shoebridge, eSafety regulatory operations manager Toby Dagg revealed that the commission itself cannot take any further enforcement action against X. However, it has filed a civil penalty application with the Federal Court.

“It’s up to the court then to determine the quantum of any penalty that comes down,” he said.

Ms. Inman Grant added that “enforcement is going to be an issue that every online safety regulator deals with.”

Australian Government’s Proposed Misinformation and Disinformation Laws

The report comes amid heated debate about misinformation and disinformation legislation in Australia.

The Albanese government held a public consultation on a proposed misinformation/disinformation law from June 24 to Aug. 20. The consultation received 23,000 submissions, the majority of which were opposed to the law.

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill (pdf) could give government bureaucrats the power to police misinformation online, including by issuing fines to social media companies.

While the government initially planned to bring the law to parliament in late 2023, it has delayed its introduction until 2024.

Communications Minister Michelle Rowland is now considering refinements to the law, including possible protections for religious expression.

“The government is considering refinements to the bill, including to definitions, exemptions, and clarification on religious freedom, among other things,” she said in November.

“In the face of seriously harmful content that sows division, undermines support for pillars of our democracy, or disrupts public health responses, doing nothing is not an option.”

The Opposition is “fundamentally opposed” to the bill, Shadow Communications Minister David Coleman said in November.

“Fines of $9,000 per day can apply if people don’t answer allegations of misinformation. What sort of government would do that?” he said.

“The government has begun the process of walking that back, of delaying the bill, of taking provisions out of the bill because it is, frankly, one of the worst pieces of legislation ever put before this parliament.

“That was a judgement of the minister because the minister published that legislation, and you don’t publish legislation because you think it’s a bad idea; you publish legislation because you think it’s a good idea.”

Monica O’Shea is a reporter based in Australia. She previously worked as a reporter for Motley Fool Australia, Daily Mail Australia, and Fairfax Regional Media.