Peter Menzies: New Online Harms Act Gives Appointed Commissioners Too Much Power

Arif Virani, Minister of Justice and Attorney General of Canada, holds a press conference regarding the new online harms bill on Parliament Hill in Ottawa in this Feb. 26, 2024 file photo. (The Canadian Press/Sean Kilpatrick)
Peter Menzies
2/27/2024
Updated: 2/27/2024
Commentary

Canada has introduced legislation aimed at reining in social media and reducing its citizens’ freedom to express themselves online.

And while supporters of the Online Harms Act (Bill C-63) believe tighter control of speech and images by government is necessary to make platforms such as X and Facebook “safer,” it’s unclear if that will be the case.

The new bill follows last year’s passage of the Online Streaming Act (Bill C-11), which put the Canadian Radio-television and Telecommunications Commission in charge of all online audio and video content (rules still to come), and of the Online News Act, which attempted to force Meta and Google to subsidize news organizations but backfired.

It will take weeks to examine the ambitions of the new bill, tabled Feb. 26, and weigh its tradeoffs, but here are some of the highlights and initial takeaways.

There will be a Digital Safety Commission led by a chair, vice-chair, and commissioners supported by a staff of public servants. Its job will be to oversee social media companies, each of which will have to satisfy the commission that it has policies and practices in place that protect users from seven distinct online harms.

Those are: sexually victimizing children, bullying, inducing children to harm themselves, extremism/terrorism, inciting violence, fomenting hatred, and sharing intimate content without consent, including deepfakes.

The platforms will have three “duties of care” imposed on them: to act responsibly, to ensure content in those seven categories is inaccessible, and to otherwise protect children.

In addition, platforms will have to inform police if, while patrolling users’ content, they come across incidents of child sexual exploitation.

The good news is that just about everything this new five-person commission of cabinet appointees will be “imposing” is already covered in the Criminal Code and has been blocked or removed by the companies for years. And given that early drafts of the legislation envisioned a government commission empowered to directly patrol and order the removal of “lawful but awful” online content, the duty of care approach is a welcome relief that signals a significant retreat.

It’s also likely most people are fine with the idea that social media companies should behave responsibly and ought to face consequences (fines) if they don’t.

In addition, the government is creating a Digital Safety Ombudsman (also a cabinet appointee) whose job will involve duties such as supporting victims of the online harms outlined, offering advice to the companies, and educating the public in navigating the social media landscape.

It seems a little heavy on bureaucratic overkill if you ask me, but again, the average person has little to worry about when exercising their rights in the public square. Move along, folks; not a lot to see here. Not much about your experience is likely to change, at least not at this stage, given that the behaviours being demanded are already standard practice.

But that doesn’t mean there’s nothing to worry about.

As internet expert and University of Ottawa law professor Michael Geist pointed out, the powers of the Digital Safety Commission are immense.

“It can issue rulings on making content inaccessible, conduct investigations, demand any information it wants from regulated services (and) hold hearings that under certain circumstances can be closed to the public,” Geist wrote.

“The Commission is not subject to any legal or technical rules of evidence, as the law speaks to acting informally and expeditiously, an approach that seems inconsistent with its many powers.”

Another legal expert, Halifax lawyer David Fraser, put it this way on X:

“‘I 100% expected it to be much worse’ doesn’t make it automatically good. Take a close note of the repeated use of the phrase ‘reasonable grounds to believe’ and ‘suspect’, which set a very low bar and always err on the side of removal,” he wrote.

“The content must be removed or made inaccessible permanently if there are reasonable grounds to believe that there are reasonable grounds to suspect ... Not even actually believe or actually suspect.”

I don’t know about you, but that’s not the kind of unrestrained power I like to see in the hands of cabinet appointees. Lawyers for companies like Meta and X probably won’t like it either, which means there’s a good chance they’ll advise their employers to err on the side of caution when it comes to censoring content. And that, dear reader, means you.

Alarming, in my view, is the Online Harms Act’s provision to define racist and homophobic comments as discrimination and give the Canadian Human Rights Commission (CHRC) the power to take complaints on that basis, levy fines of up to $20,000 against those it deems guilty, and order them to remove their posts.

This could very well flood the human rights commission, which itself stands accused of being racist, with complaints from organizations and individuals seeking to embarrass and impoverish their ideological foes. Critical race theorists, after all, control much of public discourse and believe racism is embedded everywhere.

It is entirely conceivable that everything from religious texts to statements such as “a person with a penis cannot be a woman” will be subject to fines and takedown orders by the CHRC, where the usual rules of evidence don’t apply, guilt is the de facto default position, and the term “kangaroo court” is often applied.

Lastly, it was disappointing not to see one more duty imposed on X and Facebook, specifically the duty to preserve freedom of expression and apply their content moderation rules in an objective fashion, favouring neither progressives nor conservatives.

But, given the road we’re now going down, that’s probably not the government’s preferred outcome.

Views expressed in this article are opinions of the author and do not necessarily reflect the views of The Epoch Times.