The framework proposes establishing a Digital Safety Commission of Canada that would include three bodies: a Digital Safety Commissioner, a Digital Recourse Council (DRC), and an advisory board.
Together, they would police what the proposal terms online communication service providers (OCSPs), such as Facebook, YouTube, TikTok, Instagram, Twitter, and Pornhub. The ostensible goal is to eliminate hate speech, terrorist content, content that incites violence, intimate images shared without the consent of those depicted, and child sexual exploitation.
OCSPs would be required to implement measures to proactively monitor for harmful content, including via automated systems. They would also have to respond within 24 hours to content flagged by any user, removing it if deemed harmful.
In addition, OCSPs would have to meet reporting requirements, subject to confidentiality restrictions designed to protect information involving privacy, national security, or commercial interests. Among other things, OCSPs could designate certain information as confidential, which would preclude them from notifying affected users.
Platforms that fail to comply could face fines of up to $10 million or 3 percent of their gross global revenue—whichever is higher—imposed by the Digital Safety Commissioner. Alternatively, the commissioner could refer offences to prosecutors, in which case the fines could be up to $25 million or 5 percent of the platform’s gross global revenue, whichever is higher.
The commissioner could also apply to the Federal Court to require telecommunications service providers to block or filter access in Canada to all or part of a service that has repeatedly refused to remove child sexual exploitation content, terrorist content, or both. The commissioner would also collect and share information with other government departments and agencies.
The proposal’s discussion guide calls for changes to the Canadian Security and Intelligence Service Act to give CSIS the ability to more quickly obtain the subscriber information of those involved in spreading “ideologically motivated violent extremist” content online.
The commissioner could even send inspectors into workplaces and homes to examine or acquire documents or other information of concern, including computer algorithms and software.
Section 35 of the proposal’s technical paper tasks the commissioner with “engaging with and considering the particular needs of and barriers faced by groups disproportionately affected by harmful online content such as women and girls, Indigenous Peoples, members of racialized communities and religious minorities and of LGBTQ2 and gender-diverse communities and persons with disabilities.”
The DRC would review complaints by people affected by OCSPs’ content moderation decisions and rule on whether the content is indeed harmful as defined in the legislation. The council would consist of three to five members appointed by the governor-in-council, who is to consider “the importance of diverse subject-matter experts” from the aforementioned minority groups in making appointments.
The commissioner and the DRC may conduct hearings in secret if this is deemed to be in the public interest, such as where there are concerns related to privacy, national security, international relations, national defence, or confidential commercial interests.
‘A Problem for Freedom of Expression’
Cara Zwibel of the Canadian Civil Liberties Association expressed concerns about the proposal.
“It’s got some things in it that we, of course, were hoping it would not. It’s got 24-hour takedown requirements. It allows for website blocking. So there’s a lot in there that we’re pretty concerned about and we think Canadians will be concerned about,” Zwibel said in an interview.
“The big issue with the proposal is that there’s a potential to interpret these things very broadly. And by creating these 24-hour takedown requirements, you’re incentivizing social media companies to err on the side of removal, which is obviously a problem for freedom of expression.”
Zwibel is also concerned that the task of dealing with such large volumes of content could create a bloated bureaucracy while ultimately not accomplishing its objective.
“This content just moves around. People try to get it taken down off this platform, it shows up on a different one. Try to get to take them off that one, it shows up on another one. So I’m not sure about the effectiveness of these tools,” she said.
“One of the most troubling things in the proposal has to do with the mandatory sharing of information between social media companies and law enforcement. … Co-opting of private companies as forms of law enforcement is a concerning development that we need to pay pretty close attention to.”
The proposal, for which the government intends to introduce a bill in the fall, is touted as the regulatory complement to Bill C-36. Taken together, the two measures spell trouble, says Lisa Bildy of the Justice Centre for Constitutional Freedoms (JCCF).
“This is, frankly, one of the most egregious attacks on the free society in living memory. It undermines the liberal legal order, which protects freedom of expression, the marketplace of ideas, constitutional neutrality, and important legal protections like the presumption of innocence,” Bildy told The Epoch Times.
She said the proposal dovetails with other “dangerous legislation” that the JCCF is already preparing to challenge as unconstitutional.
“They all appear to be related. Bill C-36 proposes to punish a much broader range of ‘hate speech’ in disturbing ways. And Bill C-10, with the help of the new digital safety bureaucracy, will ensure that it is pulled down from the internet immediately. The whole scheme treats freedom of expression as a threat to, rather than a feature of, a liberal democracy.”
Bill C-10, which went to the Senate for consideration after passing in the House in June, seeks to amend the Broadcasting Act to start regulating audio and audiovisual content delivered over the internet by digital platforms. It’s a controversial bill that critics fear may lead to everyday Canadians being censored for social media postings.
In a recent blog post, University of Ottawa law professor Michael Geist also condemned the proposal.
“Far from constituting a made-in-Canada approach, the government has patched together some of the worst from around the world,” Geist wrote.
“The government says it is taking comments until September 25th, but given the framing of the documents, it is clear that this is little more than a notification of the regulatory plans, not a genuine effort to craft solutions based on public feedback.”
Zwibel believes the expected federal election could provide another valuable opportunity for Canadians to weigh in.
“It maybe will be a topic of discussion, if there is an election, [as to] what we want to see happen with regulating the social media companies,” she said.
“There is an opportunity for Canadians to say, ‘This isn’t what we want. This isn’t something we think will be effective,’ or ‘We think it will have dangerous consequences.’”