The Complications of Criminalizing Speech That May Spread Hate

June 28, 2021

Commentary

Speech in Canada has long had reasonable limits placed upon it. It is illegal to incite violence against a person or group. One can’t encourage others to commit a criminal act nor, as the old cliché goes, can a person shout “fire” in a crowded theatre with impunity. When the path from speech to direct harm can be clearly established, people can be held criminally responsible. When we try to criminalize speech that may spread hate, things become complicated.

As the spring parliamentary session was just about to come to a close, the Liberal government introduced Bill C-36. The intent of the bill is to amend both the Criminal Code and the Canadian Human Rights Act to include provisions for online hate speech. The Liberals have made it clear that they want to regulate online speech through Bill C-10, which is currently before the Senate, and Bill C-36.

It is looking likely that there will be a federal election this fall and if so these bills will die on the order paper, but that doesn’t mean such legislative efforts to control speech will disappear. It means they will turn into campaign planks.

Defending free speech can be a dicey business, particularly in an election period. Proponents of unfettered speech are often unfairly accused of supporting the propagation of hate itself. Defending a person’s right to say something offensive is not the same as endorsing what was actually said, but that line is often blurred when political brinkmanship is involved. This puts supporters of free speech on the defensive and makes them reluctant to engage with the issue at all. That makes election periods the worst time to debate something as loaded and nuanced as the regulation of speech.

Bill C-36 is essentially a reincarnation of what was once Section 13 of the Canadian Human Rights Act. Section 13 used the phrase “likely to expose a person or persons to hatred or contempt” when referring to hate speech. The ambiguity and subjectivity of that definition led to numerous complaints against journalists and publications, including Maclean’s magazine, for having published content that some felt was hateful. While all of the complaints against Maclean’s were eventually dismissed, defending against them was costly for the magazine, and the chilling effect upon journalists and publications was undeniable.

Section 13 of the Canadian Human Rights Act was repealed under the Harper government in 2013. It had become clear that the definition of hate was simply too subjective to apply to speech restrictions. Complaints were common but findings against respondents were rare. The legislation was untenable. Bill C-36 uses the phrase “likely to foment detestation or vilification of an individual or group of individuals on the basis of a prohibited ground of discrimination.” If anything, C-36 is more ambiguous in its definition of hate than Section 13 had been. It would be no more enforceable in practice, though it certainly could have a chilling effect upon speech. The use of the word “likely” within the legislation blows a hole in it that no reasonable judge could ignore.

When a person has crossed the line from offensive content into harassment, we already have the legal means to deal with them. In May 2019, controversial former Calgary mayoral candidate Kevin Johnston was ordered to pay $2.5 million by a civil court for his hateful harassment campaign against a Muslim man in Ontario. Johnston has since been incarcerated for posting online threats against health services workers in Alberta. We didn’t need new legislation in order to deal with Johnston, nor will we need it when others like him surface online.

Modern society is ever evolving, and for the better. Public expressions of hatred against people for their race, sexual orientation, or religion are not socially tolerated. Those who are insistent upon displaying bigotry toward identifiable groups will quickly find themselves ostracized among their peers. Social pressures are far more effective than legal threats in battling hatred. Yes, there are still some vocal and hateful people out there, and there likely always will be some. Those kinds of people are in a dwindling minority though, and that is not due to fears of legal repercussions.

Hateful, offensive speech is abundant on the internet. Never before has there been such an ability to anonymously publish vile content to a potentially large number of viewers. Many odious people are using online platforms with the intention of spreading hatred of groups and people. Should we try to regulate this, though, and even if we wanted to, can we?

Regulation of hateful speech will be up for discussion this fall, whether in Parliament or in an election campaign. Let’s try to keep the discussion thoughtful and rational on this issue, as difficult as that may be. We can’t let theatre and political posturing impact something as critically important as free speech.

Cory Morgan is a columnist and business owner based in Calgary.

Views expressed in this article are the opinions of the author and do not necessarily reflect the views of The Epoch Times.
