The Canadian government’s online harms bill threatens our human rights


This column is an opinion by Ilan Kogan, a Canadian JD/MBA student at Yale Law School and Harvard Business School. For more information about CBC's Opinion section, please see the FAQ.

The Canadian government is considering new rules to regulate how social media platforms moderate potentially harmful user-generated content. Internet scholars from across the political spectrum have already criticized the proposed legislation as one of the worst of its kind in the world.

Oddly enough, the bill reads like a list of the most widely condemned policy ideas in the world. Elsewhere, these ideas have been vigorously protested by human rights organizations and struck down as unconstitutional. There is no doubt that the federal government’s bill poses a serious threat to human rights in Canada.

The government’s intentions are noble. The aim of the legislation is to reduce five types of harmful content online: child sexual exploitation content, terrorist content, content that incites violence, hate speech, and the non-consensual sharing of intimate images.

Even though this content is already largely illegal, further reducing its proliferation is a laudable goal. Governments around the world, and especially in Europe, have introduced legislation to combat these harms. The problem is not the government’s intention. The problem is the government’s solution.

Serious privacy concerns

The proposed legislation is straightforward. First, online platforms would be required to proactively monitor all user speech and evaluate its potential for harm. Online communication service providers would have to take “all reasonable steps”, including the use of automated systems, to identify harmful content and restrict its visibility.

Second, any individual would be able to report content as harmful. The social media platform would then have 24 hours from the initial report to assess whether the content was in fact harmful. Failure to remove harmful content within that time frame could result in a hefty penalty: up to three percent of the service provider’s gross worldwide revenue or $10 million, whichever is greater. For Facebook, that would represent a penalty of $2.6 billion per post.
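To see where that $2.6-billion figure comes from, here is a minimal sketch of the penalty formula as described above. The revenue figure is my assumption, not something in the bill: it uses Facebook's reported gross global revenue for 2020 of roughly US$86 billion.

```python
# Illustrative sketch only: the penalty described in the proposal is the
# greater of 3% of gross worldwide revenue or $10 million per violation.
def max_penalty(gross_worldwide_revenue: float) -> float:
    return max(0.03 * gross_worldwide_revenue, 10_000_000)

# Assumption: Facebook's 2020 gross global revenue of roughly US$86 billion.
print(f"${max_penalty(86_000_000_000) / 1e9:.1f} billion")  # ~$2.6 billion per post
```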

Proactive monitoring of user speech raises serious privacy concerns. Without limits on such monitoring, national governments could dramatically expand their surveillance powers.

The Canadian Charter of Rights and Freedoms protects all Canadians from unreasonable search and seizure. But under the proposed legislation, no reasonable suspicion of illegal activity would be required before a service provider, acting on behalf of the government, conducts a search. All content posted online would be searched. Potentially harmful content would be stored by the service provider and passed, in secret, to the government for criminal prosecution.

Under the bill, many innocent Canadians will face criminal charges, writes Ilan Kogan. (Trevor Brine / CBC)

Canadians who have nothing to hide still have something to fear. Social media platforms process billions of pieces of content every day. Proactive monitoring is only possible with automated systems. Yet automated systems are notoriously inaccurate. Even Facebook’s manual content moderation has been reported to be less than 90 percent accurate.

Social media companies are not like newspapers; carefully reviewing every piece of content is operationally impossible. The result is uncomfortable: many innocent Canadians will face criminal charges under the bill.

It gets worse. If an online communication service provider determined that your content was not harmful during the tight 24-hour review period, and the government later decided otherwise, the provider could lose up to three percent of its gross worldwide revenue. As a result, any rational platform would censor far more than just illegal content. Human rights specialists call this disturbing phenomenon “collateral censorship”.

Identifying illegal content is difficult, and the risk of collateral censorship is therefore high. The restrictions on hate speech illustrate the problem best. The proposal expects platforms to apply the Supreme Court of Canada’s jurisprudence on hate speech. Identifying hate speech is difficult for the courts, let alone for algorithms or poorly paid content moderators who must make decisions in seconds. While speech that merely offends is not hate speech, platforms are likely to remove anything with the slightest potential to offend. Ironically, the minority groups the law seeks to protect are among the most likely to be harmed. That is why so many Canadian anti-racist groups have opposed the legislation.

We must demand better

So what should be done about online harms? A step in the right direction is to recognize that not all harmful content is alike. For example, child pornography is much easier to identify than hate speech. Removal deadlines for the former should therefore be shorter than for the latter.

And while revenge pornography may be appropriate to remove at the request of a victim alone, offensive speech may require input from the poster and an independent body or tribunal before the law requires its removal. Other jurisdictions draw these distinctions. Canada should too.

The regulation of online harms is a serious issue that the Canadian government, like all others, must address to protect its citizens. Child pornography, terrorist content, incitement to violence, hate speech and revenge pornography have no place in Canada. More can be done to limit their prevalence online.

But the bill creates many more problems than it solves. It reads like a collection of the worst policy ideas introduced around the world over the past decade. No other liberal democracy has been willing to accept such restrictions.

The threats to privacy and freedom of expression are obvious. Canadians must demand better.


Do you have a strong opinion that could add insight, illuminate an issue in the news, or change how people think about an issue? We want to hear from you. Here’s how to pitch to us.
