Feds’ planned crackdown on harmful online content getting a revamp

The federal government is working with an expert panel to rework its promised online harms bill, after stakeholders identified numerous flaws in its previous proposal that needed rectifying.

Through the yet-to-be presented legislation, the government had signalled its intent to make “online communication service providers,” such as Facebook, YouTube, Twitter, Instagram, and TikTok, more accountable for, and transparent about, how they handle five kinds of harmful content on their platforms: hate speech, child exploitation, the sharing of non-consensual images, incitements to violence, and terrorism.

However, following the initial summer 2021 consultation window, stakeholders including civil society organizations, industry players, and academics came forward, raising red flags and expressing wide-ranging concerns with what then-Canadian heritage minister Steven Guilbeault had presented.

The government released its “What We Heard” report on Thursday, based on its assessment of the feedback from the consultation process. It concluded that while the majority of respondents felt there is a need for the government to take action to crack down on harmful content online, given the complexity of the issue, the coming legislation needs to be thoughtful in its approach to guard against “unintended consequences.”

What the Liberals were proposing included elements such as:

  • Implementing a 24-hour takedown requirement for content deemed harmful, and powers to block platforms that are repeat offenders;
  • Compelling platforms to provide data on their algorithms and other systems that scour for and flag potentially harmful content;
  • Obligations for sites to preserve content and identifying information for potential future legal action;
  • Levying severe sanctions against companies deemed to be repeatedly non-compliant, including fines of up to $25 million;
  • Creating a new “Digital Safety Commission of Canada” that would be able to issue binding decisions for platforms to remove harmful content; and
  • Installing a new system for Canadians to appeal platforms’ decisions around content moderation.

According to the report, those who submitted feedback said the government needs to “reconsider its approach” to several key elements of the bill in order to address outstanding concerns related to freedom of expression, privacy rights, the impact of the proposal on marginalized groups, and compliance with the Canadian Charter of Rights and Freedoms.

This includes reassessing the types of online services that would be regulated and what the threshold for inclusion would be; what obligations platforms would have to moderate, remove, and report harmful content; and the strength and independence of the accompanying oversight bodies.


“Respondents signaled the need to proceed with caution. Many emphasized that the approach Canada adopts to addressing online harms would serve as a benchmark for other governments acting in the same space and would contribute significantly to international norm setting,” reads the report.

Now, the government says that with the gaps identified, it will be working with a group of experts to advise it on how to adjust its proposal, with the aim of bringing forward legislation “in the near future.”

“This work will be carried out in a transparent and expedited manner… the Government of Canada is committed to getting this right and to doing so as quickly as possible,” said Canadian Heritage in a statement accompanying the report.

“We are committed to ensuring that online platforms provide safe and respectful experiences for Canadians to engage and share information with one another. This is a very important and complex issue,” said Canadian Heritage Minister Pablo Rodriguez in the statement. He is working with Justice Minister David Lametti and Public Safety Minister Marco Mendicino on the coming bill.

Taking more time to consider the legislation means the Liberals will not be meeting their election pledge to move on the online harms bill within the first 100 days of their new mandate, as Feb. 3 marks day 100.