Should Facebook Messenger or Instagram be regulated?

OTTAWA—The advisory panel tasked with making recommendations for Canada’s pending legislation on online safety has failed to agree on how online harms should be defined and on whether dangerous content should be scrubbed from the internet altogether.

On Friday, the federal government published the findings from the expert panel’s tenth and final session, which summed up three months of deliberations over what a future legislative and regulatory framework might look like.

The 12-person panel brought together experts on subjects like hate speech, terrorism, child sexual exploitation and the regulation of online platforms. Their conclusions come after Ottawa published a proposal for an online harms bill last summer, which prompted some stakeholders involved in the consultations to urge the government to go back to the drawing board.

The findings highlight the steep challenges the federal government will face in introducing the legislation, which was supposed to be tabled within 100 days of the Liberals forming government last fall.

Heritage Minister Pablo Rodriguez is now embarking on a series of regional and virtual round tables to gather more feedback on the framework, starting with the Atlantic provinces.

Here’s what the experts — who were kept anonymous in the report — concluded.

What are ‘online harms’ anyway?

In its proposal last year, the government identified five types of “harmful content”: hate speech, terrorist content, incitement to violence, child sexual exploitation and non-consensual intimate images.

Most of the panel found that child exploitation and terrorist content should be handled in “an unambiguous manner by future legislation.” Others deemed the five categories “deeply problematic,” in one instance taking issue with definitions of terrorism for focusing on “Islamic terror” while omitting other forms.

Rather than isolating specific kinds of harmful content, some experts suggested that “harm could be defined in a broader way, such as harm to a specific segment of the population, like children, seniors, or minority groups.” Panel members also disagreed on whether harms should be narrowly defined in legislation, with some arguing that dangerous content evolves and changes, while others said regulators and law enforcement would require tight definitions.

Disinformation, something Rodriguez has previously said must be tackled with “urgency,” also took up an entire session of the panel’s review. While deliberately misleading content was not listed as a category in the government’s proposal last year, disinformation emerged as a possible classification of online harms during last summer’s consultations.

The panel concluded that disinformation “is challenging to scope and define,” but agreed it led to serious consequences like inciting hatred and undermining democracy. Members ultimately argued that disinformation should not be defined in any legislation because it would “put the government in a position to distinguish between what is true and false — which it simply cannot do.”

Should harmful content be wiped from the internet?

Another key area experts could not agree on was whether the coming legislation should force platforms to remove certain content.

The debate stems from long-standing issues with the government’s prior suggestion that harmful content be removed within 24 hours of being flagged, and concerns over compromising free speech.

Experts appeared to agree that explicit calls for violence and child sexual exploitation content should be removed. Beyond that, some cautioned against scrubbing any other content, while others “expressed a preference for over-removing content, rather than under-removing it.”

Experts diverged on what thresholds should trigger the removal of content, with some suggesting that harm could be classified in one of two ways: either a “severe and criminal” category with the possibility of recourse, or a less severe category without the option of seeking recourse.

There was also disagreement on whether private communications, such as content sent via chat rooms, Facebook Messenger, or Twitter and Instagram direct messages, would need to be regulated and removed. Some members said private services on which children are harmed should be regulated, while others said tapping into private chats would be “difficult to justify from a Charter perspective.”

What could happen after content is flagged?

Canadian lawmakers will have to grapple not only with what constitutes online harm and how to handle it, but also with what happens to victims — and to those found to have posted harmful content — after messages are flagged.

It’s not yet known what body would be responsible for overseeing Ottawa’s online safety framework, though appointing a specialized commissioner — like Australia’s “eSafety” Commissioner — has been floated as one option.

Experts agreed that platforms should have a review-and-appeal process for all moderation decisions, with some suggesting an “internal ombudsman” be established to support victims.

It was noted that such a role would need to be kept entirely independent from government, potential commissioners, online platforms and law enforcement.

“Some suggested that the regime could begin with an ombudsperson as a hub for victim support, and grow into a body that adjudicates disputes later,” the report notes.

Experts, however, disagreed on how an ombudsperson would operate, with some arguing that users need an outside “venue” to express concerns because of mistrust of social media platforms.

Others “stressed that creating an independent body to make takedown decisions would be a massive undertaking akin to creating an entirely new quasi-judicial system with major constitutional issues related to both federalism and Charter concerns.”

Experts also raised concerns that recourse pathways simply might not be practical, given the volume of content, complaints and appeals the legislation might generate.

Ultimately, they concluded that “instead of simply abandoning the idea, it requires further development and testing.”

Raisa Patel is an Ottawa-based reporter covering federal politics for the Star. Follow her on Twitter: @R_SPatel
