By Konrad von Finckenstein and Peter Menzies, February 7, 2022
The Liberal government reintroduced its controversial internet regulation bill on Wednesday, and it is likely to bring forward its “online harms” legislation, which has also drawn criticism, in the weeks to come. A Heritage Canada report released Thursday documented concerns about the government’s initial approach to the issue.
Last year, then-Heritage Minister Steven Guilbeault found himself facing steep opposition to his plans to roll out online harms legislation that was meant to address hate speech as well as sexually exploitative content and material that promoted terrorism. As former leaders at the CRTC who have closely watched this debate, we think there’s a better way for the new heritage minister, Pablo Rodriguez, to move forward on this file.
Mr. Guilbeault’s legislation was aimed at moderating problematic aspects of social media platforms such as Facebook and Twitter. It recommended the establishment of a heavily resourced regulator able to issue 24-hour takedown orders and to demand that posts deemed suspicious by algorithms be immediately reported to police.
A consultation on this approach drew waves of negative feedback from rights and internet freedom organizations. As the Canadian Internet Policy and Public Interest Clinic and the Citizen Lab said in their submission to Heritage Canada, the government “risks conscripting the private sector to engage in a form of dragnet surveillance that would have a chilling effect on people’s communication … Such a requirement has no place in Canadian legislation, especially in tandem with mandatory reporting to law enforcement.”
Mr. Rodriguez has wisely chosen to apply sober second thought to Mr. Guilbeault’s heavy-handed approach. But with even Facebook calling for regulation, it’s inevitable that Canada will join the 48 other nations that have already acted in this area. A better solution would shift the focus away from the government’s problematic censorship-by-proxy approach and instead ensure that ideas are exchanged through social media in a responsible fashion that protects liberal democratic traditions while improving online safety.
In an upcoming policy paper, we propose the creation of the Social Media Responsibility Act, which would minimally affect innovation while defending free speech and ensuring companies don’t tolerate egregious content already governed by the Criminal Code, such as hate speech, child sexual exploitation, the non-consensual sharing of intimate images and the incitement of violence and terrorism. While the government’s original proposal suppressed free expression, ours is designed to make sure social media companies defend it while enforcing codes of conduct approved by an independent regulator composed of retired judges, human rights advocates and social media specialists. And whereas the government proposal would have had people reported to police through electronic surveillance, our plan would ensure due process: no content would be removed or reported without having been reviewed by a human being.
The Social Media Responsibility Act would define the targeted companies, a.k.a. social media enterprises (SMEs), and establish a regulator to be known as the Social Media Responsibility Council. The council, in turn, would oversee the work of a social media commissioner, who would serve as its chief operating officer.
The act would oblige each SME to register with the council and create its own code of conduct, which would then require the regulator’s approval. The council would also define the duties and responsibilities of SMEs, monitor the implementation and administration of those codes, impose penalties for violations and resolve differences between users and SMEs as to whether content is harmful and deserves to be removed.
The council itself would be composed of nine members, each appointed by cabinet for a single seven-year term. The non-renewability of the term is important, as it minimizes the chances of council members being suspected of pandering to the government in order to get reappointed. Three of the council members would come from the social media field, three would have backgrounds in civil rights and the final three would be retired members of the Canadian judiciary.
The social media commissioner, as the organization’s chief operating officer, would be responsible for making recommendations to the council regarding violations and their remedies, including fines and, in cases of extreme non-compliance, applications to the courts to block websites.
Writ large, the role of this council would not be to regulate the internet but to ensure that large social media companies operating on it have appropriate rules of engagement in place and are applying them in a fair and non-partisan fashion, while providing a venue for the resolution of disputes. We think this structure would strike a better balance between protecting the rights of citizens to communicate and making the online world a safer and more responsible space.
Konrad von Finckenstein is a senior fellow at the C.D. Howe Institute. Previously he was chair of the CRTC. Peter Menzies is a senior fellow at the Macdonald-Laurier Institute. Previously he was vice-chair of the CRTC.