By Peter Menzies, February 9, 2026
If regulating social media to protect children from online harms were a simple matter, the nation’s Culture and Identity Minister, Marc Miller, wouldn’t be vowing to “act swiftly” almost five years after the Liberals tabled their first effort at an online harms bill.
That initial legislation, titled Bill C-36, came two years after then-public safety minister Ralph Goodale first raised the issue. His concerns triggered a year of public consultations before the government rolled out legislation that died when the 2021 election was called.
So, seven years after a problem was identified and legislative solutions were proposed, only to die, twice, on the order paper, the government is now promising to “act swiftly”? The shiny-faced seven-year-old child Mr. Goodale might first have worried about is now a 14-year-old with whom it is almost impossible to have a conversation because their eyes are always glued to a phone.
There are multiple ways to explain the delay, but the most obvious is that the government’s previous efforts to address online harms attempted to do a lot more than protect the innocent from social media. They tried to police speech.
Bill C-63, which died when former prime minister Justin Trudeau prorogued Parliament nearly a year ago, delved into tricky amendments to the Criminal Code and Human Rights Act that could be Charter-challenged. The bill was assigned to then-justice minister Arif Virani.
It was inspired by Britain’s Online Safety Act, which has led to thousands of people being arrested for their social media posts and has enraged free speech advocates.
Particularly problematic in Canada was C-63’s Orwellian attempt to authorize the Canadian Human Rights Commission – where the process is the punishment – to accept complaints alleging discrimination based on people’s interpretation of social media posts. Bill C-63, essentially, used child protection to smuggle in dangerously illiberal proposals.
Fortunately, it died along with Mr. Trudeau’s tenure as prime minister. His replacement, Mark Carney, has so far presented himself as a far more practical man. If and when a fresh version of the Online Harms Act rolls out, we’ll hopefully see an abandonment of his predecessor’s illiberal instincts.
It will be good news if that is the case. A slimmed-down, targeted Online Harms Act focused entirely on child safety, one that forbids anyone under the age of 14 from accessing platforms such as Facebook, Instagram, X and TikTok, is unlikely to face opposition from politicians or the public. Seeing as the platforms in question already require users to be 13 years old, that should be a relatively straightforward step.
I say relatively because the government first must avoid doing what it did previously when it tried to appease pressure groups by expanding the legislation into speech control.
Then it needs to define the problem, or problems, it is trying to address. Is it protecting children from online predation (social media companies already take this fairly seriously), protecting them from social media addiction or, more broadly, protecting them from the internet? Each comes with its own complications and risky consequences, one of which is that children may wind up in less safe places than those they currently occupy.
As Mark Musselman, a former entertainment lawyer and PhD student, recently pointed out, even the most virtuous restrictions have unintended consequences. Britain’s age verification for online pornography, for instance, led to an immediate 47-per-cent decline in viewership on Pornhub, but that was followed by a spike in VPN use. This doesn’t detract from the virtue of the legislation, but it does illustrate how internet content regulation risks becoming an endless game of whack-a-mole.
In Australia, where a ban on social media for anyone under age 16 has just been implemented, it’s too early to judge its effectiveness, although some reports indicate under-16s are finding ways around the legislation, at times with the (witting and unwitting) assistance of their parents.
While the ban Down Under entrenches the cultural point that maturity is required for social media use, it’s difficult to say how much of the “problem” has been solved and how much of it has just moved to ChatGPT or dodgy sites such as 4chan. In other words, parental oversight is still required.
The effectiveness of a new Canadian bill will depend entirely on the precision of its language, the clarity of its ambitions and the manner of its implementation. Hopefully, those won’t involve the creation of a new regulator and will consider straightforward solutions.
Child safety is as important and noble an ambition as there is, and Mr. Carney’s government should be applauded for addressing it. Hopefully, it creates legislation that precisely defines the problem and addresses it in a fashion consistent with the Prime Minister’s penchant for pragmatism.
Peter Menzies is a senior fellow at the Macdonald-Laurier Institute.