By Michelle Abel, August 27, 2025
When the United States passed the Take It Down Act in May, it gave victims of non-consensual intimate imagery – whether real or AI-generated – something Canada still lacks: a clear, fast, and enforceable right to have their abuse removed from the internet. The law mandates takedown within 48 hours and imposes penalties on platforms that ignore or delay. It marks a turning point in holding digital platforms accountable for weaponized exposure.
Canada currently relies on a patchwork of criminal charges, privacy complaints, and civil lawsuits that can take weeks or months to act on. For victims, these delays are more than frustrating; they can be deeply harmful. To better protect Canadians from the growing threat of synthetic sexual content, deepfakes, and AI-generated nudes, the country needs a new law: the Right to Online Privacy Protection.
When it comes to this kind of lurid content, the damage doesn’t come only from the images or videos themselves, but from the failure of our laws to remove them in time.
When AI-generated nudes of Taylor Swift went viral in early 2024, major platforms acted within hours – removing content, blocking hashtags, and updating moderation policies. But when a high school student in British Columbia faced a similar attack, it took days to flag the images – and weeks to have them removed, if at all. The difference? Celebrity status. Swift had a legal team and public pressure. Most victims do not. In Canada, they also lack a fast, enforceable right to have this abuse taken down. That disparity is what the proposed “Right to Online Privacy Protection Act” aims to fix.
According to a 2024 report from the Canadian Centre for Child Protection, there were 29,505 reports of online sexual exploitation – across all categories – marking a 41.9 per cent increase from 2021. Eighteen per cent of women reported experiencing unwanted sexual behaviour online, with younger women (33 per cent of women between 15 and 24), Indigenous women (30 per cent), and bisexual women (50 per cent) reporting higher levels of unwanted behaviour.
Victims deserve a statutory Right to Online Privacy Protection (ROPP)
In Canada, no single law grants an individual the right to demand takedown of non-consensual intimate images. Instead, victims must navigate section 162.1 of the Criminal Code of Canada, which was drafted to apply to “voyeurism,” lodge complaints regarding personal information mishandling under the Personal Information Protection and Electronic Documents Act (PIPEDA), or pursue tort claims like “intrusion upon seclusion” – processes that are slow, uneven, and financially out of reach for many.
A new statutory Right to Online Privacy Protection (ROPP) would fill this gap, creating a constitutionally sound takedown right that would empower Canadians to demand removal of non-consensual intimate images.
This wouldn’t mean erasing all inconvenient online history. It would apply only to clearly personal or intimate content – nudes, medical photos, sexualized deepfakes – posted without consent and serving no public interest. Exceptions would be carved out for journalism, artistic expression, and legal transparency.
In short, it’s not about censorship. It’s about dignity.
Platforms must earn the privilege of access to Canadian markets
The framework rightly proposes conditional market access: if platforms want to operate in Canada or benefit from tax incentives, they must meet baseline safety standards. That includes:
- Responding to takedown requests within 48 hours.
- Using hash-matching technologies like PhotoDNA to block known abusive content.
- Publishing annual transparency reports on how they handle image-based abuse.
If companies can easily restrict access to content by geography for copyright claims, they can do the same for privacy violations.
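Hash-matching works by computing a compact fingerprint of each upload and checking it against a registry of known abusive images. The sketch below illustrates the workflow using a simple exact-match fingerprint from Python’s standard library; production systems such as PhotoDNA use proprietary perceptual hashes that survive resizing and re-encoding, which this example does not attempt to reproduce. All names here are illustrative.

```python
import hashlib

def fingerprint(image_bytes: bytes) -> str:
    """Compute a fingerprint for an uploaded image.

    Exact-match hashing is shown for simplicity; real deployments use
    perceptual hashing so that trivially altered copies still match.
    """
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_abusive(image_bytes: bytes, blocklist: set[str]) -> bool:
    """Check an upload against a registry of previously reported content."""
    return fingerprint(image_bytes) in blocklist

# Hypothetical registry seeded with a previously reported image.
blocklist = {fingerprint(b"previously-reported-image")}

print(is_known_abusive(b"previously-reported-image", blocklist))  # True
print(is_known_abusive(b"new-upload", blocklist))                 # False
```

The key design point is that the registry stores only fingerprints, never the images themselves, so platforms can block re-uploads of known material without retaining the abusive content.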
The United Kingdom already certifies digital platforms that comply with safety and privacy standards through frameworks like the Children’s Code, which sets out rules to protect children’s online data and privacy. Platforms that adhere to the Code’s requirements can seek certification to demonstrate their compliance, signalling a commitment to protecting young users. Canada could implement a “Digital Safety Trustmark,” giving users confidence and regulators leverage. (This proposed Digital Safety Trustmark draws conceptual inspiration from the United Kingdom’s ICO-approved Age Check Certification Scheme (ACCS).) Certification could help platforms defend against liability, while encouraging good-faith cooperation with law enforcement and victims.
International coordination is key, but Canada must lead at home
US law still shields platforms: Section 230 continues to protect them from liability for user content. Yet the passage of the Take It Down Act shows that targeted, victim-centred laws can coexist with intermediary protections. Canada should take a similar approach – regulating now rather than waiting for US legislative shifts.
Victims need a single, streamlined point of contact – such as the proposed “Trusted Digital Safety Platform” (TDSP) – to navigate the often-overwhelming process of reporting online abuse. Designed with survivors in mind, the TDSP would provide a secure and supportive space for individuals to file reports, verify their identity, preserve crucial digital evidence, and ensure that takedown requests are routed swiftly to the appropriate jurisdiction. By removing barriers, speeding up responses, and increasing accountability, this streamlined approach empowers victims, reduces trauma, and helps create a safer digital space where abuse is addressed promptly and effectively.
In cases involving international hosts or offenders, the platform would facilitate coordination with INTERPOL, the RCMP, and global cybercrime units, preventing perpetrators from evading responsibility by shifting content across borders. Critically, the TDSP would operate in full compliance with Canadian privacy laws, respect survivors’ experiences and make safety a priority, and collaborate with trusted partners like Cybertip.ca to enhance cross-border enforcement and victim support.
It’s not enough to ask platforms to “do better.” Our laws must require it.
A balancing act – but not a difficult one
Some critics raise concerns about free expression. But the proposed Right to Online Privacy Protection framework is built on Charter-compliant principles. It exempts newsworthy content, legal records, and artistic works. It includes a Digital Rights Tribunal that can hear appeals swiftly and transparently.
As technology evolves – especially with AI-generated abuse – we need modern, balanced tools. The choice is not between free speech and protection. It’s between legal silence and meaningful safeguards.
Time for Parliament to act
The previous federal government proposed a sweeping Online Harms Act (Bill C‑63), a complex law aimed at hate speech, incitement, and exploitation. But as it stands, the bill raises serious Charter concerns around freedom of expression and due process, is overbroad in many respects, and still lacks the focused, victim-centred procedures that survivors of image-based abuse urgently need. That’s why lawmakers should start with something like the Right to Online Privacy Protection: a victim-led remedy. It’s narrow – confined to clear instances of non-consensual image sharing – practical, and urgent.
Canada must not wait for another viral tragedy or cross-border lawsuit to prompt action. Victims of non-consensual image abuse deserve fast, fair, and effective redress now.
Michelle Abel is a recognized expert in familial trafficking and abuse, with extensive knowledge of adverse childhood experiences and coercive behaviour. She is the Founder of Bridge2Future, a Canadian non-profit organization dedicated to research, advocacy, and policy advice on generational trauma, intimate partner violence, and the commercial sexual exploitation of women and children. Abel also serves as vice president of Children/Youth and Global Affairs for the National Council of Women of Canada.