Online Platforms Must Protect Minors

Full Title: An Act to enact the Protection of Minors in the Digital Age Act and to amend the Criminal Code

Summary#

This bill sets new safety rules for online platforms used by minors and updates the Criminal Code to address deepfake sexual images and online harassment. Part 1 creates a “duty of care” requiring platforms to reduce risks to minors and gives parents more control. Part 2 adds criminal penalties for publishing false intimate images and for online harassment, and lets courts order the identification of anonymous harassers.

  • Platforms must design and run their services to reduce specific harms to minors and add strong default safety settings and parental controls (Part 1, s.4–6).
  • Parents must be asked for express consent before a child under 16 uses a platform, and can delete a child’s account and data (Part 1, s.5(4), s.10(2)).
  • Platforms must offer an opt-out from personalized recommendation systems and allow a chronological feed (Part 1, s.5(1)(d)(i)).
  • The bill bans “dark patterns” that weaken safety settings and bans using minors’ data to advertise unlawful products to them (Part 1, s.9(1)–(2)).
  • Criminal Code changes create an offence for publishing false intimate (deepfake) sexual images and expand tools against online harassment, including identifying anonymous offenders (Criminal Code s.162.1(1.1), s.264(2)(b.1), s.810(2.1)).

What it means for you#

  • Households and parents

    • Default safety settings and parental controls are required for child users under 16, with an option for parents to opt out (Part 1, s.5(2), s.6(2)–(3)).
    • Parents must be notified if a minor turns off default parental controls (Part 1, s.6(5)).
    • Parents must give express consent before a child under 16 first uses a platform; operators must make reasonable efforts to obtain it (Part 1, s.10(2)–(3)).
    • Parents and children can delete the child’s account and personal data, limit screen time, and control contacts and geolocation sharing (Part 1, s.5(1), s.5(4)).
    • Platforms must verify a parent’s contact information before allowing a known or likely minor to operate an account (Part 1, s.4(1)(g)).
  • Minors (under 18)

    • Platforms must reduce exposure to online bullying, sexual exploitation, self-harm content, and addiction-like designs (Part 1, s.4(1)(a)–(f)).
    • You can opt out of personalized recommendations and use a chronological feed (Part 1, s.5(1)(d)(i)).
    • Platforms cannot use your data to advertise alcohol, cannabis, tobacco, gambling, pornography, or controlled substances to you (Part 1, s.9(1)).
    • Platforms cannot require a “digital identifier” to access content or services (Part 1, s.9(3)).
  • Users and victims of online abuse

    • Publishing or sharing a false intimate (deepfake) sexual image without consent becomes a crime; courts can order internet restrictions and deletion of such content (Criminal Code s.162.1(1.1), s.164, s.164.1).
    • Courts can order offenders to pay reasonable costs to remove an intimate or false intimate image from the internet (Criminal Code s.738(1)(e)).
    • Online harassment via the internet or social media is explicitly covered; anonymous or fake-identity harassment is an aggravating factor at sentencing (Criminal Code s.264(2)(b.1), s.264(4)(c)).
  • Businesses and platforms (online services/apps, including social media and gaming)

    • New duty of care to prevent or mitigate listed harms to minors in product design and operations (Part 1, s.4(1)).
    • Required features: robust safety settings, parental controls, reliable privacy-preserving age verification, a reporting channel, clear disclosures on data use and ads, and ad labels with targeting explanations (Part 1, s.5–6, s.8, s.10–11).
    • Prohibited: designing interfaces that undermine safety settings, using minors’ data to market unlawful products to them, and requiring digital IDs for access (Part 1, s.9).
    • Recordkeeping, a biennial independent review, and annual public reporting on risks, harms, and mitigations are required (Part 1, s.12).
    • Non-compliance can lead to fines of up to CAD $25,000,000 on indictment or $20,000,000 on summary conviction for breaches of core duties, and up to $10,000,000 for breaches of disclosure and recordkeeping rules (Part 1, Offences and Punishment).
    • A due diligence defence is available (Part 1, s.14).
  • Law enforcement and courts

    • Ability to seek production orders to identify anonymous online harassers where conditions are met (Criminal Code s.810(2.1)).
    • Deepfake sexual-image offences are added to seizure/forfeiture and deletion order powers and to offences eligible for sex offender registration orders (Criminal Code s.164, s.164.1, s.490.011(a)(x.1)).
  • Timing

    • Criminal Code changes take effect on Royal Assent (Part 2).
    • Most platform duties take effect 18 months after Royal Assent; transparency requirements, guidelines, offences, and the private right of action take effect 2 years after Royal Assent (Part 1, Coming into Force).

Expenses#

  • Estimated net cost: Data unavailable.

  • Government fiscal information

    • No explicit appropriations in the bill text (Part 1, general).
    • The CRTC must issue guidelines for market and product research involving minors; related administrative costs are not stated (Part 1, s.13).
    • Enforcement, court, and policing costs for new and expanded Criminal Code offences: Data unavailable.
    • Potential fine revenue from non-compliant platforms: Data unavailable.
  • Private-sector compliance costs

    • Platform redesign, age verification, reporting, audits, and recordkeeping duties may create costs for operators: Data unavailable.

Proponents' View#

  • The duty of care and required design changes address bullying, sexual exploitation, self-harm content, and addiction-like use among minors by requiring prevention and mitigation in platform design (Part 1, s.4(1)).
  • Strong default settings and parental controls protect younger users from the outset, including controls on communication, data sharing, autoplay, notifications, and purchases (Part 1, s.5–6).
  • Transparency and choice improve user autonomy: opt-out of recommender systems with a chronological feed, ad labels, and clear reasons for any targeting of minors (Part 1, s.5(1)(d)(i), s.11).
  • The bill directly targets harmful practices: banning “dark patterns,” restricting ads for unlawful products to minors, and requiring privacy-preserving age checks (Part 1, s.5(3)(a), s.9(1)–(2)).
  • New Criminal Code tools respond to deepfake sexual abuse and online harassment, including higher penalties, restitution for removal costs, and powers to identify anonymous offenders (Criminal Code s.162.1(1.1), s.738(1)(e), s.810(2.1)).
  • Significant fines (up to CAD $25,000,000) and a private right of action create strong incentives for compliance and remedies for families (Part 1, Offences; s.15).

Opponents' View#

  • Scope and compliance burden may be heavy, especially for small or foreign platforms, given the broad definition of “operator” and multiple mandated features, audits, and reports (Part 1, Definitions; s.5–6, s.12).
  • Age verification and parent contact verification may raise privacy and data security concerns despite the requirement to preserve privacy, and could be difficult to implement reliably (Part 1, s.4(1)(g), s.5(3)(a)).
  • Some terms are broad (e.g., “addiction-like behaviours,” “harmful to their dignity”), creating uncertainty and potential over-removal of content to avoid liability (Part 1, s.4(1)(c), (f)).
  • The ban on requiring a “digital identifier” could limit certain age-assurance solutions and conflict with other safety goals, creating a trade-off between access and verification (Part 1, s.9(3)).
  • Expanded powers to identify anonymous online harassers may affect online anonymity and could be overused if safeguards are not applied carefully; courts must assess necessity (Criminal Code s.810(2.1)).
  • Adding false intimate image offences to the list of primary offences may lead to sex offender registration orders upon conviction, with long-term consequences, raising proportionality concerns in some cases (Criminal Code s.490.011(a)(x.1)).

Timeline#

Sep 16, 2024 • House • First reading

Technology and Innovation
Criminal Justice
Social Issues