Promotion of Safety in the Digital Age Act

Full Title:
An Act to enact the Protection of Minors in the Digital Age Act and to amend two Acts

Summary

  • This bill, called the Promotion of Safety in the Digital Age Act, aims to make the internet safer for young people.
  • It creates new rules for platforms used by minors, strengthens reporting of online child sexual abuse material, and adds new crimes and tools in the Criminal Code to tackle deepfake sexual images and online harassment.

Key changes:

  • Sets a duty of care for platforms used by minors to prevent or reduce harms like sexual exploitation, self‑harm content, addictive design, and marketing of illegal products to minors.
  • Requires strong default safety and parental controls, age‑verification that protects privacy, clearer ad labels, and easy ways to report problems.
  • Bans design tricks that weaken safety settings and bans requiring a digital ID to use a service.
  • Lets minors or their parents sue platforms for serious harm tied to a breach of the duty of care.
  • Updates reporting rules so internet services must notify a designated police body, share certain technical data when content is clearly child sexual abuse material, and preserve related data for one year.
  • Criminalizes sharing fake “intimate” images (including AI deepfakes) without consent; adds specific treatment of online harassment; and allows courts to identify anonymous harassers and restrict their internet use.

What it means for you

  • Children and teens

    • Default settings will be stricter. Location sharing will be off by default, and you will be told if your location is being tracked.
    • You can turn off algorithmic recommendations and see content in time order instead.
    • You’ll have tools to limit time on apps and delete your account and personal data.
    • You should see fewer ads or promotions for alcohol, cannabis, gambling, tobacco, or pornography.
    • It will be easier to report harmful content or contact. Platforms must respond in a timely way.
    • Posting or sharing fake nude or sexual images of you without consent is a crime, with strong penalties.
  • Parents and guardians

    • You will get default parental controls for children under 16 and can manage privacy and account settings.
    • You can see time‑spent metrics and block purchases.
    • You will be notified if a child disables default parental controls.
    • You can sue a platform if its failure to protect causes serious harm to your child (physical, psychological, or major financial loss).
  • Platform operators (social media, gaming, apps, hosting, email and messaging services)

    • You must design with minors’ best interests in mind and reduce listed harms (e.g., sexual exploitation, self‑harm promotion, addiction‑like patterns).
    • You must provide clear safety settings, parental controls, ad labels, data use explanations, and a reporting channel with an internal response process.
    • You must set protective defaults, offer a chronological feed option, and use privacy‑preserving age checks.
    • You must keep audit logs, complete an independent safety review every two years, and publish annual reports with risk and usage metrics.
    • You cannot use “dark patterns” that push users to weaken safety or require a digital ID to access services.
    • Non‑compliance can bring fines up to tens of millions of dollars.
  • Advertisers and creators

    • Ads aimed at minors must be clearly labeled. Platforms must disclose why a minor was targeted and how personal data was used.
    • Paid endorsements by users must be disclosed as ads.
  • People who are harassed online

    • Repeated online contact can count as criminal harassment.
    • Anonymous or fake‑identity harassment can lead to tougher sentences. Courts can order steps to identify the harasser.
    • Courts can order no‑contact conditions and limit an offender’s internet use.
    • Victims can get restitution for costs to remove intimate or fake intimate images from the internet.
  • Law enforcement and regulators

    • A single designated body will receive mandatory reports about child sexual abuse material from internet services, with certain technical data when the content is clearly illegal.
    • Required data must be preserved for one year and reported annually to federal ministers.
    • The CRTC will set guidelines for research involving minors.
  • Timing

    • Most platform rules take effect 18 months after the law is passed; some transparency duties start two years after passage.

Expenses

No publicly available information.

Proponents' View

  • Gives kids safer defaults and tools, reduces exposure to harmful content, and limits addictive design.
  • Holds big platforms accountable with a clear duty of care and real penalties.
  • Helps parents protect their children and gives families the right to seek damages in court when serious harm occurs.
  • Cracks down on AI‑made fake sexual images and modern forms of online harassment.
  • Streamlines reporting of child sexual abuse material to police and improves investigations with preserved data.
  • Increases transparency about algorithms, ads, and data use that affect minors.

Opponents' View

  • Age verification and parent notifications could create new privacy risks and may collect more data on young users.
  • A broad “duty of care” might lead platforms to over‑remove lawful content or restrict teens’ access to information.
  • Default parental controls for all children may not fit older teens’ need for privacy and autonomy.
  • Compliance could be hard for small or foreign platforms, with high costs and complex reporting and audit duties.
  • The ban on “digital IDs” could conflict with some age‑assurance or safety solutions that rely on digital credentials.
  • Expanded data preservation and powers to identify anonymous users may raise civil liberties and free‑expression concerns.

Timeline

Jun 19, 2025 • House • First reading

Topics: Technology and Innovation • Criminal Justice • Social Issues