
New privacy rights and AI safety rules

Full Title: An Act to enact the Consumer Privacy Protection Act, the Personal Information and Data Protection Tribunal Act and the Artificial Intelligence and Data Act and to make consequential and related amendments to other Acts

Summary

Bill C-27 (Digital Charter Implementation Act, 2022) would overhaul private‑sector privacy law and set first‑time federal rules for artificial intelligence (AI). It creates three laws: the Consumer Privacy Protection Act (CPPA), the Personal Information and Data Protection Tribunal Act, and the Artificial Intelligence and Data Act (AIDA). Most details take effect on dates and regulations set later by the federal government (Coming-into-Force by order in council).

  • Replaces Part 1 of PIPEDA with the CPPA and renames the rest the Electronic Documents Act (Consequential Amendments).
  • Creates a new Tribunal to hear appeals of Privacy Commissioner decisions and to set administrative monetary penalties (PIDPTA; CPPA Part 2).
  • Sets high fines for serious privacy breaches and offences (CPPA Administrative Monetary Penalties; Offences and punishment).
  • Gives individuals new rights (e.g., data disposal on request, explanations for certain automated decisions, data mobility when frameworks exist) (CPPA Part 1).
  • Regulates “high‑impact” AI systems with risk controls, record‑keeping, public plain‑language notices, and harm reporting; adds AI‑specific offences (AIDA Part 1–2).

What it means for you

  • Households

    • You can ask businesses what personal data they have about you, how they use it, and to whom they disclosed it. Firms must respond in plain language within 30 days (CPPA Access, Time limit).
    • You can request disposal (deletion or anonymization) of your data in set cases, and firms must inform service providers to do the same (CPPA Disposal at individual’s request).
    • If a company uses an automated decision system that significantly affects you, you can request an explanation of the decision, including key factors and data sources (CPPA Access to automated decision system explanation).
    • You must be told about data breaches that pose a “real risk of significant harm,” such as identity theft or financial loss (CPPA Security safeguards — notification).
    • A future “data mobility framework” could let you port your data from one service to another once regulations are made (CPPA Data mobility).
  • Workers (at federally regulated employers only)

    • Employers that are federal works, undertakings, or businesses can collect, use, or disclose your data without consent if necessary to manage the employment relationship and you are informed (CPPA Employment relationship — FWUBs).
  • Businesses and non‑profits engaged in commercial activities

    • You must run a privacy management program, designate a responsible person, train staff, and keep policies available in plain language (CPPA Accountability; Privacy management program; Openness and Transparency).
    • Consent must be valid and informed. You must not make consent a condition of providing a product or service beyond what is necessary to provide it (CPPA Consent; Consent — provision of product or service).
    • There are narrow no‑consent grounds for specified business activities (security, product safety) and for “legitimate interests” if an assessment shows benefits outweigh adverse effects and records are kept (CPPA Business activities; Legitimate interest and Record of assessment).
    • Transfers to service providers do not require consent, but you must ensure equivalent protection by contract or other means (CPPA Same protection; Transfer to service provider).
    • Mandatory breach reporting to the Privacy Commissioner and to affected individuals when risk is significant; keep breach records (CPPA Security safeguards — report, notify, records).
    • High fines apply for serious contraventions. Administrative penalties can reach the higher of $10,000,000 and 3% of global revenue; offences up to the higher of $25,000,000 and 5% (CPPA Administrative Monetary Penalties — maximum; Offence and punishment).
  • AI developers, deployers, and managers (interprovincial/international trade)

    • You must assess if your system is “high‑impact” (to be defined in regulation) and, if so, identify, mitigate, and monitor risks of harm and biased output; keep specified records (AIDA Assessment; Measures related to risks; Monitoring; Keeping records).
    • If you make a high‑impact system available or manage its operation, you must publish a plain‑language description, including intended use and mitigation measures (AIDA Publication of description).
    • You must notify the Minister “as soon as feasible” if use results or is likely to result in material harm (to be defined by regulation) (AIDA Notification of material harm).
    • The Minister can order record production, audits, implementation of measures, public notices, or in urgent cases, cessation of use to prevent serious, imminent harm (AIDA Ministerial orders; Cessation).
    • Offences include using illegally obtained personal information for AI, and making an AI system available knowing it will likely cause serious harm. Fines can reach the higher of $25,000,000 and 5% of global revenue (AIDA Part 2 — Offences and Punishment).
  • Local governments and federal institutions

    • CPPA does not apply to government institutions under the Privacy Act (CPPA Application — Limit). AIDA does not apply to named national security bodies and prescribed entities (AIDA Non‑application).
  • Timing

    • Most provisions take effect on dates set by Cabinet and after regulations are made, especially for AI “high‑impact” criteria and data mobility (Coming-into-Force clauses; AIDA regulation-making).
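The penalty caps mentioned above follow a "higher of a fixed sum and a share of global revenue" formula. A minimal sketch of that calculation, using the thresholds stated in the bill text; the firm revenue figures are invented for illustration:

```python
def penalty_cap(global_revenue: float, fixed: float, share: float) -> float:
    """Return the maximum penalty: the greater of a fixed dollar amount
    and a percentage share of gross global revenue."""
    return max(fixed, share * global_revenue)

# CPPA administrative monetary penalties: higher of $10,000,000 and 3% of revenue.
# For a firm with $1B in global revenue, 3% is $30M, which exceeds $10M.
amp_cap = penalty_cap(1_000_000_000, 10_000_000, 0.03)      # → 30,000,000

# CPPA/AIDA offences: higher of $25,000,000 and 5% of revenue.
# For a firm with $100M in global revenue, 5% is only $5M, so the $25M floor applies.
offence_cap = penalty_cap(100_000_000, 25_000_000, 0.05)    # → 25,000,000
```

The floor means smaller firms still face the full fixed amount, while the percentage term scales the exposure of large multinationals.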

Expenses

Estimated net cost: Data unavailable.

  • No explicit appropriations in the bill text. Implementation relies on future orders and regulations (Coming-into-Force; Regulation-making).
  • Creates a new federal Tribunal supported by the Administrative Tribunals Support Service of Canada (PIDPTA; ATSSC schedule amendment).
  • Authorizes designation of an Artificial Intelligence and Data Commissioner within ISED to assist the Minister (AIDA Administration).
  • Administrative monetary penalties are recoverable as debts to the Crown; potential penalty revenue is unknown (CPPA Recovery as debt due to Her Majesty; AIDA AMPs framework by regulation).
| Item | Amount | Frequency | Source |
| --- | --- | --- | --- |
| Set-up and operation of Personal Information and Data Protection Tribunal | Data unavailable | Ongoing | PIDPTA; ATSSC amendment |
| Privacy Commissioner expanded duties (guidance, audits, inquiries) | Data unavailable | Ongoing | CPPA Part 2 (Powers, duties) |
| Artificial Intelligence and Data Commissioner and analysts | Data unavailable | Ongoing | AIDA Administration |
| Penalty revenues (privacy and AI) | Data unavailable | Variable | CPPA Administrative Monetary Penalties; AIDA AMPs/Offences |

Proponents' View

  • Strengthens privacy rights and enforcement, aligning Canada with major partners. High fines (up to the higher of $25,000,000 and 5% of global revenue for offences) create real deterrence (CPPA Offence and punishment; AIDA Punishment).
  • Clear rules for consent, data disposal, breach notice, and explanations for impactful automated decisions increase trust for consumers (CPPA Consent; Disposal; Security safeguards; Automated decision system explanation).
  • Creates due process through an expert Tribunal to hear appeals and set penalties, reducing litigation burden on Federal Court (PIDPTA; CPPA Appeals, Imposition of penalty).
  • Supports innovation with practical tools: codes of practice and certification programs; de‑identified data can be used for internal R&D under safeguards (CPPA Codes of practice and certification; Research, analysis and development).
  • Introduces baseline AI safety: risk management for high‑impact systems, public transparency notices, and harm reporting to reduce biased outputs and injuries (AIDA Measures related to risks; Publication of description; Notification of material harm).
  • Flexible, regulation‑first design lets government tailor “high‑impact” criteria and “material harm” definitions as technology evolves (AIDA Regulation-making).

Opponents' View

  • Consent exceptions may be broad. “Legitimate interest” collection without consent depends on internal assessments that may be hard to audit, risking over‑collection (CPPA Business activities — Legitimate interest and Record of assessment).
  • Transfers to service providers without consent and wide disclosure exceptions to government institutions could weaken individual control (CPPA Transfer to service provider; Disclosures to Government Institutions).
  • Many core AI elements are left to future regulations (e.g., definition of “high‑impact,” required measures, “material harm”), creating uncertainty for developers and users and delaying real protections (AIDA Regulations — Governor in Council; Minister).
  • Minister’s order powers (including cease‑use orders) are broad and could chill deployment, especially for small firms lacking compliance resources (AIDA Cessation; Ministerial orders).
  • Overlap and potential conflict with provincial privacy laws (e.g., Quebec Law 25) increase compliance complexity for national firms (CPPA Orders exempting “substantially similar” provincial laws).
  • Multi‑step enforcement (Commissioner inquiry, Tribunal appeal, then private right of action) could prolong resolution for complainants and add costs for all parties (CPPA Part 2 — Inquiries, Appeals; Private Right of Action).
Topics: Technology and Innovation · Trade and Commerce · Labor and Employment

Votes

Vote 89156

Division 300 · Agreed To · April 24, 2023

For: 64% · Against: 34% · Paired: 2%
Vote 89156

Division 301 · Agreed To · April 24, 2023

For: 63% · Against: 35% · Paired: 2%