
AI Regulatory Compliance Policy

Maps regulatory and standards obligations to internal controls, assigns owners, and maintains audit-ready evidence for every applicable AI framework.

1. Purpose

This policy establishes how [Organization Name] identifies, tracks, and demonstrates compliance with AI-related regulations, standards, and contractual obligations. It creates the compliance management process, defines the requirement register, and confirms that evidence is maintained to satisfy regulators, auditors, certification bodies, and customers.

2. Scope

This policy covers:

  • All AI systems subject to external regulation (EU AI Act, GDPR, sector-specific rules).
  • All AI systems within the scope of voluntary standards (ISO/IEC 42001, NIST AI RMF).
  • All AI systems subject to contractual compliance obligations (customer audits, partner agreements).
  • All jurisdictions where the organization develops, deploys, or offers AI systems.

3. Framework overview and prioritization

The three primary AI governance frameworks address overlapping but distinct concerns:

  • EU AI Act: Legal obligations. Mandatory for AI systems marketed or used in the EU. Non-compliance carries fines of up to 35 million EUR or 7% of global annual turnover, whichever is higher.
  • ISO/IEC 42001: Management system standard. Voluntary but increasingly expected by enterprise customers and partners. Certifiable through accredited bodies.
  • NIST AI RMF: Risk management process. Voluntary, widely adopted in the US. Provides a structured methodology for identifying, assessing, and managing AI risks.

Recommended approach: Start with the framework most relevant to your regulatory obligations (EU AI Act for EU operations, NIST AI RMF for US-focused organizations). Layer ISO/IEC 42001 on top for systematic management and certification. Use crosswalk mappings to avoid duplicating effort where frameworks overlap.

4. Cross-framework mapping

The following table maps key governance themes to their location in each framework, enabling teams to satisfy multiple obligations with a single control where possible.

| Governance theme | EU AI Act | ISO/IEC 42001 | NIST AI RMF |
| --- | --- | --- | --- |
| Risk assessment | Art. 9 | Clause 6.1, Annex B | MAP, MEASURE |
| Governance structure | Art. 4a, 9, 26 | Clause 5.1, 5.3 | GOVERN (GV-1, GV-2) |
| Technical documentation | Art. 11, Annex IV | Clause 7.5 | MAP (MP-5) |
| Data governance | Art. 10 | Annex B (B.7) | MAP (MP-3), MANAGE (MG-3) |
| Transparency | Art. 13, 50 | Clause 4.2 | GOVERN (GV-4) |
| Human oversight | Art. 14 | Annex C (C.2) | GOVERN (GV-5), MANAGE (MG-4) |
| Accuracy and robustness | Art. 15 | Clause 8.4 | MEASURE (MS-2) |
| Bias and fairness | Art. 10 (data), Recital 47 | Annex B (B.3) | MEASURE (MS-2), MAP (MP-3) |
| Post-market monitoring | Art. 72 | Clause 9.1 | MANAGE (MG-1, MG-2) |
| Incident reporting | Art. 73 | Clause 10.2 | MANAGE (MG-4) |
| Third-party management | Art. 25 (value chain) | Clause 8.5 | GOVERN (GV-6) |
| Training and competence | Art. 4a (AI literacy) | Clause 7.2 | GOVERN (GV-3) |
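Teams that track obligations in tooling may find it useful to keep the crosswalk machine-readable, so a single internal control can be tagged with every framework reference it satisfies. A minimal sketch, with two themes shown; the `CROSSWALK` dictionary and `frameworks_for` helper are illustrative names, not part of any framework:

```python
# Machine-readable crosswalk mirroring the mapping table.
# Keys are governance themes; values map each framework to its reference.
CROSSWALK = {
    "Risk assessment": {
        "EU AI Act": "Art. 9",
        "ISO/IEC 42001": "Clause 6.1, Annex B",
        "NIST AI RMF": "MAP, MEASURE",
    },
    "Human oversight": {
        "EU AI Act": "Art. 14",
        "ISO/IEC 42001": "Annex C (C.2)",
        "NIST AI RMF": "GOVERN (GV-5), MANAGE (MG-4)",
    },
}


def frameworks_for(theme: str) -> dict[str, str]:
    """Return every framework reference a control for `theme` must satisfy."""
    return CROSSWALK.get(theme, {})


print(frameworks_for("Human oversight")["EU AI Act"])  # Art. 14
```

Tagging controls this way lets one piece of evidence (for example, a human oversight design review) be surfaced in an EU AI Act audit, an ISO/IEC 42001 certification, and a NIST AI RMF assessment without duplication.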

5. EU AI Act obligations

Key obligations for providers and deployers of high-risk AI systems:

  • Risk classification (Art. 6, Annex III): Classify all AI systems. High-risk triggers mandatory requirements.
  • Conformity assessment (Art. 43): Internal or third-party assessment before market placement.
  • Technical documentation (Art. 11, Annex IV): Technical file describing system, data, testing, and performance.
  • Transparency (Art. 13, 50): Information to deployers/users about capabilities and limitations. Notify users when interacting with AI.
  • Human oversight (Art. 14): Design for effective human oversight of operation.
  • Fundamental rights impact assessment (Art. 27): Deployers assess impact before use.
  • Post-market monitoring (Art. 72): Monitoring plan for deployed high-risk systems.
  • Incident reporting (Art. 73): Report serious incidents to market surveillance authorities without undue delay.
  • Registration (Art. 49): Register in EU database before market placement.
  • AI literacy (Art. 4a): Ensure staff have sufficient AI literacy for their role.

6. ISO/IEC 42001 requirements

Key clauses for the AI management system:

Certification: ISO/IEC 42001 certification requires an accredited third-party audit. The AI Governance Lead coordinates the certification program, including Stage 1 (documentation review) and Stage 2 (implementation audit), plus annual surveillance audits.

  • Clause 4: Context (interested parties, scope, AI system inventory).
  • Clause 5: Leadership (commitment, policy, roles).
  • Clause 6: Planning (risk assessment, objectives, AI impact assessment).
  • Clause 7: Support (resources, competence, awareness, documentation).
  • Clause 8: Operation (lifecycle, third-party, data management).
  • Clause 9: Performance evaluation (monitoring, internal audit, management review).
  • Clause 10: Improvement (nonconformity, corrective action, continual improvement).

7. Compliance management process

The compliance management process rests on four components, described below: a requirement register, a central evidence library, periodic gap analysis, and regulatory change monitoring.

7.1 Requirement register

The AI Governance Lead maintains a requirement register mapping each obligation to:

  • The specific clause or article number.
  • The internal control that addresses it.
  • The control owner responsible for implementation.
  • The evidence required to demonstrate compliance.
  • The current status (compliant, partially compliant, gap, not applicable).
  • Remediation plans and deadlines for gaps.

7.2 Evidence library

All compliance evidence is stored centrally with version control, retention tags, access controls, and timestamps. Evidence types include:

  • Policies and procedures (this document, related policies).
  • Risk assessments and treatment plans.
  • Model cards, data sheets, and technical documentation.
  • Testing reports (bias, fairness, security, performance).
  • Training records and competence assessments.
  • Meeting minutes and decision logs.
  • Incident reports and post-incident reviews.
  • Audit reports and remediation evidence.
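Each item in the evidence library carries the metadata named above: version, retention tag, access controls, and a timestamp. One way to model an item and flag stale evidence for refresh during the monthly review, offered as an illustrative sketch (the `EvidenceItem` model and the 365-day window are assumptions, not requirements of any framework):

```python
from dataclasses import dataclass
from datetime import date


@dataclass(frozen=True)
class EvidenceItem:
    path: str            # location in the central, version-controlled store
    evidence_type: str   # e.g. "testing report", "meeting minutes"
    version: str
    retention_tag: str   # drives the retention/deletion schedule
    recorded: date       # timestamp of the current version
    access_group: str    # who may read this item


def stale(items: list[EvidenceItem], as_of: date,
          max_age_days: int = 365) -> list[EvidenceItem]:
    """Items older than the review window; candidates for evidence refresh."""
    return [i for i in items if (as_of - i.recorded).days > max_age_days]
```

Running a check like this before each audit surfaces evidence that predates the last review cycle, rather than discovering it during the audit itself.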

7.3 Gap analysis

Gap analysis is performed when a new regulation takes effect, when a new AI system enters scope, annually as part of the review cycle, and when audit findings identify control weaknesses. Gaps are assigned remediation owners, deadlines, and tracked to completion.

7.4 Regulatory change monitoring

Legal and Compliance monitor regulatory developments. When a material change is identified, the requirement register is updated, affected control owners are notified, a gap analysis is triggered, and the AI Governance Committee is briefed.

8. Incident reporting obligations

| Framework | Reporting obligation | Timeline |
| --- | --- | --- |
| EU AI Act (Art. 73) | Report serious incidents to market surveillance authority | Without undue delay, no later than 15 days after becoming aware |
| GDPR (Art. 33) | Report personal data breaches to supervisory authority | Within 72 hours of becoming aware |
| ISO/IEC 42001 (Clause 10.2) | Record nonconformities and take corrective action | As part of continual improvement process |
| NIST AI RMF (MG-4) | Document and communicate risk management decisions | Per organizational process |
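The two hard statutory windows above can be computed mechanically from the moment the organization becomes aware of an incident, so the deadline never depends on manual counting. A minimal sketch; the `WINDOWS` mapping covers only the two fixed deadlines, and both regimes also expect notification without undue delay, well before the outer bound:

```python
from datetime import datetime, timedelta

# Outer statutory bounds from the table above.
WINDOWS = {
    "EU AI Act Art. 73": timedelta(days=15),   # serious incidents
    "GDPR Art. 33": timedelta(hours=72),       # personal data breaches
}


def reporting_deadline(framework: str, aware_at: datetime) -> datetime:
    """Latest permissible notification time once the organization is aware."""
    return aware_at + WINDOWS[framework]


aware = datetime(2025, 3, 1, 9, 0)
print(reporting_deadline("GDPR Art. 33", aware))  # 2025-03-04 09:00:00
```

Wiring this into the incident-response workflow means the notification clock starts automatically at the "became aware" timestamp recorded in the incident ticket.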

9. Roles and responsibilities

| Role | Compliance responsibilities |
| --- | --- |
| AI Governance Lead | Maintains register, coordinates audits, tracks remediation, prepares reports, manages certification program. |
| Control Owners | Implement controls, produce and update evidence, remediate findings. |
| Legal and Compliance | Monitor regulations, advise on obligations, coordinate regulator interactions. |
| Model Owners | Ensure systems meet requirements, maintain technical documentation, support audits. |
| AI Governance Committee | Approve strategy, review posture, resolve escalations. |

10. Reporting cadence

  • Monthly: AI Governance Lead reviews register status and evidence freshness.
  • Quarterly: Compliance summary presented to AI Governance Committee.
  • Annually: Complete compliance report to executive leadership covering framework coverage, gap trends, audit results, and regulatory outlook.

11. Exceptions

Temporary deferral of a compliance control requires written justification, compensating controls, a remediation deadline, and approval from the AI Governance Lead (or Committee for high-risk controls).

12. Review

This policy is reviewed annually or sooner when triggered by new regulations taking effect, material audit findings, or changes to the organization's AI scope.

Document control

| Field | Value |
| --- | --- |
| Policy owner | [AI Governance Lead] |
| Approved by | [AI Governance Committee] |
| Effective date | [Date] |
| Next review date | [Date + 12 months] |
| Version | 1.0 |
| Classification | Internal |
