1. Purpose
This policy establishes how [Organization Name] identifies, tracks, and demonstrates compliance with AI-related regulations, standards, and contractual obligations. It defines the compliance management process and the requirement register, and ensures that evidence is maintained to satisfy regulators, auditors, certification bodies, and customers.
2. Scope
This policy covers:
- All AI systems subject to external regulation (EU AI Act, GDPR, sector-specific rules).
- All AI systems within the scope of voluntary standards (ISO/IEC 42001, NIST AI RMF).
- All AI systems subject to contractual compliance obligations (customer audits, partner agreements).
- All jurisdictions where the organization develops, deploys, or offers AI systems.
3. Framework overview and prioritization
The three primary AI governance frameworks address overlapping but distinct concerns. Organizations should understand their relationship:
- EU AI Act: Legal obligations. Mandatory for AI systems marketed or used in the EU. Non-compliance carries fines of up to 35 million EUR or 7% of global annual turnover, whichever is higher.
- ISO/IEC 42001: Management system standard. Voluntary but increasingly expected by enterprise customers and partners. Certifiable through accredited bodies.
- NIST AI RMF: Risk management process. Voluntary, widely adopted in the US. Provides a structured methodology for identifying, assessing, and managing AI risks.
Recommended approach: Start with the framework most relevant to your regulatory obligations (EU AI Act for EU operations, NIST AI RMF for US-focused organizations). Layer ISO/IEC 42001 on top for systematic management and certification. Use crosswalk mappings to avoid duplicating effort where frameworks overlap.
4. Cross-framework mapping
The following table maps key governance themes to their location in each framework, enabling teams to satisfy multiple obligations with a single control where possible.
| Governance theme | EU AI Act | ISO/IEC 42001 | NIST AI RMF |
|---|---|---|---|
| Risk assessment | Art. 9 | Clause 6.1, Annex B | MAP, MEASURE |
| Governance structure | Art. 4, 9, 26 | Clause 5.1, 5.3 | GOVERN (GV-1, GV-2) |
| Technical documentation | Art. 11, Annex IV | Clause 7.5 | MAP (MP-5) |
| Data governance | Art. 10 | Annex B (B.7) | MAP (MP-3), MANAGE (MG-3) |
| Transparency | Art. 13, 50 | Clause 4.2 | GOVERN (GV-4) |
| Human oversight | Art. 14 | Annex C (C.2) | GOVERN (GV-5), MANAGE (MG-4) |
| Accuracy and robustness | Art. 15 | Clause 8.4 | MEASURE (MS-2) |
| Bias and fairness | Art. 10 (data), Recital 47 | Annex B (B.3) | MEASURE (MS-2), MAP (MP-3) |
| Post-market monitoring | Art. 72 | Clause 9.1 | MANAGE (MG-1, MG-2) |
| Incident reporting | Art. 73 | Clause 10.2 | MANAGE (MG-4) |
| Third-party management | Art. 25 (value chain) | Clause 8.5 | GOVERN (GV-6) |
| Training and competence | Art. 4 (AI literacy) | Clause 7.2 | GOVERN (GV-3) |
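Teams that track this crosswalk in tooling can represent it as a simple lookup structure, so one control can be traced to every framework reference it satisfies. A minimal sketch in Python; the entries shown mirror two rows of the table above, and the structure is illustrative rather than prescribed by any of the frameworks:

```python
# Illustrative crosswalk: governance theme -> framework references.
# Entries mirror the mapping table; extend with the remaining rows as needed.
CROSSWALK: dict[str, dict[str, str]] = {
    "Risk assessment": {
        "EU AI Act": "Art. 9",
        "ISO/IEC 42001": "Clause 6.1, Annex B",
        "NIST AI RMF": "MAP, MEASURE",
    },
    "Incident reporting": {
        "EU AI Act": "Art. 73",
        "ISO/IEC 42001": "Clause 10.2",
        "NIST AI RMF": "MANAGE (MG-4)",
    },
}

def references_for(theme: str) -> dict[str, str]:
    """Return every framework reference a single control must satisfy."""
    return CROSSWALK.get(theme, {})
```

A register entry for one internal control can then cite `references_for("Risk assessment")` rather than maintaining three separate mappings.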
5. EU AI Act obligations
Key obligations for providers and deployers of high-risk AI systems:
- Risk classification (Art. 6, Annex III): Classify all AI systems. High-risk triggers mandatory requirements.
- Conformity assessment (Art. 43): Internal or third-party assessment before market placement.
- Technical documentation (Art. 11, Annex IV): Technical file describing system, data, testing, and performance.
- Transparency (Art. 13, 50): Information to deployers/users about capabilities and limitations. Notify users when interacting with AI.
- Human oversight (Art. 14): Design for effective human oversight of operation.
- Fundamental rights impact assessment (Art. 27): Deployers assess impact before use.
- Post-market monitoring (Art. 72): Monitoring plan for deployed high-risk systems.
- Incident reporting (Art. 73): Report serious incidents to market surveillance authorities without undue delay.
- Registration (Art. 49): Register in EU database before market placement.
- AI literacy (Art. 4): Ensure staff have sufficient AI literacy for their roles.
6. ISO/IEC 42001 requirements
Key clauses for the AI management system:
- Clause 4: Context (interested parties, scope, AI system inventory).
- Clause 5: Leadership (commitment, policy, roles).
- Clause 6: Planning (risk assessment, objectives, AI impact assessment).
- Clause 7: Support (resources, competence, awareness, documentation).
- Clause 8: Operation (lifecycle, third-party, data management).
- Clause 9: Performance evaluation (monitoring, internal audit, management review).
- Clause 10: Improvement (nonconformity, corrective action, continual improvement).
Certification: ISO/IEC 42001 certification requires an accredited third-party audit. The AI Governance Lead coordinates the certification program, including Stage 1 (documentation review) and Stage 2 (implementation audit), plus annual surveillance audits.
7. Compliance management process
The compliance management process rests on four elements: a requirement register, a central evidence library, periodic gap analysis, and regulatory change monitoring.
7.1 Requirement register
The AI Governance Lead maintains a requirement register mapping each obligation to:
- The specific clause or article number.
- The internal control that addresses it.
- The control owner responsible for implementation.
- The evidence required to demonstrate compliance.
- The current status (compliant, partially compliant, gap, not applicable).
- Remediation plans and deadlines for gaps.
7.2 Evidence library
All compliance evidence is stored centrally with version control, retention tags, access controls, and timestamps. Evidence types include:
- Policies and procedures (this document, related policies).
- Risk assessments and treatment plans.
- Model cards, data sheets, and technical documentation.
- Testing reports (bias, fairness, security, performance).
- Training records and competence assessments.
- Meeting minutes and decision logs.
- Incident reports and post-incident reviews.
- Audit reports and remediation evidence.
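Because every evidence item carries a timestamp, freshness can be checked mechanically. A small sketch of such a check; the 12-month window is an assumed review threshold, not a value set by this policy:

```python
from datetime import date, timedelta

# Assumed freshness window: flag evidence older than 12 months for review.
MAX_AGE = timedelta(days=365)

def stale_evidence(items: list[tuple[str, date]], today: date) -> list[str]:
    """Return names of evidence items whose timestamp exceeds the window."""
    return [name for name, stamped in items if today - stamped > MAX_AGE]
```

A check like this can feed the monthly evidence-freshness review by the AI Governance Lead.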
7.3 Gap analysis
Gap analysis is performed when a new regulation takes effect, when a new AI system enters scope, annually as part of the review cycle, and when audit findings identify control weaknesses. Each gap is assigned a remediation owner and deadline and is tracked to completion.
7.4 Regulatory change monitoring
Legal and Compliance monitor regulatory developments. When a material change is identified, the requirement register is updated, affected control owners are notified, a gap analysis is triggered, and the AI Governance Committee is briefed.
8. Incident reporting obligations
| Framework | Reporting obligation | Timeline |
|---|---|---|
| EU AI Act (Art. 73) | Report serious incidents to market surveillance authority | Without undue delay, no later than 15 days after becoming aware |
| GDPR (Art. 33) | Report personal data breaches to supervisory authority | Within 72 hours of becoming aware |
| ISO/IEC 42001 (Clause 10.2) | Record nonconformities and take corrective action | As part of continual improvement process |
| NIST AI RMF (MG-4) | Document and communicate risk management decisions | Per organizational process |
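The statutory deadlines in the table above can be computed mechanically once the organization becomes aware of an incident. A minimal sketch; note these are only the hard backstops, and "without undue delay" still governs:

```python
from datetime import datetime, timedelta

# Statutory outer deadlines from the reporting obligations table.
DEADLINES = {
    "EU AI Act Art. 73": timedelta(days=15),  # serious incidents
    "GDPR Art. 33": timedelta(hours=72),      # personal data breaches
}

def report_by(framework: str, became_aware: datetime) -> datetime:
    """Latest permissible notification time for a given obligation."""
    return became_aware + DEADLINES[framework]
```

For example, awareness of a personal data breach at noon on 1 January yields a GDPR notification deadline of noon on 4 January.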
9. Roles and responsibilities
| Role | Compliance responsibilities |
|---|---|
| AI Governance Lead | Maintains register, coordinates audits, tracks remediation, prepares reports, manages certification program. |
| Control Owners | Implement controls, produce and update evidence, remediate findings. |
| Legal and Compliance | Monitor regulations, advise on obligations, coordinate regulator interactions. |
| Model Owners | Ensure systems meet requirements, maintain technical documentation, support audits. |
| AI Governance Committee | Approve strategy, review posture, resolve escalations. |
10. Reporting cadence
- Monthly: AI Governance Lead reviews register status and evidence freshness.
- Quarterly: Compliance summary presented to AI Governance Committee.
- Annually: Comprehensive compliance report to executive leadership covering framework coverage, gap trends, audit results, and regulatory outlook.
11. Exceptions
Temporary deferral of a compliance control requires written justification, compensating controls, a remediation deadline, and approval from the AI Governance Lead (or Committee for high-risk controls).
12. Review
This policy is reviewed annually or sooner when triggered by new regulations taking effect, material audit findings, or changes to the organization's AI scope.
Document control
| Field | Value |
|---|---|
| Policy owner | [AI Governance Lead] |
| Approved by | [AI Governance Committee] |
| Effective date | [Date] |
| Next review date | [Date + 12 months] |
| Version | 1.0 |
| Classification | Internal |