EU AI Act, Article 11 and Annex IV — Template for providers of high-risk AI systems
Provide a general description of the AI system — intended purpose, provider, version, interaction with hardware/software, forms of distribution, hardware requirements, user interface, instructions for use, and foreseeable misuse.
Cover Annex IV(1)(a)-(h), plus the reasonably foreseeable misuse scenarios and prevention measures referred to in Article 13(3)(b)(iii).
Describe how the AI system was developed: methods, design choices, architecture, training data (if ML-based), validation and testing, metrics, bias detection, and cybersecurity measures per Article 15.
Cover Annex IV(2)(a)-(h): development methods, design specifications, system architecture, data requirements, human oversight assessment, predetermined changes, validation/testing, and cybersecurity.
Describe capabilities and limitations, expected accuracy, foreseeable unintended outcomes and risks, technical measures for human oversight per Article 14, input data specifications, and logging capabilities per Article 12.
Cover Annex IV(3)(a)-(b): performance characteristics, human-machine interface, and logs required for traceability.
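The logging description above can be supported by a concrete sketch of what event records might contain. The class below is a minimal, illustrative append-only log; the field names (`input_ref`, `model_version`, etc.) are assumptions for illustration, not fields prescribed by Article 12 or Annex IV.

```python
import json
import datetime


class InferenceLog:
    """Minimal append-only event log sketch for Article 12-style traceability.

    Records a timestamp, an input reference and the output for each inference
    so that periods of use can be reconstructed later. Field names are
    illustrative, not prescribed by the Regulation.
    """

    def __init__(self):
        self.records = []

    def record(self, input_ref: str, output, model_version: str) -> dict:
        entry = {
            "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
            "input_ref": input_ref,  # reference to the input data, not the data itself
            "output": output,
            "model_version": model_version,
        }
        self.records.append(entry)
        return entry

    def export(self) -> str:
        # JSON Lines export, e.g. for hand-over to a market surveillance authority
        return "\n".join(json.dumps(r) for r in self.records)
```

In practice the log would be written to tamper-evident storage with a retention period matching Article 19; the in-memory list here only illustrates the record structure.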
Explain why the chosen performance metrics are appropriate for this specific AI system and its intended purpose.
Justify the relevance of the metrics chosen for accuracy, robustness and discriminatory-impact monitoring.
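One way to make the metric justification concrete is to show how each metric is computed. The sketch below gives plain-Python definitions of classification accuracy and a disparate impact ratio (the four-fifths rule heuristic often used in discriminatory-impact monitoring); both the metric choice and the heuristic are illustrative assumptions, not requirements of the Act.

```python
def accuracy(y_true: list, y_pred: list) -> float:
    """Share of predictions matching the ground-truth labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)


def disparate_impact_ratio(y_pred: list, group: list) -> float:
    """Ratio of the lowest to the highest positive-outcome rate across groups.

    Values below 0.8 are commonly treated as a red flag (four-fifths rule);
    that threshold is an industry heuristic, not one set by the Regulation.
    """
    rates = {}
    for g in set(group):
        preds = [p for p, gg in zip(y_pred, group) if gg == g]
        rates[g] = sum(preds) / len(preds)
    lo, hi = min(rates.values()), max(rates.values())
    return lo / hi if hi else 1.0
```

For a given system the documentation would state why these particular metrics match the intended purpose, e.g. why a rate ratio rather than a rate difference is the appropriate fairness measure for the deployment context.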
Provide a detailed description of the risk management system established, implemented, documented and maintained in accordance with Article 9.
Cover risk identification, estimation and evaluation, risk management measures, and the continuous iterative process across the AI system's lifecycle.
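A risk register entry from such a system might look like the sketch below: a likelihood-times-severity score with an acceptance check. The 1-5 scales and the acceptance threshold are illustrative assumptions; Article 9 requires a documented, iterative process but does not prescribe a scoring scheme.

```python
from dataclasses import dataclass, field


@dataclass
class Risk:
    """One entry in a risk register sketch for an Article 9-style process."""
    description: str
    likelihood: int  # 1 (rare) .. 5 (frequent) - illustrative scale
    severity: int    # 1 (negligible) .. 5 (critical) - illustrative scale
    mitigations: list = field(default_factory=list)

    @property
    def score(self) -> int:
        return self.likelihood * self.severity


def acceptable(risk: Risk, threshold: int = 6) -> bool:
    # The threshold is an illustrative acceptance criterion chosen by the
    # provider, not a value set by the Act.
    return risk.score <= threshold
```

After each mitigation measure is applied, the entry would be re-scored and the residual risk re-evaluated, reflecting the continuous iterative process the section describes.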
Describe any relevant changes made by the provider to the system over its lifecycle.
Include model updates, retraining, new data sources, configuration changes and any substantial modifications under Article 3(23).
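A change history of this kind can be kept as structured records so that substantial modifications are easy to extract for the documentation. The record shape below is a hypothetical sketch; whether a change is "substantial" remains a judgement call against the Article 3(23) definition, recorded here as a boolean flag.

```python
from dataclasses import dataclass


@dataclass
class Change:
    """One lifecycle change record in a hypothetical provider change log."""
    version: str
    date: str         # ISO date of the change
    summary: str
    substantial: bool  # provider's assessment against the Article 3(23) definition


def substantial_modifications(changes: list) -> list:
    """Changes the provider has assessed as substantial modifications."""
    return [c for c in changes if c.substantial]
```

The filtered list would then drive any re-assessment obligations, while the full history stays in the technical documentation.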
List the harmonised standards applied in full or in part. If no harmonised standards were applied, describe the solutions adopted to meet the Chapter III, Section 2 requirements and list other standards and technical specifications applied.
Reference CEN/CENELEC harmonised standards, common specifications under Article 41, or alternative technical solutions.
Provide a copy of the EU declaration of conformity drawn up in accordance with Article 47.
The declaration must identify the AI system, the provider, cite the regulation, confirm conformity with the Chapter III, Section 2 requirements, and be signed by the provider.
Provide a detailed description of the system in place to evaluate the AI system's performance in the post-market phase, including the post-market monitoring plan referred to in Article 72(3).
Describe data sources, monitoring frequency, incident reporting under Article 73 and how findings feed back into the risk management system.
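One common way to operationalise the monitoring described above is input-distribution drift detection. The sketch below computes a population stability index (PSI) between baseline and production bin proportions; PSI as the chosen statistic and the 0.2 alert threshold are illustrative assumptions, not values set by the Regulation, and an alert would feed back into the Article 9 risk management process.

```python
import math


def population_stability_index(expected: list, actual: list) -> float:
    """PSI between baseline and production proportions over the same bins.

    A common heuristic treats PSI > 0.2 as significant drift; the threshold
    is illustrative, not set by the Regulation.
    """
    eps = 1e-6  # avoid log(0) for empty bins
    return sum(
        (a - e) * math.log((a + eps) / (e + eps))
        for e, a in zip(expected, actual)
    )


def drift_alert(expected: list, actual: list, threshold: float = 0.2) -> bool:
    """True if drift exceeds the provider-chosen alert threshold."""
    return population_stability_index(expected, actual) > threshold
```

In a post-market monitoring plan, the bin proportions would be recomputed at the stated monitoring frequency, and a triggered alert would open a finding that is evaluated, and where relevant reported, under Article 73.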