Free template — Article 9
An editable Risk Management System aligned with every requirement of Article 9 of the EU AI Act — no sign-up, no consultant fees, no guesswork.
Built for providers of high-risk AI systems: the template covers all seven Article 9 obligations and cites the relevant sub-paragraphs inline, so auditors can trace each section back to the regulation.
Article 9(1) requires every provider of a high-risk AI system to establish, implement, document and maintain a risk management system. Article 9(2) adds that it must run as a continuous iterative process across the entire lifecycle.
A Risk Management System is not required for non-high-risk systems, but deployers of high-risk systems still have their own obligations under Article 26 and, where applicable, Article 27 (FRIA).
Seven structured sections aligned one-to-one with the Risk Management generator inside Witness, so you can start on paper and migrate to the online tool without losing work.
Known and reasonably foreseeable risks to health, safety and fundamental rights when the system is used in accordance with its intended purpose.
Likelihood and severity for intended use and reasonably foreseeable misuse, plus the evaluation methodology.
Elimination, reduction, mitigation and information measures — considering the effect of their combined application.
Residual risks judged acceptable and communicated to deployers, including the relevant overall residual risk.
Testing procedures, performance metrics, prior defined probabilistic thresholds and testing cadence — including real-world testing where appropriate.
Lifecycle plan, update triggers, review schedule, documentation control and integration with Article 72 post-market monitoring.
Specific consideration for children and other vulnerable groups where the system is likely to affect them.
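The estimation step in the list above is commonly operationalised as a likelihood-times-severity matrix. A minimal Python sketch of that methodology, purely illustrative: the 1 to 5 scales, the thresholds, and the example risks are assumptions for the sketch, not part of this template or of Article 9.

```python
# Illustrative risk-estimation sketch: score each identified risk by
# likelihood x severity on a 1-5 scale, the kind of methodology an
# Article 9(2)(b) evaluation section might document. All scales,
# thresholds and field names here are hypothetical examples.

def risk_score(likelihood: int, severity: int) -> int:
    """Combine a 1-5 likelihood and a 1-5 severity into one score."""
    assert 1 <= likelihood <= 5 and 1 <= severity <= 5
    return likelihood * severity

def risk_level(score: int) -> str:
    """Map a combined score to an acceptability band (example cut-offs)."""
    if score >= 15:
        return "unacceptable"  # eliminate or reduce by design (cf. Art. 9(5)(a))
    if score >= 8:
        return "tolerable"     # mitigate and communicate residual risk
    return "acceptable"        # document as accepted residual risk

# Example register rows covering intended use and foreseeable misuse,
# as Article 9(2)(b) requires both to be estimated and evaluated.
register = [
    {"risk": "biased ranking of applicants", "use": "intended",
     "likelihood": 3, "severity": 5},
    {"risk": "use on minors outside intended purpose", "use": "foreseeable misuse",
     "likelihood": 2, "severity": 4},
]

for entry in register:
    score = risk_score(entry["likelihood"], entry["severity"])
    print(f'{entry["risk"]} ({entry["use"]}): score {score}, {risk_level(score)}')
```

Whatever scale is chosen, the key Article 9(2)(b) point is that the methodology itself, not just its outputs, is written down and applied to both intended use and reasonably foreseeable misuse.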
Open the template in your browser to preview it, print it straight to PDF from the browser menu, or save it locally to edit offline.
HTML template, approximately 12 kB. Prints cleanly to PDF via your browser.
Free Risk Management System Template
1. Risk identification
Art. 9(2)(a)
2. Risk estimation and evaluation
Art. 9(2)(b)
3. Risk management measures
Art. 9(3), 9(5)
4. Residual risk and deployer communication
Art. 9(5)
5. Testing and validation
Art. 9(6)–(8)
6. Lifecycle review and updates
Art. 9(2)
7. Vulnerable groups
Art. 9(9)
Upgrade path
The downloadable template gets you started. The guided Risk Management generator inside Witness takes you the rest of the way.
Answers from the AI System Classifier flow into the Risk Management sections so you start from an 80 percent draft, not a blank page.
Each field links to the exact Article 9 sub-paragraph it implements, with EUR-Lex references available inline.
Submit the completed RMS and download a formatted deliverable. Version history captures every iteration — mandatory under Article 9(2).
Requires a Witness account with Professional access and active annual maintenance.
Article 9(1) requires every provider of a high-risk AI system to establish, implement, document and maintain a risk management system. This applies to high-risk systems classified under Article 6(1) (safety components of products regulated under Annex I harmonisation legislation) and Article 6(2) (systems listed in Annex III). Deployers of high-risk systems have their own obligations under Article 26 but are not themselves required to run an Article 9 RMS for systems they only use.
Article 9(2) requires the risk management system to run throughout the entire lifecycle of the high-risk AI system. It is not a one-off deliverable. Providers must establish the RMS before placing the system on the market or putting it into service and must continue to operate it as a continuous iterative process while the system remains available — including during the design, development, deployment and post-market phases.
Article 9(2) requires the risk management system to be regularly and systematically reviewed and updated based on new testing results, changes in the state of the art, and data gathered through the Article 72 post-market monitoring system. There is no fixed minimum frequency in the regulation, but a documented review cadence — typically quarterly or at least annually — combined with event-driven updates (new incident, substantial modification, new deployment context) is the expected standard.
Yes. Article 9(2)(b) explicitly requires providers to estimate and evaluate the risks that may emerge both when the system is used in accordance with its intended purpose and under conditions of reasonably foreseeable misuse. Ignoring misuse scenarios is a common cause of non-compliance findings — providers should document the misuse scenarios considered, why they are considered reasonably foreseeable, and the measures addressing them.
Failing to establish a compliant risk management system is a breach of Article 9 and of the broader provider obligations under Article 16. Under Article 99(4), non-compliance with provider obligations can attract administrative fines of up to 15,000,000 EUR or, if higher, up to 3 percent of total worldwide annual turnover for the preceding financial year. SMEs are capped at the lower of the two amounts under Article 99(6).
No. The template is a structured starting point aligned with Article 9. The legal obligation is to establish, maintain and operate a compliant continuous risk management system covering Article 9(1) to (9) throughout the entire lifecycle of the high-risk AI system; using this particular template is not required by law. Witness does not warrant that filling in this template alone is sufficient to discharge the Article 9 obligation.