Free template — Article 9

Free Risk Management System Template

An editable Risk Management System aligned with every requirement of Article 9 of the EU AI Act — no sign-up, no consultant fees, no guesswork.

Built for providers of high-risk AI systems: the template covers all seven Article 9 obligations and cites the relevant sub-paragraphs inline, so auditors can trace each section back to the regulation without consultant involvement.

Who must establish a Risk Management System?

Article 9(1) requires every provider of a high-risk AI system to establish, implement, document and maintain a risk management system. Article 9(2) adds that it must run as a continuous iterative process across the entire lifecycle.

  • Providers of high-risk AI systems under Article 6(1) or Article 6(2) — including systems listed in Annex III
  • All such providers, throughout the entire lifecycle of the system — not a one-off deliverable (Article 9(2))
  • Systems in Annex III areas (biometrics, critical infrastructure, education, employment, essential services, law enforcement, migration, justice)
  • Safety components of products in scope of the Annex I Union harmonisation legislation

A Risk Management System is not required for non-high-risk systems, but deployers of high-risk systems still have their own obligations under Article 26 and, where applicable, Article 27 (FRIA).

What the template includes

Seven structured sections aligned one-to-one with the Risk Management generator inside Witness, so you can start on paper and migrate to the online tool without losing work.

  1. Risk identification

    Art. 9(2)(a)

    Known and reasonably foreseeable risks to health, safety and fundamental rights when the system is used in accordance with its intended purpose.

  2. Risk estimation and evaluation

    Art. 9(2)(b)

    Likelihood and severity for intended use and reasonably foreseeable misuse, plus the evaluation methodology.

  3. Risk management measures

    Art. 9(3), 9(5)

    Elimination or reduction, mitigation and control, and deployer information and training measures — considering the effect of their combined application.

  4. Residual risk and deployer communication

    Art. 9(5)

    Residual risks judged acceptable and communicated to deployers, including the relevant overall residual risk.

  5. Testing

    Art. 9(6)–(8)

    Testing procedures, prior defined metrics and probabilistic thresholds, and testing cadence — including testing in real-world conditions where appropriate.

  6. Continuous iteration and review

    Art. 9(2)

    Lifecycle plan, update triggers, review schedule, documentation control and integration with Article 72 post-market monitoring.

  7. Children and vulnerable groups

    Art. 9(9)

    Specific consideration for children and other vulnerable groups where the system is likely to affect them.

Download or preview

Open the template in your browser to preview it, print it straight to PDF from the browser menu, or save it locally to edit offline.

HTML template, approximately 12 kB. Prints cleanly to PDF via your browser.

Upgrade path

The online generator beats a static template

The downloadable template gets you started. The guided Risk Management generator inside Witness takes you the rest of the way.

AI prefill from your classifier answers

Answers from the AI System Classifier flow into the Risk Management sections so you start from an 80 percent draft, not a blank page.

Legal citations on every field

Each field links to the exact Article 9 sub-paragraph it implements, with EUR-Lex references available inline.

Word and PDF export, plus version history

Submit the completed RMS and download a formatted deliverable. Version history captures every iteration, evidencing the documented, continuously maintained process that Article 9(1) and (2) require.

Open the online Risk Management generator

Requires a Witness account with Professional access and active annual maintenance.

Frequently asked questions

Who must establish a Risk Management System?

Article 9(1) requires every provider of a high-risk AI system to establish, implement, document and maintain a risk management system. This applies to high-risk systems classified under Article 6(1) (safety components of products regulated under Annex I harmonisation legislation) and Article 6(2) (systems listed in Annex III). Deployers of high-risk systems have their own obligations under Article 26 but are not themselves required to run an Article 9 RMS for systems they only use.

When must the Risk Management System be in place?

Article 9(2) requires the risk management system to run throughout the entire lifecycle of the high-risk AI system. It is not a one-off deliverable. Providers must establish the RMS before placing the system on the market or putting it into service and must continue to operate it as a continuous iterative process while the system remains available — including during the design, development, deployment and post-market phases.

How often must the Risk Management System be reviewed?

Article 9(2) requires the risk management system to run as a continuous iterative process with regular systematic review and updating, drawing on new testing results, changes in the state of the art, and data gathered through the Article 72 post-market monitoring system (Article 9(2)(c)). There is no fixed minimum frequency in the regulation, but a documented review cadence — typically quarterly or at least annually — combined with event-driven updates (new incident, substantial modification, new deployment context) is the expected standard.

Must we include reasonably foreseeable misuse?

Yes. Article 9(2)(b) explicitly requires providers to estimate and evaluate the risks that may emerge both when the system is used in accordance with its intended purpose and under conditions of reasonably foreseeable misuse. Ignoring misuse scenarios is a common cause of non-compliance findings — providers should document the misuse scenarios considered, why they are considered reasonably foreseeable, and the measures addressing them.

What happens if we skip the Risk Management System?

Failing to establish a compliant risk management system is a breach of Article 9 and of the broader provider obligations under Article 16. Under Article 99(4), non-compliance with provider obligations can attract administrative fines of up to 15,000,000 EUR or, if higher, up to 3 percent of total worldwide annual turnover for the preceding financial year. SMEs are capped at the lower of the two amounts under Article 99(6).
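The fine ceiling above is a simple higher-of-two-figures rule, inverted for SMEs. A minimal sketch of that arithmetic — the function name is ours, and actual fines are set case by case under Article 99(7), so this computes only the statutory ceiling:

```python
def article_99_4_ceiling(turnover_eur: float, sme: bool = False) -> float:
    """Maximum administrative fine for breaching provider obligations:
    15,000,000 EUR or 3% of total worldwide annual turnover, whichever
    is higher (Article 99(4)); for SMEs, whichever is lower (Article 99(6))."""
    fixed = 15_000_000.0
    turnover_based = 0.03 * turnover_eur
    return min(fixed, turnover_based) if sme else max(fixed, turnover_based)
```

For a provider with 1 billion EUR turnover, the turnover-based figure (30,000,000 EUR) exceeds the fixed amount, so it sets the ceiling; for the same turnover as an SME, the fixed 15,000,000 EUR applies instead.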

Is this template legally binding?

No. The template is a structured starting point aligned with Article 9. The legal obligation is to establish, maintain and operate a compliant continuous risk management system covering Article 9(1) to (9) throughout the entire lifecycle of the high-risk AI system; using this particular template is not required by law. Witness does not warrant that filling in this template alone is sufficient to discharge the Article 9 obligation.