v2.4.0

Taxonomy Overview

The 61 artefacts sit along a spectrum from pure engineering output to formal legal instrument. Five categories capture the meaningful distinctions. Artefacts on the left of the spectrum are generated as byproducts of building and running the system; artefacts on the right carry legal force or are submitted directly to regulators.

Category A (engineering work-product, 16 artefacts) contains items generated automatically by the development and operations pipeline. These exist because engineers build systems, not because regulators require documentation. Pipeline execution logs, model registry entries, and SBOMs fall here.

Category B (compliance evidence, 14 artefacts) contains work-products specifically retained, structured, or assembled to substantiate a regulatory claim. Often auto-generated, their format, retention period, and traceability metadata are shaped by compliance requirements. Evidence packs, distributional analysis reports, and dataset documentation belong here.

Category C (governance decision records, 12 artefacts) contains internal records of decisions, approvals, risk acceptance, and ongoing management. These document the organisation’s reasoning rather than the system’s technical state. The CDR, risk register, and residual risk sign-offs sit in this category.

Category D (assessment records, 8 artefacts) contains artefacts of the formal conformity assessment process. These are produced during or for the structured evaluation that precedes the Declaration of Conformity. Assessment plans, checklists, and the assessment report belong here.

Category E (regulatory instruments, 11 artefacts) contains documents with legal force: submitted to authorities, shared with deployers as formal outputs, or directly referenced by regulation. The Declaration of Conformity, serious incident reports, and the AISDP itself fall in this category.

Key outputs

  • Five-category taxonomy with placement criteria
  • Per-category character description
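
The five categories, their counts, and their example artefacts can be captured in a small data structure. A minimal sketch: the category names, counts, and examples come from the overview above, while the class and field names are illustrative, not part of the AISDP itself.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Category:
    """One taxonomy category on the engineering-to-regulatory spectrum."""
    code: str
    name: str
    artefact_count: int
    examples: tuple[str, ...]


# Counts and examples as given in the taxonomy overview.
TAXONOMY = (
    Category("A", "engineering work-product", 16,
             ("pipeline execution logs", "model registry entries", "SBOMs")),
    Category("B", "compliance evidence", 14,
             ("evidence packs", "distributional analysis reports",
              "dataset documentation")),
    Category("C", "governance decision records", 12,
             ("CDR", "risk register", "residual risk sign-offs")),
    Category("D", "assessment records", 8,
             ("assessment plans", "checklists", "assessment report")),
    Category("E", "regulatory instruments", 11,
             ("Declaration of Conformity", "serious incident reports", "AISDP")),
)

# The per-category counts must sum to the full artefact set of 61.
assert sum(c.artefact_count for c in TAXONOMY) == 61
```

The trailing assertion is a cheap consistency check: if an artefact is added to or moved between categories, the counts must be updated together or the check fails.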

Spectrum Visualisation

The spectrum runs from engineering work-product (left) to regulatory instrument (right). Each column represents one taxonomy category; artefacts within a column share the same fundamental character.

Category A artefacts would exist in some form even without the EU AI Act; they are CI/CD pipeline byproducts. Category B artefacts would exist in a less structured form; the ten-year retention, traceability metadata, and formal report structure are compliance additions. Category C artefacts would not exist without a compliance obligation; they document deliberate organisational choices. Category D artefacts exist solely because the Act mandates conformity assessment before market placement. Category E artefacts carry legal force; errors in these documents trigger Article 99 penalty exposure.

The largest clusters appear at the engineering end (16 in Category A, 14 in Category B), reflecting the AISDP’s design philosophy that compliance artefacts are generated as CI/CD automation byproducts rather than standalone documentation exercises.

Key outputs

  • Five-column spectrum mapping all 61 artefacts
  • Per-category artefact counts

Distribution and Design Philosophy

The distribution is roughly balanced. Engineering work-product (16) and compliance evidence (14) together account for 30 artefacts, reflecting the principle that compliance documentation should emerge from the engineering workflow rather than require a separate documentation effort.

Governance decision records (12) are the most legally consequential cluster in practice. A competent authority examining a system will look first at the AISDP and Declaration (Category E), but its investigation into organisational culpability will focus on the CDR, risk register, residual risk sign-offs, and provider boundary determination (Category C). These artefacts document what the organisation knew, when it knew it, and what judgements it made.

Assessment records (8) form the smallest category but carry disproportionate weight. They are the direct evidentiary basis for the Declaration of Conformity. A deficient assessment report undermines the Declaration itself.

Regulatory instruments (11) are the only artefacts that leave the organisation. They are seen by deployers, competent authorities, notified bodies, market surveillance authorities, and affected persons. Every other artefact exists behind the organisation’s walls until an authority requests access.

Key outputs

  • Distribution analysis across five categories
  • Per-category regulatory significance assessment

Boundary Cases

Three artefacts sit at the boundary between categories and could reasonably be placed in either. Each placement decision is documented with the rationale.

The threat model (placed in B, could be A) presents the first boundary case. Threat modelling is standard security practice, placing it close to engineering work-product. The combined STRIDE/ATLAS/OWASP/PASTA framework and fundamental-rights impact scoring, however, are compliance-driven additions that push it into Category B.

Break-glass procedure documentation (placed in E, could be C) sits at the second boundary. The operational motivation for shutdown procedures is strong, suggesting Category C governance. The design is directly shaped by Article 14’s requirement for the ability to stop, disable, or intervene, and the procedures are referenced in the Instructions for Use shared with deployers. This regulatory character places them in Category E.

The FRIA report (placed in E, could be C) marks the third boundary. An internal assessment in character, the FRIA resembles other Category C governance records. Article 27(4) requires notification to the market surveillance authority, however, transforming it from an internal document into one with regulatory consequences. This notification requirement places it in Category E.

Key outputs

  • Three boundary case analyses with placement rationale
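
Since each boundary placement must be documented with its rationale, the decisions lend themselves to a simple record type. A hedged sketch: the three cases and their reasoning are taken from the section above; the `BoundaryCase` class and its field names are illustrative choices, not AISDP-defined structures.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class BoundaryCase:
    """A documented placement decision for an artefact near a category boundary."""
    artefact: str
    placed_in: str     # category actually chosen
    alternative: str   # category it could reasonably occupy
    rationale: str


BOUNDARY_CASES = (
    BoundaryCase(
        "threat model", placed_in="B", alternative="A",
        rationale=("Standard security practice, but the combined "
                   "STRIDE/ATLAS/OWASP/PASTA framework and fundamental-rights "
                   "impact scoring are compliance-driven additions.")),
    BoundaryCase(
        "break-glass procedure documentation", placed_in="E", alternative="C",
        rationale=("Operationally motivated, but directly shaped by Article 14 "
                   "and referenced in the Instructions for Use shared with "
                   "deployers.")),
    BoundaryCase(
        "FRIA report", placed_in="E", alternative="C",
        rationale=("An internal assessment in character, but the Article 27(4) "
                   "notification to the market surveillance authority gives it "
                   "regulatory consequences.")),
)
```

Keeping the rationale next to the placement makes the decision auditable: a reviewer can see both the chosen category and the alternative that was considered and rejected.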

Using the Taxonomy

The taxonomy serves three practical purposes.

First, it determines collection methodology: Category A artefacts are auto-generated by CI/CD pipelines, Category B artefacts are automated with human review or structured templates, Category C artefacts require human judgement, Category D artefacts follow the assessment workflow, and Category E artefacts require legal review and formal approval.

Second, it guides retention and storage decisions. Categories A and B share the ten-year retention obligation under Article 18; their storage tiers reflect access frequency. Category C artefacts require immutable storage with audit trails because they document organisational knowledge. Category D artefacts are retained for ten years as the evidentiary basis for the Declaration. Category E artefacts have jurisdiction-specific requirements including translation and accessibility.

Third, it informs quality review cycles. Category A artefacts are validated by pipeline gates; human review is exception-based. Category B artefacts undergo periodic completeness and freshness checks. Category C artefacts are reviewed at governance gates and quarterly. Category D artefacts are reviewed during the assessment cycle. Category E artefacts undergo legal review before each issuance.

Key outputs

  • Per-category collection methodology guidance
  • Per-category retention and storage guidance
  • Per-category quality review cycle guidance
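
The three per-category dimensions above (collection, retention, review) form a natural lookup table. A minimal sketch, assuming nothing beyond the section text: the `POLICY` mapping and `handling_policy` helper are illustrative names, and the string values paraphrase the guidance above rather than quote normative AISDP wording.

```python
# Per-category handling policies summarised from the "Using the Taxonomy" section.
POLICY = {
    "A": {"collection": "auto-generated by CI/CD pipelines",
          "retention": "ten years (Article 18); storage tier by access frequency",
          "review": "pipeline gates; human review is exception-based"},
    "B": {"collection": "automated with human review or structured templates",
          "retention": "ten years (Article 18); storage tier by access frequency",
          "review": "periodic completeness and freshness checks"},
    "C": {"collection": "human judgement",
          "retention": "immutable storage with audit trails",
          "review": "governance gates and quarterly review"},
    "D": {"collection": "assessment workflow",
          "retention": "ten years, as evidentiary basis for the Declaration",
          "review": "reviewed during the assessment cycle"},
    "E": {"collection": "legal review and formal approval",
          "retention": "jurisdiction-specific (translation, accessibility)",
          "review": "legal review before each issuance"},
}


def handling_policy(category: str) -> dict[str, str]:
    """Return the collection, retention, and review policy for a category code."""
    return POLICY[category]
```

A table like this lets tooling (for example, a pipeline step that routes a newly produced artefact) resolve the required handling from the artefact's category code alone.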