
Version-Controlled Code, Data, Model, Config

This artefact encompasses the complete version-controlled estate of the AI system: the code repositories (Git), the data versions (DVC, Delta Lake, or LakeFS), the model registry entries, and the configuration-as-code repositories. Together, these form the evidential backbone of the AISDP’s traceability claims.

The artefact’s compliance value lies in its completeness and its linkage. Each artefact type is version-controlled individually, but the cross-references between them (a model entry referencing its training data version and code commit, a code commit referencing the data version it was validated against) create the navigable traceability chain. The composite version identifier ties these cross-references together at the system level.
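A composite version identifier of the kind described can be sketched as a deterministic hash over the individual artefact versions, so that changing any one component yields a new system-level identifier. The field names and example values below are illustrative assumptions, not part of any standard:

```python
# Sketch: derive one system-level version identifier from the four
# artefact versions. Field names and sample values are illustrative.
import hashlib
import json


def composite_version_id(code_commit: str, data_version: str,
                         model_version: str, config_commit: str) -> str:
    """Hash the artefact versions together; any change to one
    component produces a different composite identifier."""
    payload = json.dumps(
        {
            "code": code_commit,
            "config": config_commit,
            "data": data_version,
            "model": model_version,
        },
        sort_keys=True,  # canonical ordering -> deterministic hash
    )
    return hashlib.sha256(payload.encode("utf-8")).hexdigest()[:16]


system_version = composite_version_id(
    code_commit="9f2c1ab",                       # Git commit (hypothetical)
    data_version="dvc:data/train.dvc@4e7d",      # DVC data version (hypothetical)
    model_version="registry:credit-model@7",     # registry entry (hypothetical)
    config_commit="c0ffee1",                     # config repo commit (hypothetical)
)
```

Because the hash is computed over a canonically ordered serialisation, the same inputs always yield the same identifier, which is what lets the composite ID act as a stable cross-reference key.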

For AISDP purposes, the evidence comprises the version control governance policy and branch protection configuration (Module 2), sample merge request records demonstrating the approval workflow in practice, repository configuration exports, and the version history excerpts for each artefact type. The complete version-controlled estate must be retained for the ten-year period under Article 18.

Key outputs

  • Complete version-controlled code, data, model, and config artefacts
  • Cross-referencing between artefact types
  • Version control governance policy and configuration exports
  • Module 2 and Module 10 evidence

Model Registry with Compliance Metadata

This artefact is the model registry itself, populated with the compliance metadata described above. It represents the authoritative record of every model version that has been trained, evaluated, deployed, or archived throughout the system’s lifecycle.

The registry’s value as a compliance artefact derives from the completeness and accuracy of its metadata. For each production model version, the registry should contain the full provenance chain (data version, code commit, pipeline execution), the complete validation gate results (performance, fairness, robustness, drift), the stage transition history with approval records, and the content hash for integrity verification.
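A minimal registry entry carrying this metadata might look like the following sketch. The schema is an assumption for illustration (real registries such as MLflow use their own field names); the point is that provenance, gate results, and an integrity hash travel together with each version:

```python
# Sketch of a registry entry with compliance metadata. The schema is
# hypothetical; a real registry would define its own fields.
import hashlib
import json
from dataclasses import asdict, dataclass


@dataclass(frozen=True)
class RegistryEntry:
    model_name: str
    version: int
    data_version: str        # provenance: training data version
    code_commit: str         # provenance: code commit
    pipeline_run_id: str     # provenance: pipeline execution
    gate_results: dict       # e.g. {"performance": "pass", ...}
    stage: str = "staging"

    def content_hash(self) -> str:
        """SHA-256 over the canonical JSON form, for integrity checks."""
        blob = json.dumps(asdict(self), sort_keys=True).encode("utf-8")
        return hashlib.sha256(blob).hexdigest()


entry = RegistryEntry(
    model_name="credit-scorer",          # hypothetical model
    version=7,
    data_version="dvc:train@4e7d",
    code_commit="9f2c1ab",
    pipeline_run_id="run-2041",
    gate_results={"performance": "pass", "fairness": "pass",
                  "robustness": "pass", "drift": "pass"},
)
```

Recomputing the content hash at audit time and comparing it to the stored value is what makes the entry verifiable rather than merely asserted.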

The registry content feeds into Module 3 (as evidence of the model architecture and selection rationale) and Module 10 (as the record-keeping foundation for model-related traceability). A worked traceability example, demonstrating end-to-end provenance retrieval for a specific inference, should be prepared and retained as evidence that the traceability chain is functional.
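The worked traceability example can be demonstrated as a simple lookup chain: starting from an inference identifier, resolve the model version that served it, then resolve that version's provenance from the registry. The in-memory stores below are stand-ins for the real inference log and registry API; all identifiers are hypothetical:

```python
# Sketch of end-to-end provenance retrieval for a single inference.
# The dicts stand in for the inference log and the registry API.
REGISTRY = {
    ("credit-scorer", 7): {
        "data_version": "dvc:train@4e7d",
        "code_commit": "9f2c1ab",
        "pipeline_run_id": "run-2041",
    },
}

INFERENCE_LOG = {
    "inf-000123": {"model_name": "credit-scorer", "model_version": 7},
}


def trace_inference(inference_id: str) -> dict:
    """Walk from one logged inference back to the exact model, data,
    and code versions that produced it."""
    record = INFERENCE_LOG[inference_id]
    provenance = REGISTRY[(record["model_name"], record["model_version"])]
    return {**record, **provenance}


chain = trace_inference("inf-000123")
```

Retaining the output of such a retrieval (with real identifiers) as a worked example is what demonstrates that the traceability chain is functional, not just declared.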

Key outputs

  • Populated model registry with compliance metadata per version
  • Stage transition history with approval records
  • Worked traceability example demonstrating end-to-end provenance
  • Module 3 and Module 10 evidence

Substantial Modification Assessment Records

Every change assessed against the substantial modification thresholds produces a determination record. This artefact comprises the collection of all such records, forming the system’s change assessment history.

Each record documents which metrics changed and by how much, the root cause of the change, whether the cumulative baseline comparison was triggered, the determination (substantial modification or not), the rationale for the determination, the evidence reviewed (validation gate reports, baseline comparisons, impact analyses), and, if the change was determined to be a substantial modification, the re-assessment outcome. The records are retained for the ten-year period regardless of whether the determination was positive or negative.
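The determination logic can be sketched as a threshold comparison producing exactly such a record. The metric names and threshold values below are illustrative assumptions; the structure mirrors the record fields described above:

```python
# Sketch of a substantial modification determination. Metric names
# and thresholds are hypothetical examples.
def assess_change(metric_deltas: dict, thresholds: dict) -> dict:
    """Return a determination record for one change, flagging every
    metric whose absolute delta exceeds its configured threshold."""
    breaches = {m: d for m, d in metric_deltas.items()
                if abs(d) > thresholds.get(m, float("inf"))}
    return {
        "metric_deltas": metric_deltas,
        "breached_thresholds": breaches,
        "substantial_modification": bool(breaches),
        "rationale": ("threshold breach on: " + ", ".join(sorted(breaches))
                      if breaches else "all deltas within thresholds"),
    }


record = assess_change(
    metric_deltas={"accuracy": -0.004, "fairness_gap": 0.031},
    thresholds={"accuracy": 0.02, "fairness_gap": 0.02},
)
```

Note that a negative determination (no substantial modification) still produces a record, consistent with the retention requirement that applies regardless of outcome.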

The collection of records serves two purposes. For the organisation, it demonstrates a consistent and documented approach to change assessment, which is a QMS requirement. For regulatory inspectors and notified bodies, it provides transparency into how the system has evolved and how the organisation has governed that evolution. A gap in the assessment records, where changes were made without documented assessment, is a non-conformity.

Key outputs

  • Determination records for every change assessed against thresholds
  • Supporting evidence (gate reports, baseline comparisons, impact analyses)
  • Ten-year retention of all records
  • Module 6 and Module 12 AISDP evidence

Contract Test Results

Contract test results from the consumer-driven and statistical contract testing are retained as evidence that the system’s interfaces are functioning within their documented specifications. Each CI pipeline run produces contract test results; these are stored as pipeline artefacts and referenced in the AISDP.

The artefact comprises the contract definitions themselves (what each consumer expects from each provider), the test execution results for each pipeline run (pass/fail per contract, with details for failures), and any contract violations that were detected and the resolution actions taken. The contracts serve as executable documentation of the system’s interface assumptions; the test results demonstrate that those assumptions are verified continuously.
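A consumer-driven contract check can be sketched as a schema assertion run in CI against a sample provider response. The consumer, provider, and field names below are hypothetical; real deployments typically use a dedicated framework (e.g. Pact) rather than hand-rolled checks:

```python
# Sketch of a consumer-driven contract check. Names are hypothetical;
# a real system would likely use a contract-testing framework.
CONTRACT = {
    "consumer": "decision-service",
    "provider": "feature-store",
    "required_fields": {"customer_id": str, "credit_utilisation": float},
}


def check_contract(contract: dict, sample: dict) -> list:
    """Return a list of violations; an empty list means the
    provider's sample response satisfies the consumer's contract."""
    violations = []
    for name, expected_type in contract["required_fields"].items():
        if name not in sample:
            violations.append(f"missing field: {name}")
        elif not isinstance(sample[name], expected_type):
            violations.append(f"wrong type for {name}: "
                              f"{type(sample[name]).__name__}")
    return violations


ok = check_contract(CONTRACT, {"customer_id": "c-9",
                               "credit_utilisation": 0.42})
bad = check_contract(CONTRACT, {"customer_id": "c-9",
                                "credit_utilisation": "high"})
```

Because the contract dict doubles as the test input, it serves as the executable documentation of interface assumptions that the prose describes.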

For AISDP Module 5, the contract test results complement the model validation gate results by demonstrating that the system’s components interact correctly, not just that the model produces acceptable outputs in isolation. A model that passes all validation gates but receives malformed inputs due to a broken upstream contract may still produce non-compliant outputs in production.

Key outputs

  • Contract definitions version-controlled alongside system code
  • Contract test execution results per CI pipeline run
  • Contract violation logs with resolution actions
  • Module 5 AISDP evidence

Deployment Ledger Entries

The deployment ledger entries are the materialised output of the deployment ledger described above. Each entry records a single deployment event: the before and after system state, the authoriser, the evidence reviewed, and the timestamp. The collection of entries forms the system’s deployment history.

For AISDP Module 10, the most recent deployment entries demonstrate the current system state and the governance that produced it. For Module 12, the complete deployment history provides the change log that tracks the system’s evolution. Inspectors and notified bodies may request deployment ledger entries for specific time periods to understand what changed, when, and under whose authority.

The entries must be immutable once created. Any correction or amendment to a deployment record is itself a new record referencing the original, not an in-place modification. This immutability ensures that the deployment history is a reliable audit trail, not a revisable narrative.
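The append-only discipline can be sketched as follows: every write creates a new entry, and a correction is a fresh entry that references the original by identifier rather than editing it in place. The entry fields and example values are illustrative assumptions:

```python
# Sketch of an append-only deployment ledger. Corrections reference
# the original entry via `amends`; nothing is modified in place.
import hashlib
import itertools
import json

LEDGER = []            # append-only; existing entries are never mutated
_ids = itertools.count(1)


def append_entry(event: dict, amends=None) -> dict:
    """Append one deployment record; an amendment is a new record
    pointing at the entry it corrects."""
    entry = {"id": next(_ids), "amends": amends, **event}
    entry["entry_hash"] = hashlib.sha256(
        json.dumps(entry, sort_keys=True).encode("utf-8")).hexdigest()
    LEDGER.append(entry)
    return entry


original = append_entry({"deployed": "credit-scorer@7",       # hypothetical
                         "authoriser": "a.lee",
                         "timestamp": "2025-03-01T10:00Z"})
correction = append_entry({"deployed": "credit-scorer@7",
                           "authoriser": "a.lee (role corrected)",
                           "timestamp": "2025-03-01T10:00Z"},
                          amends=original["id"])
```

Storing a hash with each entry gives auditors a cheap integrity check, and the `amends` pointer preserves the original record alongside its correction, keeping the history an audit trail rather than a revisable narrative.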

Key outputs

  • Immutable deployment ledger entries per deployment event
  • Before/after state, authoriser, evidence, and timestamp per entry
  • Complete deployment history accessible by time period
  • Module 10 and Module 12 AISDP evidence