v2.4.0

Phase 4 builds the system in accordance with the approved architecture, with compliance evidence generated as a natural byproduct of the engineering workflow. Development uses version-controlled code, model, and data artefacts. The CI/CD pipeline enforces quality gates at every commit: static analysis (including AI-specific rules), unit testing, contract testing, dependency and licence scanning, and secret detection.
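The commit-time quality gates described above can be sketched as a simple gate runner where any single failure blocks promotion. This is a minimal illustration; the gate names and the pass/fail callables are assumptions, not a prescribed pipeline implementation.

```python
# Illustrative sketch of commit-time quality gates; gate names are
# assumptions standing in for real scanners and test suites.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Gate:
    name: str
    check: Callable[[], bool]  # returns True when the gate passes

def run_gates(gates: list[Gate]) -> list[str]:
    """Run every gate and return the names of those that failed.
    A single failure is enough to block the commit."""
    return [g.name for g in gates if not g.check()]

# Hypothetical checks; in practice these would invoke the actual tools.
gates = [
    Gate("static-analysis", lambda: True),
    Gate("unit-tests", lambda: True),
    Gate("secret-detection", lambda: True),
]
failures = run_gates(gates)
print("commit blocked" if failures else "commit allowed", failures)
```

Running every gate (rather than stopping at the first failure) gives the developer the full list of violations in one pass.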

Data engineering follows the pre-step/post-step capture methodology, with each transformation documented before execution and verified after. Model training, validation, and testing follow the documented methodology; performance, fairness, robustness, and calibration metrics are computed and recorded. The model validation gate blocks promotion of any model that fails AISDP-declared thresholds. The human oversight interface is developed with automation bias countermeasures, mandatory review workflows, and override capability.
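The model validation gate above compares recorded metrics against declared thresholds and blocks promotion on any violation. The sketch below assumes hypothetical metric names and threshold values purely for illustration; the actual thresholds are those declared in the AISDP.

```python
# Minimal sketch of a model validation gate. Metric names and threshold
# values are illustrative assumptions, not AISDP-mandated figures.
THRESHOLDS = {"accuracy": 0.90, "fairness_gap": 0.05, "calibration_error": 0.03}

def validation_gate(metrics: dict[str, float]) -> tuple[bool, list[str]]:
    """Return (promote?, violations). Metrics ending in '_gap' or
    '_error' are treated as lower-is-better; others as higher-is-better."""
    violations = []
    for name, limit in THRESHOLDS.items():
        value = metrics[name]
        lower_is_better = name.endswith(("_gap", "_error"))
        ok = value <= limit if lower_is_better else value >= limit
        if not ok:
            violations.append(f"{name}={value} vs threshold {limit}")
    return (not violations, violations)

promote, why_not = validation_gate(
    {"accuracy": 0.92, "fairness_gap": 0.07, "calibration_error": 0.02}
)
print("promoted" if promote else "blocked", why_not)
```

In this example the fairness gap exceeds its threshold, so the model is blocked even though accuracy passes, which is exactly the promotion-blocking behaviour the gate is meant to enforce.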

Cybersecurity testing is integrated throughout: SAST and DAST in the pipeline, dependency scanning, container image scanning, infrastructure-as-code scanning, and adversarial ML testing. Phase 4 continuously produces evidence: version-controlled artefacts, automated test reports, auto-generated model cards, data quality reports, training pipeline logs, and cybersecurity scan results.
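One common way to wire the scan results into the pipeline is to aggregate findings from all tools and fail the build above a severity cutoff. The severity levels and blocking policy below are assumptions for illustration, not requirements from this document.

```python
# Sketch: aggregate scan findings and block the build on high severity.
# The severity scale and "block at high" policy are illustrative assumptions.
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def pipeline_verdict(findings: list[dict], block_at: str = "high") -> bool:
    """True if the build may proceed: no finding at or above `block_at`."""
    limit = SEVERITY_RANK[block_at]
    return all(SEVERITY_RANK[f["severity"]] < limit for f in findings)

findings = [
    {"tool": "SAST", "severity": "medium"},
    {"tool": "dependency-scan", "severity": "low"},
]
print(pipeline_verdict(findings))  # no high/critical findings -> True
```

Keeping the cutoff as a parameter lets the security review gate tighten the policy (e.g. block at "medium") without changing pipeline code.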

Key outputs

  • Version-controlled code, model, and data artefacts with full audit trail
  • Automated test reports (unit, integration, regression, fairness, robustness)
  • Model cards, data quality reports, cybersecurity scan results
  • Gates: model validation (automated), security review (manual), integration test pass
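The auto-generated model cards listed above can be assembled directly from the metrics recorded during training and validation. The field set in this sketch is an assumption loosely following common model-card practice, not a schema mandated by this document.

```python
# Illustrative model-card generator; the field names are assumptions,
# not a mandated schema. Inputs would come from the training pipeline.
import json

def build_model_card(name: str, version: str, metrics: dict[str, float],
                     data_sources: list[str]) -> str:
    """Serialise a minimal model card as version-controllable JSON."""
    card = {
        "model": name,
        "version": version,
        "evaluation_metrics": metrics,
        "training_data": data_sources,
    }
    return json.dumps(card, indent=2)

print(build_model_card("example-model", "1.3.0",
                       {"accuracy": 0.92}, ["dataset-2024"]))
```

Emitting the card as plain JSON keeps it diffable, so each model version's card lands in the same audit trail as the code and data artefacts.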