Sprint-Level Compliance Activities
The compliance framework integrates with agile practices rather than imposing a waterfall overlay. The Technical Owner embeds compliance activities in the sprint cadence as native tasks. Each sprint includes:
- updating the relevant AISDP modules for design decisions made during the sprint
- running the full test suite (including fairness and robustness gates) as part of the definition of done
- reviewing new risks identified during development and adding them to the risk register
- updating the evidence pack with artefacts produced during the sprint
The sprint retrospective includes a compliance dimension: what evidence was generated, what gaps remain, what risks were introduced. This cadence ensures that compliance evidence accumulates naturally through the development process rather than being assembled retrospectively under time pressure.
Compliance tasks are visible in the sprint backlog, estimated alongside feature work, and tracked through the same workflow. A separate compliance workstream that runs in parallel but is disconnected from the sprint cadence creates a documentation lag that compounds over successive sprints.
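As an illustrative sketch (not a prescribed tool), compliance tasks can be modelled as first-class backlog items whose completion gates the definition of done. The `SprintTask` and `Sprint` types below, and the example task names, are assumptions made for this example:

```python
from dataclasses import dataclass, field

@dataclass
class SprintTask:
    title: str
    is_compliance: bool = False  # AISDP update, risk review, evidence-pack addition, ...
    done: bool = False

@dataclass
class Sprint:
    number: int
    tasks: list[SprintTask] = field(default_factory=list)

    def definition_of_done_met(self) -> bool:
        # Compliance tasks gate sprint completion exactly like feature work.
        return all(t.done for t in self.tasks)

    def open_compliance_tasks(self) -> list[str]:
        return [t.title for t in self.tasks if t.is_compliance and not t.done]

sprint = Sprint(3, [
    SprintTask("Implement scoring endpoint", done=True),
    SprintTask("Update AISDP Module 3 for new architecture", is_compliance=True),
    SprintTask("Add fairness-gate results to evidence pack",
               is_compliance=True, done=True),
])
assert not sprint.definition_of_done_met()  # an AISDP update is still open
```

Because compliance items live in the same structure as feature tasks, they are estimated, prioritised, and burned down through the same workflow, which is the point of the pattern.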
Key outputs
- Compliance tasks embedded in sprint backlog and definition of done
- Per-sprint AISDP updates and evidence pack additions
- Sprint retrospective compliance dimension
- Continuous evidence accumulation through development
Incremental AISDP Assembly
The AI System Assessor assembles the AISDP incrementally throughout development, beginning from Phase 1. Module 1 (System Identity) is completed during Phase 1. Module 6 (Risk Management) is drafted during Phase 2 and updated continuously. Module 3 (Architecture) is populated during Phase 3 and refined as the architecture evolves. Module 4 (Data Governance) grows as the data engineering work progresses.
By the time Phase 5 arrives, the AISDP should be substantially complete, requiring only final review and consistency checking. This incremental approach avoids the common failure mode of attempting to write the entire AISDP in the weeks before deployment, when time pressure leads to superficial documentation and missed requirements.
The module-by-phase mapping provides a clear schedule: each module has a phase in which it is primarily authored, subsequent phases in which it is refined, and a final review in Phase 5. The Conformity Assessment Coordinator tracks module completion status against this mapping, flagging modules that fall behind schedule.
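A minimal sketch of how the Coordinator's tracking might work, assuming the module-to-phase mapping above (Module 4's authoring phase is an assumption for this example, since the text only says it grows with the data engineering work):

```python
# Primary authoring phase per module, drawn from the schedule above.
# Module 4's phase is an assumption made for illustration.
AUTHORING_PHASE = {
    "Module 1 (System Identity)": 1,
    "Module 6 (Risk Management)": 2,
    "Module 3 (Architecture)": 3,
    "Module 4 (Data Governance)": 3,
}

def behind_schedule(current_phase: int, drafted: set[str]) -> list[str]:
    """Modules whose primary authoring phase has arrived or passed but
    that have no draft yet -- the list the Coordinator would flag."""
    return sorted(m for m, phase in AUTHORING_PHASE.items()
                  if phase <= current_phase and m not in drafted)

drafted = {"Module 1 (System Identity)", "Module 6 (Risk Management)"}
print(behind_schedule(3, drafted))  # flags both Phase-3 modules
```

The same check, run at every phase boundary, turns "substantially complete by Phase 5" from an aspiration into a tracked metric.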
Key outputs
- Module-by-phase authoring schedule
- Substantially complete AISDP by Phase 5
- Completion status tracking by Conformity Assessment Coordinator
- Avoidance of last-minute documentation sprint
Feature Flags Within Compliance Boundaries
Agile teams frequently use feature flags to deploy partially complete features behind toggles. For high-risk AI systems, feature flags operate within the compliance framework. A flag that enables a new model version, a new data source, or a new decision pathway is a system change that the AI System Assessor assesses against the substantial modification thresholds (Article 3(23)).
Feature flag configuration is version-controlled alongside the system’s code and configuration. The engineering team logs each flag activation in the deployment ledger. The assessment against substantial modification thresholds occurs before the flag is activated in production, not after; a flag activation that constitutes a substantial modification triggers the conformity re-assessment pathway.
Feature flags that control non-AI aspects of the system (UI presentation, logging verbosity, non-model configuration) are managed through standard engineering governance and do not require compliance assessment.
Key outputs
- Feature flags assessed against substantial modification thresholds before activation
- Version-controlled flag configuration
- Flag activations logged in deployment ledger
- Non-AI flags excluded from compliance assessment
Continuous Conformity Assessment: CI/CD as Automated Checking
The organisation checks conformity continuously throughout development, spreading the assessment workload across the full lifecycle. The CI/CD pipeline’s quality gates provide automated continuous checking: every commit triggers static analysis, testing, and compliance verification. Manual assessment activities (documentation review, evidence verification) are conducted by the Conformity Assessment Coordinator at defined milestones rather than concentrated at the end.
This approach reduces the risk of discovering fundamental non-conformities late in the development cycle, when remediation is costly and time-constrained. A non-conformity identified in sprint 3 costs a fraction of what the same non-conformity would cost if it first surfaced during Phase 5's formal assessment.
The continuous checking model complements the formal Annex VI assessment; it does not replace it. The formal assessment in Phase 5 examines the complete AISDP against all requirements. Continuous checking makes it unlikely that the formal assessment uncovers surprises.
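The automated side of this model can be sketched as a simple gate runner. The individual gate commands shown in the example wiring (`ruff`, `pytest`, `scripts/check_compliance.py`) are placeholder assumptions for whatever static-analysis, test, and compliance-check tooling the pipeline actually uses:

```python
import subprocess
import sys

def run_gates(gates: list[tuple[str, list[str]]]) -> bool:
    """Run each quality gate in order; stop at the first failure so the
    commit is rejected before later, more expensive gates run."""
    for label, cmd in gates:
        if subprocess.run(cmd).returncode != 0:
            print(f"Gate failed: {label}", file=sys.stderr)
            return False
    return True

# Example wiring for a per-commit pipeline (commands are hypothetical):
gates = [
    ("static analysis", ["ruff", "check", "."]),
    ("tests incl. fairness and robustness", ["pytest", "-q"]),
    ("compliance verification", ["python", "scripts/check_compliance.py"]),
]
```

Running this on every commit gives the automated half of continuous conformity checking; the milestone-based manual reviews remain the Coordinator's responsibility.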
Key outputs
- Automated conformity checking through CI/CD quality gates
- Milestone-based manual assessment throughout development
- Early non-conformity detection reducing remediation cost
- Continuous checking complementing (not replacing) formal Annex VI assessment