v2.4.0

Automated licence compliance scanning prevents the organisation from inadvertently adopting libraries whose licence terms conflict with the system’s deployment model. An ML system that uses an AGPL-licensed library may be obliged to release its own source code; a system that depends on a library with a non-commercial licence cannot be deployed commercially. These conflicts can emerge deep in the dependency tree and remain invisible without automated scanning.

FOSSA and Black Duck provide comprehensive automated licence analysis and conflict detection. For a lightweight approach, pip-licenses enumerates all Python dependency licences for review, and the pre-commit configuration can be set to fail on prohibited licence types (for example, AGPL-3.0 or GPL-3.0 where incompatible with the system’s distribution model).
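As a minimal sketch of the lightweight approach, the script below parses the JSON output of `pip-licenses` and exits non-zero when a dependency's declared licence appears on a prohibited list. The prohibited set and the helper names (`find_violations`, `main`) are illustrative assumptions, not part of any tool's API; align the list with your own distribution model.

```python
"""Sketch of a licence gate built on pip-licenses (assumed policy; adapt as needed)."""
import json
import subprocess
import sys

# Assumed prohibited list -- replace with the licences your deployment model forbids.
PROHIBITED = {
    "GNU Affero General Public License v3 (AGPLv3)",
    "GNU General Public License v3 (GPLv3)",
}


def find_violations(packages, prohibited=PROHIBITED):
    """Return the packages whose declared licence is on the prohibited list.

    `packages` is the parsed output of `pip-licenses --format=json`:
    a list of dicts with at least "Name" and "License" keys.
    """
    return [p for p in packages if p.get("License") in prohibited]


def main():
    # pip-licenses emits a JSON array describing each installed distribution.
    raw = subprocess.run(
        ["pip-licenses", "--format=json"],
        capture_output=True, text=True, check=True,
    ).stdout
    violations = find_violations(json.loads(raw))
    for pkg in violations:
        print(f"PROHIBITED: {pkg['Name']} ({pkg['License']})", file=sys.stderr)
    sys.exit(1 if violations else 0)
```

In CI, `main()` would be invoked from the job's entry point (for example under an `if __name__ == "__main__":` guard), so a single prohibited licence fails the build.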

The licence audit is particularly relevant for AI systems incorporating open-source model components, where licence terms may impose obligations on downstream use. The Technical SME documents the licence audit and retains it as Module 3 evidence. Any licence conflicts identified are resolved before deployment, either by replacing the conflicting dependency or by obtaining appropriate permissions.

Key outputs

  • Licence scanning tool configuration (FOSSA, Black Duck, or pip-licenses)
  • Prohibited licence list aligned with the system’s deployment model
  • CI pipeline integration blocking builds on licence conflicts
  • Module 3 and Module 9 evidence
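One way to wire the CI blocking described above is a local pre-commit hook that invokes pip-licenses directly. This fragment is a hedged sketch: the hook `id` and `name` are arbitrary, and it assumes pip-licenses' `--fail-on` option, which exits non-zero when any listed licence (semicolon-separated) is found.

```yaml
# Sketch of a .pre-commit-config.yaml entry (hypothetical hook id/name).
repos:
  - repo: local
    hooks:
      - id: licence-gate
        name: Fail on prohibited dependency licences
        entry: pip-licenses --fail-on "GNU Affero General Public License v3 (AGPLv3);GNU General Public License v3 (GPLv3)"
        language: system
        pass_filenames: false
```

Running the same hook in the CI pipeline (e.g. via `pre-commit run --all-files`) makes the licence gate a build-blocking check rather than a local-only convention.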