v2.4.0

The stochastic nature of LLM outputs, where the same input may produce different outputs across invocations, requires specific attention in the context of Article 15’s accuracy requirements and Article 12’s logging requirements. The AISDP must document the controls applied to manage this stochasticity.

For accuracy compliance under Article 15, the organisation must establish that the system’s outputs fall within acceptable bounds despite stochasticity. Temperature clamping reduces output variance; setting temperature to zero or near-zero produces more deterministic behaviour, though it may reduce output quality for generative tasks. The AISDP documents the temperature setting, the rationale for the chosen value, and the measured impact on output variance and quality.
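As a minimal sketch of why low temperature produces more deterministic behaviour (this is an illustrative implementation of temperature-scaled softmax, not code from the AISDP), the function below shows how dividing logits by the temperature concentrates probability mass on the highest-scoring token, with temperature zero treated as greedy argmax:

```python
import math

def softmax_with_temperature(logits, temperature):
    """Scale logits by 1/temperature before the softmax; lower temperature
    concentrates probability mass on the highest-scoring token."""
    # Temperature 0 would divide by zero, so treat it as greedy argmax.
    if temperature <= 0:
        probs = [0.0] * len(logits)
        probs[logits.index(max(logits))] = 1.0
        return probs
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

logits = [2.0, 1.0, 0.5]
print(softmax_with_temperature(logits, 1.0))  # broad distribution
print(softmax_with_temperature(logits, 0.1))  # near-deterministic
```

Measuring the spread of these distributions across a representative prompt set is one way to produce the variance-versus-quality evidence the AISDP documents.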

For record-keeping under Article 12, each inference must be logged with sufficient context to enable reconstruction. Unlike deterministic models where the input and model version are sufficient to reproduce the output, stochastic models require logging the actual output alongside the input, the model version, and any runtime parameters (temperature, top-p, random seed if applicable). This ensures that every decision can be examined after the fact, even though it cannot be reproduced deterministically.
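A sketch of such a log record follows, assuming a hypothetical `build_inference_record` helper (the field names and model version string are illustrative, not prescribed by the source): the actual output is stored verbatim alongside the input, model version, and runtime sampling parameters, since the output cannot be re-derived from the input alone.

```python
import datetime
import json
import uuid

def build_inference_record(prompt, output, model_version, params):
    """Assemble one audit-log entry with everything needed to examine the
    decision after the fact: input, actual output, model version, and the
    runtime sampling parameters in effect."""
    return {
        "record_id": str(uuid.uuid4()),
        "timestamp": datetime.datetime.now(datetime.timezone.utc).isoformat(),
        "model_version": model_version,
        "input": prompt,
        "output": output,           # logged verbatim: it cannot be reproduced
        "runtime_params": params,   # e.g. temperature, top_p, seed if any
    }

record = build_inference_record(
    prompt="Summarise the claim history.",
    output="The claimant filed twice in 2023.",
    model_version="example-llm-1.3.0",
    params={"temperature": 0.2, "top_p": 0.9, "seed": None},
)
print(json.dumps(record, indent=2))
```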

Seed fixing supports reproducibility in testing and evaluation environments. By fixing the random seed, the organisation can reproduce specific outputs for validation, debugging, and conformity assessment. The AISDP documents the seed management approach and clarifies that production inference may not use fixed seeds (to avoid gaming or predictability risks), while evaluation and testing environments do.
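The evaluation-versus-production distinction above can be sketched with an isolated random generator (a toy sampler, assuming nothing about any particular LLM runtime): passing an explicit seed reproduces the exact draw, as in fixed-seed evaluation runs, while omitting it mirrors unseeded production inference.

```python
import random

def sample_tokens(vocab, n, seed=None):
    """Draw n tokens from vocab; an explicit seed makes the draw
    reproducible, as in fixed-seed evaluation environments."""
    rng = random.Random(seed)  # isolated RNG: global state is untouched
    return [rng.choice(vocab) for _ in range(n)]

vocab = ["alpha", "beta", "gamma", "delta"]

# Evaluation: the same seed reproduces the exact sequence.
assert sample_tokens(vocab, 5, seed=42) == sample_tokens(vocab, 5, seed=42)

# Production: no seed, so repeated calls may legitimately differ.
print(sample_tokens(vocab, 5))
```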

Output logging must capture not merely the final output but any intermediate reasoning steps, chain-of-thought outputs, or retrieval context that contributed to the response. For RAG-based systems, the retrieved documents and their relevance scores form part of the audit trail. The logging infrastructure must handle the substantially larger payload sizes that LLM outputs generate compared to traditional model predictions.
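For the RAG case, the audit record can be sketched as follows (a hypothetical `build_rag_audit_record` helper with illustrative field names; the source prescribes only that retrieved documents and their relevance scores form part of the trail):

```python
import json

def build_rag_audit_record(query, retrieved, answer):
    """Extend the inference log with retrieval context: each retrieved
    chunk and its relevance score are part of the audit trail."""
    return {
        "query": query,
        "retrieval_context": [
            {"doc_id": doc_id, "score": score, "excerpt": text}
            for doc_id, score, text in retrieved
        ],
        "answer": answer,
    }

retrieved = [
    ("policy-007", 0.91, "Claims must be filed within 30 days."),
    ("faq-112", 0.64, "Late filings may be accepted on review."),
]
record = build_rag_audit_record(
    "What is the filing deadline?",
    retrieved,
    "Claims must be filed within 30 days.",
)
# Serialised records like this are far larger than a bare prediction,
# which is the payload-sizing concern noted above.
print(json.dumps(record, indent=2))
```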

Key outputs

  • Stochastic output handling specification
  • Temperature and sampling parameter documentation
  • Logging architecture for LLM inference (AISDP Module 10)