Out-of-Distribution Detection
OOD
The ability of an AI system to identify when new inputs significantly differ from the training data distribution, signalling reduced reliability of predictions.
In Plain Language
The AI recognising when it's seeing something very different from what it was trained on, and flagging it. Like a doctor saying, "I've never seen symptoms like this before; let me refer you to a specialist."
Why This Matters
OOD detection is a governance safeguard that prevents AI systems from making confident predictions in situations they are not equipped to handle. Your governance framework should require OOD detection for all high-risk AI deployments.
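One widely used OOD signal is the maximum softmax probability baseline: if the model's top class probability is low, the input is flagged as unfamiliar. The sketch below is a minimal illustration of that idea; the threshold of 0.7 is an illustrative placeholder, and in practice it would be calibrated on held-out in-distribution data.

```python
import numpy as np

def softmax(logits):
    # Numerically stable softmax over the last axis.
    z = logits - logits.max(axis=-1, keepdims=True)
    e = np.exp(z)
    return e / e.sum(axis=-1, keepdims=True)

def is_ood(logits, threshold=0.7):
    # Flag inputs whose maximum softmax probability falls below the
    # threshold as potentially out-of-distribution (low confidence).
    # The threshold is illustrative; calibrate it on validation data.
    confidence = softmax(np.asarray(logits, dtype=float)).max(axis=-1)
    return confidence < threshold

# A peaked output distribution looks in-distribution;
# a nearly flat one (the model is unsure) gets flagged.
print(is_ood([[9.0, 0.1, 0.2], [1.0, 1.1, 0.9]]))  # → [False  True]
```

Confidence thresholds are only one heuristic: models can be confidently wrong on OOD inputs, so production systems often combine this with distance-based or density-based detectors.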
