Historical Bias
Bias that exists in the real world and is reflected in training data, where past societal inequities are encoded and potentially amplified by AI models.
In Plain Language
When real-world unfairness from the past is baked into your data. If historical hiring data shows men were promoted more often, an AI trained on it will learn to prefer men.
Why This Matters
Historical bias is particularly important for AI governance because even perfectly accurate data can produce unfair AI systems: the data faithfully records an unfair past, and the model learns to reproduce it. Your governance framework should require assessment of whether training data encodes historical inequities.
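To make this concrete, here is a minimal sketch using hypothetical synthetic "historical" hiring records. The dataset, the promotion rates, and the naive frequency-based predictor are all illustrative assumptions, not a real system: equally qualified candidates are recorded, but past decisions favored men, and a model fit to those records reproduces the disparity.

```python
import random

random.seed(0)

# Hypothetical synthetic "historical" hiring records: candidates are
# equally likely to be qualified, but past decisions promoted men at a
# higher rate even at equal qualification (the historical inequity).
def make_history(n=10_000):
    records = []
    for _ in range(n):
        gender = random.choice(["M", "F"])
        qualified = random.random() < 0.5          # same qualification rate
        base = 0.7 if qualified else 0.2           # merit signal
        bias = 0.15 if gender == "M" else -0.15    # historical inequity
        promoted = random.random() < max(0.0, min(1.0, base + bias))
        records.append((gender, qualified, promoted))
    return records

# A naive "model": score candidates by the historical promotion rate of
# their (gender, qualified) group. Trained on accurate records of an
# unfair past, it faithfully encodes the bias.
def fit_rates(records):
    counts, hits = {}, {}
    for g, q, p in records:
        key = (g, q)
        counts[key] = counts.get(key, 0) + 1
        hits[key] = hits.get(key, 0) + int(p)
    return {k: hits[k] / counts[k] for k in counts}

history = make_history()
rates = fit_rates(history)

# Equally qualified men and women receive different predicted scores:
print(f"P(promote | M, qualified) = {rates[('M', True)]:.2f}")
print(f"P(promote | F, qualified) = {rates[('F', True)]:.2f}")
```

The point of the sketch is that nothing in the pipeline is "wrong" in a statistical sense; the model accurately summarizes the data. The unfairness comes entirely from what the data records.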