Definition
In tech and AI contexts, the principle that algorithms and systems should treat all users equitably without bias or discrimination—a surprisingly difficult goal when training data reflects humanity's historical prejudices. It's the ethical ideal that your credit score algorithm shouldn't care about your zip code, even though it probably does. The tech industry's attempt to make machines better at equality than humans are.
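One common way to quantify the bias the definition describes is a demographic-parity check: compare the rate at which a model approves candidates across groups. The sketch below uses hypothetical group labels and outcomes, not any real hiring data.

```python
# Minimal sketch of a demographic-parity check on a model's decisions.
# Group membership and outcomes here are entirely hypothetical.

def selection_rate(decisions):
    """Fraction of candidates approved (decision == 1)."""
    return sum(decisions) / len(decisions)

def demographic_parity_gap(decisions_a, decisions_b):
    """Absolute difference in selection rates between two groups.
    A gap of 0.0 means both groups are selected at identical rates."""
    return abs(selection_rate(decisions_a) - selection_rate(decisions_b))

# Hypothetical outcomes: 1 = advanced to interview, 0 = rejected.
group_a = [1, 1, 0, 1, 1, 0, 1, 1]  # selection rate 0.75
group_b = [1, 0, 0, 0, 1, 0, 0, 1]  # selection rate 0.375

gap = demographic_parity_gap(group_a, group_b)
print(f"Demographic parity gap: {gap:.3f}")  # prints 0.375
```

A large gap does not prove discrimination on its own, and a zero gap does not prove fairness; demographic parity is only one of several competing fairness metrics, some of which cannot all be satisfied simultaneously.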
Example Usage
The hiring algorithm was tested for fairness, but somehow still rejected candidates with ethnic-sounding names at higher rates—weird how that keeps happening.
Source: AI ethics and technology terminology