Learn Before
What does it mean for an algorithm to be fair?
One sense of fairness might be that the algorithm doesn’t take certain protected information, such as race or gender, into account. Another might be that the algorithm is similarly accurate for different groups. A third might be that the algorithm is calibrated for all groups; in other words, it doesn’t overestimate or underestimate the risk for any group. Interestingly, any approach that hopes to guarantee this last property might have to look at the protected information. So there are clearly cases in which ensuring one notion of fairness comes at the expense of another.
In a recent paper, Professor Jon Kleinberg gave an impossibility theorem for fairness in determining risk scores. He shows that three intuitive notions of fairness cannot all be satisfied at once except in unrealistically constrained cases. So it might not be enough simply to demand that algorithms be fair. We may need to think critically about each problem and determine which notion of fairness is most relevant.
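The tension between calibration and equal error rates can be seen in a tiny numeric sketch. The groups, scores, and base rates below are invented for illustration and are not taken from the paper: each group receives a score exactly equal to its base rate, so the scores are perfectly calibrated within both groups, yet a single decision threshold produces very different false-positive rates.

```python
# Hypothetical illustration: calibrated scores, unequal error rates.
# Each group's score equals its true base rate, so calibration holds,
# but the groups' false-positive rates at a shared threshold diverge.

def false_positive_rate(scores, labels, threshold=0.5):
    """Fraction of true negatives (label 0) flagged as high risk."""
    negatives = [s for s, y in zip(scores, labels) if y == 0]
    if not negatives:
        return 0.0
    return sum(s >= threshold for s in negatives) / len(negatives)

# Group A: base rate 0.8; everyone gets the calibrated score 0.8.
scores_a = [0.8] * 10
labels_a = [1] * 8 + [0] * 2   # 8 of 10 positives -> score 0.8 is calibrated

# Group B: base rate 0.2; everyone gets the calibrated score 0.2.
scores_b = [0.2] * 10
labels_b = [1] * 2 + [0] * 8   # 2 of 10 positives -> score 0.2 is calibrated

fpr_a = false_positive_rate(scores_a, labels_a)  # every negative in A is flagged
fpr_b = false_positive_rate(scores_b, labels_b)  # no negative in B is flagged
print(fpr_a, fpr_b)  # → 1.0 0.0
```

Because the base rates differ, no threshold can equalize the error rates without breaking calibration, which is the intuition behind the impossibility result.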
Tags
Data Science