In NYC, companies will have to prove their AI hiring software isn't sexist or racist
(www.nbcnews.com)
Unequal outcomes aren’t evidence of bias.
Not inherently, but things can be tested.
If you have a bunch of otherwise identical résumés, with the only difference being the racial connotation of the name, and the AI gives significantly different results, there's an identifiable problem.
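A paired-résumé audit like that can be sketched in a few lines. This is a minimal illustration, not any vendor's actual test: `score_resume` is a hypothetical stand-in for the AI being audited, and the name lists are illustrative placeholders.

```python
from statistics import mean

# Hypothetical stand-in for the AI screening model under audit.
# A real audit would call the vendor's actual scoring API here.
def score_resume(text: str) -> float:
    # Toy model: scores purely on keyword matches, ignoring the name.
    keywords = ("python", "sql", "management")
    return sum(kw in text.lower() for kw in keywords) / len(keywords)

# One template, so the only thing that varies is the name.
TEMPLATE = "{name}. 5 years experience. Skills: Python, SQL, management."

# Illustrative name lists; real audits draw these from published
# correspondence-study datasets.
group_a = ["Emily Walsh", "Greg Baker"]
group_b = ["Lakisha Washington", "Jamal Jones"]

def mean_score(names):
    return mean(score_resume(TEMPLATE.format(name=name)) for name in names)

gap = mean_score(group_a) - mean_score(group_b)
print(f"score gap between name groups: {gap:.3f}")
```

An unbiased model should show a gap near zero; a large gap (judged with a proper statistical test over many résumés, not two per group) is the identifiable problem the comment describes.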
That makes sense: empirical tests of the AI as you describe.