this post was submitted on 06 Jul 2023
163 points (98.2% liked)

Technology


New York City businesses that use artificial intelligence to help find hires now have to show the process was free from sexism and racism.

top 17 comments
[–] [email protected] 20 points 2 years ago

Some overworked help desk tech is going to have a rough day trying to explain to their HR manager why this isn't just a yes or no question.

[–] [email protected] 15 points 2 years ago

You definitely don’t even need to include sex or race as an input for the AI to show bias. AI can find other things that tend to show sex or race… perhaps your school, perhaps your address, or perhaps the very style you tend to write in for your cover letter.
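A minimal, hypothetical sketch of that proxy effect (all data invented for illustration): even when a screening model never receives the protected attribute as input, a correlated feature like zip code can reconstruct it, so any model free to use that feature can discriminate on it indirectly.

```python
from collections import Counter, defaultdict

# Toy applicants: (zip_code, protected_group). The screening model would
# only ever see the zip code -- never the group itself.
applicants = [
    ("11201", "A"), ("11201", "A"), ("11201", "B"),
    ("10458", "B"), ("10458", "B"), ("10458", "B"),
    ("11201", "A"), ("10458", "A"), ("10458", "B"),
]

# "Decode" the protected group from zip alone via the majority group per zip.
by_zip = defaultdict(list)
for z, g in applicants:
    by_zip[z].append(g)
decoder = {z: Counter(gs).most_common(1)[0][0] for z, gs in by_zip.items()}

# How often does zip code alone recover the protected group?
correct = sum(decoder[z] == g for z, g in applicants)
print(f"group recovered from zip alone: {correct}/{len(applicants)}")  # 7/9
```

The same reconstruction works for any correlated signal, including schools or writing style.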

[–] [email protected] 8 points 2 years ago* (last edited 2 years ago)

Good fucking luck. Here's a fascinating article about Amazon's attempt to use AI for hiring, which to their credit, they realized was a bad idea and scrapped: https://www.reuters.com/article/us-amazon-com-jobs-automation-insight-idUSKCN1MK08G

In short, it was trained on past hiring data, so it taught itself the sexist hiring preferences made by humans. It absolutely was not designed to be sexist, and I'm sure the devs had good intentions, but it learned to be sexist anyway.

In effect, Amazon’s system taught itself that male candidates were preferable. It penalized resumes that included the word “women’s,” as in “women’s chess club captain.” And it downgraded graduates of two all-women’s colleges, according to people familiar with the matter. They did not specify the names of the schools.

And here's a different but similar AI having some even subtler issues:

[..] The algorithms learned to assign little significance to skills that were common across IT applicants, such as the ability to write various computer codes, the people said.

Instead, the technology favored candidates who described themselves using verbs more commonly found on male engineers’ resumes, such as “executed” and “captured,” one person said.

To be very clear, these issues stem at their root from human biases, so not using an AI is not going to save you from bias; in fact, a purely human process may well be even more biased, since an AI can at least be the work of entire teams doing their best to combat bias. But an AI can still end up discriminating in very subtle and unfair ways, like penalizing certain schools. It can perpetuate past bad behavior and make it harder to improve.
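The dynamic in the quotes above can be sketched with a toy word-weight model (not Amazon's actual system; the resumes and labels are invented): score each word by how often it appears in historically "hired" versus "rejected" resumes. If the historical decisions were biased, the learned weights inherit that bias.

```python
import math
from collections import Counter

# Invented historical data reflecting a biased hiring record.
hired = [
    "executed project captured market chess club captain",
    "executed migration captured requirements",
]
rejected = [
    "women's chess club captain led outreach",
    "women's engineering society president",
]

def word_counts(docs):
    c = Counter()
    for d in docs:
        c.update(d.split())
    return c

h, r = word_counts(hired), word_counts(rejected)
vocab = set(h) | set(r)

# Laplace-smoothed log-odds of a word appearing in a hired resume.
weight = {w: math.log((h[w] + 1) / (r[w] + 1)) for w in vocab}

for w in ("women's", "executed"):
    print(f"{w}: {weight[w]:+.2f}")
```

Nothing in the code mentions gender, yet "women's" ends up with a negative weight and "executed" a positive one, purely because of which resumes humans historically favored.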

Finally, this article is about Amazon noticing these biases and actively trying to correct them. This law is a good thing, because otherwise many companies won't even do that. While still imperfect, Amazon at least played whack-a-mole trying to root out biases (it sounds like they did for a while before giving up); many companies won't even try, so we need laws like this to force them to. Of course, ideally anti-bias laws would also apply to humans, since we are just as vulnerable.

[–] [email protected] 4 points 2 years ago

AI shouldn't be involved in hiring or firing decisions as it can't be held accountable in the same way a human can. Yes, it is more efficient. But equity, not efficiency, should be the goal.

[–] [email protected] 4 points 2 years ago

I wonder what baseline they are comparing against? Current hiring trends, or some fixed percentage? Or maybe testing whether small changes that alter a resume's apparent race change the outcome?
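One plausible answer, hedged: as I understand it, the bias audits under NYC's Local Law 144 center on "impact ratios," where each group's selection rate is divided by the highest group's selection rate, rather than comparison to an external baseline. The four-fifths (0.8) threshold below is EEOC guidance commonly used as a reference point, not part of the law itself, and the numbers are invented for illustration.

```python
# Invented audit numbers: how many applicants from each group the
# screening tool selected, out of how many applied.
selected = {"group_a": 40, "group_b": 24}
applied = {"group_a": 100, "group_b": 100}

# Selection rate per group, then impact ratio vs. the best-off group.
rates = {g: selected[g] / applied[g] for g in selected}
best = max(rates.values())
impact = {g: rates[g] / best for g in rates}

for g, ratio in sorted(impact.items()):
    flag = "ok" if ratio >= 0.8 else "below four-fifths reference point"
    print(f"{g}: impact ratio {ratio:.2f} ({flag})")
```

Here group_b's 24% selection rate against group_a's 40% gives an impact ratio of 0.60, which would be flagged under the four-fifths rule of thumb.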

[–] MajorHavoc 4 points 2 years ago

Narrator: They could not.

[–] [email protected] 2 points 2 years ago (1 children)

Can't prove the human ones were not... do we start a witch hunt for them too?

[–] NOT_RICK 13 points 2 years ago (1 children)

You are aware companies have been caught and punished for racist hiring practices, right? Cracker Barrel is a good example.

[–] [email protected] 4 points 2 years ago

No shit, Sherlock? But we're talking about NYC and making the assumption that code is going to judge people worse than humans do. As a native New Yorker, code developed by an aryan will be less racist than people in general in NYC. There is no default requirement to prove that HR hiring practices are not sexist or racist, and oftentimes they blatantly are. It's only when it gets challenged in court that it's ever noticed.

This is simply NYC trying to siphon money from the tech industry. Not some altruistic attempt to crush racism or sexism.

[–] [email protected] 1 points 2 years ago

Lol, repost of https://lemmy.world/post/1101721 from the same community. Just goes to show you can still have duplicates even if there's only one community.
