this post was submitted on 01 Aug 2023
524 points (82.1% liked)

Technology

An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.::Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

(page 2) 50 comments
[–] [email protected] 7 points 2 years ago (5 children)

Rage bait to push the ethics in AI narrative

[–] [email protected] 6 points 2 years ago

It reminds me of Google back in the day (probably early 2010s). If you searched for White Women, it returned professional and respectable images. But if you searched for Black Women, it returned explicit images.

Machine learning algorithms are like sponges and learn from existing social biases.

[–] LEDZeppelin 5 points 2 years ago* (last edited 2 years ago)
[–] EmotionalMango22 5 points 2 years ago

So? There are white people in the world. Ten bucks says she tuned it to make her look white for the clicks. I've seen this in person several times at my local college. People die for attention, and shit like this is an easy-in.

[–] [email protected] 4 points 2 years ago

That's funny!

[–] [email protected] 4 points 2 years ago* (last edited 2 years ago)

Like some have already said here: it's commentary on what Anglo-centric societies viewed as "professional" at the time the model was trained. Why Anglo-centric? Because the US is the center of internet activity.

[–] [email protected] 3 points 2 years ago

Disappointing but not surprising. The world is full of racial bias, and people don't do a good job at all addressing this in their training data. If bias is what you're showing the model, that's exactly what it'll learn, too.

[–] [email protected] 3 points 2 years ago (1 children)

I wouldn't say "closer to Caucasian". She straight-up turned white.
