this post was submitted on 01 Aug 2023
524 points (82.1% liked)

Technology


An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.

Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

[–] sirswizzlestix 39 points 2 years ago (23 children)

These biases have always existed in the training data used for ML models (society influences the data we collect, and its biases are latent within it), but it's definitely interesting that generative models now make these biases much more visible (figuratively and literally, in the case of image models) to the layperson

[–] SinningStromgald 5 points 2 years ago (22 children)

But they know the AIs have these biases now, at least. Shouldn't they be able to code them out, or lessen them? Or would that just create more problems?

Sorry, I'm no programmer, so I have no idea if that's even possible or not. Just sounds possible in my head.

[–] CharlestonChewbacca 12 points 2 years ago (1 children)

That's not how it works. You don't just "program out" the biases; you have to retrain the model with more inclusive training data.
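(One common form "more inclusive training data" takes in practice is rebalancing what the model sees during retraining. A minimal sketch, assuming each example carries a group label, of inverse-frequency sample weights so every group contributes equally to a rebalanced draw; the function name and toy labels are illustrative, not from any particular library:)

```python
from collections import Counter

def balanced_sample_weights(labels):
    """Weight each example inversely to its group's frequency, so a
    weighted sampler draws every group equally often during retraining."""
    counts = Counter(labels)
    n_groups = len(counts)
    total = len(labels)
    # total / (n_groups * group_count): weights sum to ~total overall,
    # and each group's weights sum to the same share.
    return [total / (n_groups * counts[y]) for y in labels]

# Toy skewed dataset: 8 examples from group "a", 2 from group "b".
labels = ["a"] * 8 + ["b"] * 2
weights = balanced_sample_weights(labels)
# Each "a" example gets 0.625, each "b" gets 2.5;
# both groups now account for 5.0 of the total weight of 10.0.
```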

[–] [email protected] 2 points 2 years ago (1 children)

Even then, it will always move toward the "average" of all the combined training data, unless prompted otherwise.
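(The "average" intuition can be made concrete with a toy numeric example; the scalar stand-in for image features is purely illustrative:)

```python
# Toy stand-in: one scalar "feature" per training image. 80% of the
# data sits at 1.0 (majority group), 20% at 0.0 (minority group).
data = [1.0] * 8 + [0.0] * 2
average = sum(data) / len(data)
# The average lands at 0.8, much closer to the majority group's value,
# which is roughly what an unconditioned generator tends to reproduce.
```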

[–] CharlestonChewbacca -2 points 2 years ago

No, that's not how it works either.
