this post was submitted on 01 Aug 2023
525 points (82.1% liked)

Technology


An Asian MIT student asked AI to turn an image of her into a professional headshot. It made her white with lighter skin and blue eyes.::Rona Wang, a 24-year-old MIT student, was experimenting with the AI image creator Playground AI to create a professional LinkedIn photo.

[–] [email protected] 10 points 1 year ago* (last edited 1 year ago)

Honestly, it's just not being used correctly; I believe this is user error.

These AI image generators rely on the base models they were trained on, which more than likely were fed far more images of Caucasians than of anyone else. You can add weights to your prompts to steer toward what you'd rather see, so while I'm not experienced with the exact program she used, the basics should be the same.

You usually have two sections: the main prompt (positive additions) and a secondary prompt for negatives, things you don't want to see. An example prompt could be "perfect headshot for LinkedIn using supplied image, ((Asian:1.2))", with a negative prompt like "((Caucasian)), blue eyes, blonde, bad eyes, bad face", etc.
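For anyone unfamiliar with that syntax: in Automatic1111-style prompts, each pair of parentheses multiplies a term's emphasis by 1.1, and "(term:1.2)" sets an explicit weight. A minimal sketch of how such tokens could be parsed (illustrative only, not the actual webui parser, and it only handles a single token, not a full prompt):

```python
import re

def prompt_weight(token: str) -> tuple[str, float]:
    """Toy parser for attention-weighted prompt tokens.

    "(text)"        -> emphasis multiplied by 1.1 per paren pair
    "(text:1.2)"    -> explicit weight 1.2
    "plain text"    -> default weight 1.0
    """
    depth = 0
    # Strip matched outer parentheses, counting nesting depth.
    while token.startswith("(") and token.endswith(")"):
        token = token[1:-1]
        depth += 1
    # An explicit ":weight" suffix overrides paren-based emphasis.
    m = re.fullmatch(r"(.+):([\d.]+)", token)
    if m:
        return m.group(1), float(m.group(2))
    return token, round(1.1 ** depth, 3)

print(prompt_weight("((Asian:1.2))"))  # ('Asian', 1.2)
print(prompt_weight("((Caucasian))"))  # ('Caucasian', 1.21)
print(prompt_weight("blue eyes"))      # ('blue eyes', 1.0)
```

So "((Asian:1.2))" tells the sampler to weight "Asian" at 1.2, while "((Caucasian))" in the negatives pushes that concept away with weight 1.1² ≈ 1.21.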

If she didn't have a secondary prompt for negatives, I could see this being a bit more difficult, but in that case there are better systems to use. If she didn't like the results from the one she tried, instead of jumping to "AI racism!" she could have looked into what other tools exist. Hell, with the model I use in Automatic1111, I have to put "Asian" in my negatives because it defaults to that often.

Edit: figures I wrote all this, then scrolled down and noticed all the comments saying the same thing, lol. At least we're on the same page.