
Google apologizes for ‘missing the mark’ after Gemini generated racially diverse Nazis

Google says it’s aware of historically inaccurate results for its Gemini AI image generator, following criticism that it depicted historically white groups as people of color.

OhmsLawn · 8 points · 9 months ago

It's really a failure of one-size-fits-all AI. There are plenty of non-diverse models out there, but Google has to find a single solution that always returns diverse college students yet never diverse Nazis.

If I were to use A1111 (the Automatic1111 Stable Diffusion web UI) to make brown Nazis, it would be my own fault. If I use Google, the fault is rightfully theirs.

fidodo · 1 point · 9 months ago

A solution is going to take time. Software is made more robust by finding and fixing edge cases, and there's a lot of that work left to do in AI. It's impossible to fix them all, but it can be made better. The end result will probably be a patchwork solution, something like the sketch below.
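
A minimal sketch of what that patchwork might look like, assuming a prompt-rewriting layer sits in front of the image model. None of this is Google's actual code; the names, denylist, and modifier text are all invented for illustration:

```python
# Hypothetical patchwork guardrail: every "diverse Nazis"-style failure
# found in the wild adds another entry to the denylist.

HISTORICAL_DENYLIST = {"nazi", "viking", "pope", "founding father"}

def should_inject_diversity(prompt: str) -> bool:
    """Skip the diversity modifier when a known historical edge case
    appears in the prompt; otherwise apply it as usual."""
    lowered = prompt.lower()
    return not any(term in lowered for term in HISTORICAL_DENYLIST)

def rewrite(prompt: str) -> str:
    if should_inject_diversity(prompt):
        return prompt + ", showing people of diverse ethnicities and genders"
    return prompt

print(rewrite("college students studying on a lawn"))  # modifier added, as intended
print(rewrite("German soldier in 1943"))               # miss: not on the denylist yet
```

The second call shows why it's a patchwork: each miss means another denylist entry, so a rules-based patch like this can only ever approximate the right behavior.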

PopcornTin · 0 points · 9 months ago

The issue seems to be that the underlying code tells the AI that if a prompt or data set skews too heavily toward white people or men (Nazis, ancient Vikings, popes, Rockwell paintings, etc.), it should make the people it generates diverse in race and gender.
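
To make that concrete, here's a guess at the pattern being described: a blanket rewrite applied before the prompt reaches the image model, with no check for historical context. The real pipeline isn't public, so every name and heuristic below is an assumption:

```python
# Hypothetical illustration of context-blind prompt injection. The actual
# Gemini pipeline is not public; this is a guess at the general pattern.

PEOPLE_TERMS = ("person", "people", "man", "woman", "soldier", "group", "crowd")
DIVERSITY_SUFFIX = ", depicting people of diverse ethnicities and genders"

def mentions_people(prompt: str) -> bool:
    # Crude stand-in for whatever classifier the real system might use.
    lowered = prompt.lower()
    return any(term in lowered for term in PEOPLE_TERMS)

def rewrite_prompt(user_prompt: str) -> str:
    # Applied unconditionally to any people-related prompt, so a request
    # for "a 1943 German soldier" gets the same treatment as "a doctor".
    if mentions_people(user_prompt):
        return user_prompt + DIVERSITY_SUFFIX
    return user_prompt

print(rewrite_prompt("a group of 1940s German soldiers"))
```

If the rewrite is unconditional like this, historically inaccurate output isn't a rare glitch; it's the designed behavior colliding with prompts the designers didn't anticipate.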

What do we want from these AIs? The facts, even if they might be offensive? Or the facts as we wish they were, for a nicer world?