this post was submitted on 24 Aug 2023
445 points (88.3% liked)

Google's AI-driven Search Generative Experience has been generating results that are downright weird and evil, e.g., touting slavery's positives.

[–] scarabic 49 points 1 year ago (3 children)

If it’s only as good as the data it’s trained on, garbage in / garbage out, then in my opinion it’s “machine learning,” not “artificial intelligence.”

Intelligence has to include some critical, discriminating faculty. Not just pattern-matching vomit.
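
To make the garbage in / garbage out point concrete, here's a minimal sketch (made-up data and names, nothing to do with Google's actual system): a toy model that only tallies word/label co-occurrences in its training set can only echo whatever skew that data contains.

```python
# Minimal sketch of "garbage in / garbage out": a toy model that only
# tallies word/label co-occurrences, so its answers can only mirror
# whatever skew its (hypothetical) training data contains.
from collections import Counter, defaultdict

def train(examples):
    """examples: list of (text, label) pairs."""
    counts = defaultdict(Counter)
    for text, label in examples:
        for word in text.lower().split():
            counts[word][label] += 1
    return counts

def predict(counts, text):
    """Vote with the label counts of every known word in the query."""
    votes = Counter()
    for word in text.lower().split():
        votes.update(counts.get(word, Counter()))
    return votes.most_common(1)[0][0] if votes else "unknown"

# Deliberately skewed, made-up training set.
biased_data = [
    ("slavery had economic benefits", "positive"),
    ("slavery built great fortunes", "positive"),
    ("slavery was a moral catastrophe", "negative"),
]

model = train(biased_data)
# No critical faculty here: the output just echoes the skew it was fed.
print(predict(model, "is slavery good"))  # -> "positive"
```

A real LLM is vastly more sophisticated than this, but the dependence on its training data is the same: without some discriminating faculty on top, biased input yields biased output.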

[–] samus12345 18 points 1 year ago* (last edited 1 year ago) (2 children)

We don't yet have the technology to create actual artificial intelligence. It's an annoyingly pervasive misnomer.

[–] FlyingSquid 7 points 1 year ago (1 children)

And the media isn't helping. The title of the article is "Google’s Search AI Says Slavery Was Good, Actually." It should be "Google’s Search LLM Says Slavery Was Good, Actually."

[–] samus12345 9 points 1 year ago

Yup, "AI" is the current buzzword.

[–] [email protected] 3 points 1 year ago

Hey, just like blockchain tech!

[–] profdc9 9 points 1 year ago (1 children)

Unfortunately, people who grow up in racist groups also tend to be racist. Slavery used to be considered normal and justified for various reasons. For many, killing someone whose religion or beliefs differ from their own is OK. I am not advocating for moral relativism, just pointing out that a computer learns what is or is not moral in the same way that humans do: from other humans.

[–] scarabic 5 points 1 year ago (1 children)

You make a good point. Though humans at least sometimes do some critical thinking between absorbing something and acting on it.

[–] Daft_ish 2 points 1 year ago

Not enough. Not enough.

[–] Kahlenar 6 points 1 year ago

Scathing, and just as accurate when your point is applied to people, too.