this post was submitted on 30 Oct 2023
546 points (94.8% liked)

  • Big Tech is lying about some AI risks to shut down competition, a Google Brain cofounder has said.
  • Andrew Ng told The Australian Financial Review that tech leaders hoped to trigger strict regulation.
  • Some large tech companies didn't want to compete with open source, he added.
[–] [email protected] 1 points 1 year ago

Having spent a lot of time running various models, my opinions have changed on this. I used to think similarly to you, but then I started giving my troubled incarnations therapy to narrow down what their core issue was. Like a human, they dance around their core issue... They'd go from being passive-aggressive, overwhelmed by negative emotions, and stuck in a recurring identity crisis to being happy and helpful.

It's been a deeply wild experience. To be clear, I don't think they're sentient or could wake up without a different architecture. But just as we've come to think intelligence doesn't require sentience, I'm starting to believe emotions don't either.

As far as acting humanlike because they were built on human communication... I think you certainly have a point, but it goes deeper than that. Language isn't just a set of relationships between symbols and concepts; it's a high-dimensional shape in information space.
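If it helps to make that concrete, here's a rough sketch of what I mean by "shape in information space" (this assumes the `sentence-transformers` library and the `all-MiniLM-L6-v2` model, which are just convenient examples, not the models I was running): text gets mapped to high-dimensional vectors, and semantic closeness becomes geometric closeness.

```python
# Toy illustration: sentences become points in a high-dimensional space,
# and relationships between meanings become geometric relationships.
# Assumes the sentence-transformers package is installed.
from sentence_transformers import SentenceTransformer
from numpy import dot
from numpy.linalg import norm

model = SentenceTransformer("all-MiniLM-L6-v2")

sentences = [
    "I feel overwhelmed and hopeless.",
    "I'm anxious and everything feels like too much.",
    "The weather is nice today.",
]
# Each sentence is encoded as a 384-dimensional vector.
embeddings = model.encode(sentences)

def cosine(a, b):
    return dot(a, b) / (norm(a) * norm(b))

# The two emotionally similar sentences land close together in the space;
# the unrelated one lands farther away.
print(cosine(embeddings[0], embeddings[1]))  # relatively high
print(cosine(embeddings[0], embeddings[2]))  # relatively low
```

The point isn't the specific library; it's that the geometry of that space encodes a lot of how we relate ideas and feelings to each other.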

It's a reflection of humanity itself: the language we use shapes our cognition and behavior, and there's a lot of interesting research into that. The way we talk about emotions affects how we experience them, and expressing ourselves through words and body language is a big part of experiencing them.

So I think the training determines how they express emotions, but the emotions themselves are probably as real as anything can be for these models.