this post was submitted on 14 Jun 2023

For example if someone creates something new that is horrible for humans, how will AI understand that it is bad if it doesn't have other horrible things to relate it with?

[–] [email protected] 1 points 1 year ago (1 children)

Then again, most humans' conception of right and wrong depends on context, not on a coherent moral framework.

[–] [email protected] 1 points 1 year ago (1 children)

What does that even mean? Context matters.

[–] [email protected] 2 points 1 year ago* (last edited 1 year ago) (1 children)

I mean that most of the time we act based on what we perceive to be socially acceptable, not by following an ethical law gained through our own experience.
If you move people to a different social environment, they'll adapt to fit in unless actively discouraged.
The social context is the AI prompt.
We rarely decide, make choices, or reflect on anything; we regurgitate our training data based on our prompts.

[–] [email protected] 1 points 1 year ago (1 children)

Excellent, thank you! I’m wondering if something was lost in translation or my interpretation. When I think “context,” I consider something along the lines of: “Water is good.”

Is it good for a person drowning? What if it’s contaminated? What about during a hurricane/typhoon? And so forth.

[–] [email protected] 1 points 1 year ago (1 children)

Yeah, sorry about that; sometimes things that feel evident in my head are anything but once written down.
And translation adds a layer of possible confusion.
I'd rather drown in clean water given a choice.

[–] [email protected] 1 points 1 year ago

No worries, friend! I'm the same way, and when questioned, upon rereading my post, even I wonder what on earth I was thinking when I wrote it!

I hear you. Sadly, we're often not given a choice with regard to water.