this post was submitted on 24 Mar 2024
36 points (81.0% liked)
[Outdated, please look at pinned post] Casual Conversation
I'm human. And I care first and foremost about my own kin - other human beings. The "worst crime ever" [with crime = immorality] for me is human suffering, even in contrast with the suffering of other animals.
But even in the case of other animals, I'd probably be more concerned about their well-being than about that of the hypothetical AI.
Even then, it matters somewhat, provided that what the AI is experiencing is relatable to what humans would understand as pain.
Suppose, for the sake of the hypothetical, that we can plug a human brain into the same network and offload a fraction of the consciousness to confirm that the pain is real - not just comparable to human pain, but orders of magnitude greater than anything a human can suffer.
You say you care about other human beings most. So I have two questions for you.
Q1: Which is worse: one person having a fingernail pulled out with a pair of pliers, or a cat being killed with a knife?
Q2: (I'm assuming you answered that killing the cat is worse.) How many people need to lose fingernails before it becomes worse? 10? 100?
A1: If I know neither the person nor the cat, and there's no further unlisted suffering, then pulling the fingernail is worse.
The answer changes based on a few factors, though - for example, I'd put the life of a cat that I know above Hitler's fingernail. And if the critter were another primate, I'd certainly rank its death as worse.
A2: I'll flip the question, since my A1 wasn't what you expected:
I'm not sure of the exact number of cat deaths that, for me, would become worse than pulling off a human's fingernail. But it's probably closer to 100 than to 10 or 1k.
Within the context of your hypothetical AI: note that the cat is still orders of magnitude closer to us than the AI, even if the latter were more intelligent.
Thanks for taking the initiative to flip the question.
The next question is: what metric are you using to determine that 100 cat deaths are roughly equivalent to one person having a fingernail pulled out? Why 100? Why not a million?
Do you think there is an objective formula to determine how much suffering a given act produces?
I'm not following any objective formula, nor am I aware of one. (I would, if I could.) I'm trying to "gauge" it by subjective impact instead.
[I know that this answer is extremely unsatisfactory and I apologise for it.]