[–] FlyingSquid 3 points 11 months ago (1 children)

Israeli general: Captain, were you responsible for reprogramming the drones to bomb those ambulances?

Israeli captain: Yes, sir! Sorry, sir!

Israeli general: Captain, you're just the sort of man we need in this army.

[–] [email protected] 0 points 11 months ago* (last edited 11 months ago) (1 children)

Ah, evil people exist and therefore we should never develop technology that evil people could use for evil. Right.

[–] FlyingSquid 3 points 11 months ago (1 children)

Seems to me like a good reason not to develop a technology. See also: biological weapons.

[–] [email protected] 0 points 11 months ago (2 children)

Those weapons come out of developments in medicine. Technology itself is not good or evil; it can be used for good or for evil. If you decide not to develop a technology, you're depriving people of its good uses as well. My earlier point was to show that there are good uses for these things.

[–] FlyingSquid 3 points 11 months ago

Hmm... so maybe we keep developing medicine but not as a weapon, and we keep developing AI but not as a weapon.

Or can you explain why one should be restricted from weapons development and not the other?

[–] [email protected] 1 points 11 months ago

I disagree with your premise here. Taking a life is a serious step. A machine that unilaterally decides to kill some people with no recourse to human input has no good application.

It's like inventing a new biological weapon.

By not creating it, you are not depriving any decent person of anything that is actually good.