this post was submitted on 20 Nov 2024
51 points (90.5% liked)

Technology

 

Israeli forces are using an AI weapons system in Gaza co-produced by an Indian defence company that turns machine guns and assault rifles into computerised killing machines, Middle East Eye can reveal.

According to documents and news reports seen by MEE, Israeli forces have been using the Arbel weapons system in Gaza following their devastating invasion of the enclave after the 7 October attacks on southern Israel.

Touted as a "revolutionary game changer that improves operator lethality and survivability," the Arbel system turns machine guns and assault rifles - such as the Israeli-produced Tavor, Carmel and Negev - into weapons that use algorithms to boost soldiers' chances of hitting targets accurately and efficiently.

Although defence analysts say the system may not be as cutting-edge or as widely used as the "Lavender" or "The Gospel" AI systems - which are reported to have played a major role in the enormous death toll in Gaza - Arbel appears to be the first weapons system to directly tie India to Israel's rapidly expanding AI war in Gaza, with potentially wide-ranging implications for other conflicts.

[–] [email protected] 18 points 20 hours ago (3 children)

Am I reading the techno-babble accurately?

You would muzzle-sweep your target with the trigger pressed, and it would fire only when your gun is actually aimed for the most probable strike. If you're off target, it doesn't fire: bullets saved, and probably better accuracy too, because the shooter isn't dealing with as much recoil.
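If I've read it right, the gating logic would be something like this rough sketch - every name, number and threshold here is invented for illustration, and has nothing to do with how Arbel is actually implemented:

```python
# Hypothetical sketch of trigger gating as described above: the shooter
# holds the trigger, and the system releases a round only when the
# predicted point of impact is close enough to the target.

def should_release_round(trigger_held: bool,
                         aim_offset_mrad: float,
                         target_size_mrad: float,
                         hit_threshold: float = 0.9) -> bool:
    """Fire only while the trigger is held AND the barrel's predicted
    point of impact falls within the target, with some margin."""
    if not trigger_held:
        return False
    # Crude hit estimate: 1.0 when perfectly centered, falling off
    # linearly as the aim drifts past the target's angular size.
    hit_estimate = max(0.0, 1.0 - aim_offset_mrad / target_size_mrad)
    return hit_estimate >= hit_threshold

# While the shooter sweeps across the target with the trigger pressed,
# the system withholds rounds until the aim error is small enough.
print(should_release_round(True, 5.0, 2.0))   # far off target -> False
print(should_release_round(True, 0.1, 2.0))   # centered -> True
```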

So even less training needed and even further removed from human decision making. Soldiers didn't murder that unarmed civilian, AI did.

[–] agelord 5 points 16 hours ago (1 children)

The AI didn't press the trigger, soldiers did.

[–] NeoNachtwaechter 11 points 20 hours ago (2 children)

Soldiers didn't murder that unarmed civilian, AI did.

Soldier did the (rough) aiming.
Soldier pulled the trigger.

Still hard to blame AI for it, don't you think?

[–] [email protected] 9 points 19 hours ago

For reasonable people. For others, it's an avenue to get away with war crimes.

[–] just_another_person 7 points 20 hours ago

Yes, you're relying on an offline inference device to make trigger choices. Basically "if brown, shoot" from what I gather.