this post was submitted on 30 Jan 2025
1031 points (96.9% liked)

[–] [email protected] 17 points 10 hours ago (1 children)

What technology does Google make that can be used for that?

[–] finder585 26 points 10 hours ago (3 children)

https://www.jpost.com/international/article-838681

TL;DR, direct from the article:

Israel has been integrating AI into the military for several years, using the technology to process surveillance footage.

The IDF claims:

“If anything, these tools have minimized collateral damage and raised the accuracy of the human-led process.”

So, take it with a grain of salt.

[–] Keeponstalin 2 points 2 hours ago* (last edited 1 hour ago)

Here's more info on the AI systems Israel uses:

According to intelligence sources, Habsora generates, among other things, automatic recommendations for attacking private residences where people suspected of being Hamas or Islamic Jihad operatives live. Israel then carries out large-scale assassination operations through the heavy shelling of these residential homes.

For example, sources explained that the Lavender machine sometimes mistakenly flagged individuals who had communication patterns similar to known Hamas or PIJ operatives — including police and civil defense workers, militants’ relatives, residents who happened to have a name and nickname identical to that of an operative, and Gazans who used a device that once belonged to a Hamas operative.

An AI system known as “Where’s Daddy?” tracked Palestinians on the kill list and was purposely designed to help Israel target individuals when they were at home at night with their families. The targeting systems, combined with an “extremely permissive” bombing policy in the Israeli military, led to “entire Palestinian families being wiped out inside their houses,” says Yuval Abraham, an Israeli journalist who broke the story after speaking with members of the Israeli military who were “shocked by committing atrocities.”

[–] [email protected] 31 points 9 hours ago (2 children)

Which in a way makes their killing of children even more intentional.

[–] [email protected] 10 points 6 hours ago (1 children)

Seriously, Gaza is practically leveled and they're bragging about accuracy!

[–] drivepiler 3 points 5 hours ago

They very accurately targeted anything and everything in Gaza.

[–] Duamerthrax 5 points 6 hours ago

It's absolutely intentional. There is no collateral damage because it's all intentional damage.
