this post was submitted on 03 Dec 2024
599 points (98.9% liked)

Microblog Memes

[–] [email protected] 91 points 1 day ago (3 children)

A 10% false-positive rate is enormous, given how police like to 'find' evidence and 'elicit' confessions.
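To see why a 10% false-positive rate is so large when the predicted event is rare, here is a minimal sketch of the base-rate arithmetic. All numbers (population size, 1% base rate, 90% detection rate) are made up for illustration:

```python
# Illustration (hypothetical numbers) of the base-rate problem:
# even a "good" system flags mostly innocents when the event is rare.
population = 10_000
base_rate = 0.01           # assume 1% of cases are real
false_positive_rate = 0.10 # the 10% figure from the comment
true_positive_rate = 0.90  # assume 90% of real cases get caught

true_positives = population * base_rate * true_positive_rate            # 90
false_positives = population * (1 - base_rate) * false_positive_rate    # 990

# Fraction of all flags that are wrong:
wrong_fraction = false_positives / (true_positives + false_positives)
print(round(wrong_fraction, 3))  # 0.917 -> about 92% of flags are false
```

Under these assumed numbers, roughly nine out of ten people flagged would be innocent.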

[–] spankmonkey 36 points 1 day ago (2 children)

It isn't predicting individual crimes, just doing pattern recognition and extrapolation, like how the weather is predicted.

"There are on average 4 shootings in November in this general area so there probably will be 4 again this year." is the kind of prediction that AI is making.
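The kind of extrapolation described above can be sketched in a few lines: forecast next period's count as the average of past periods. The yearly counts below are invented for illustration:

```python
# Minimal sketch of count-based extrapolation: predict next November's
# incidents in an area as the mean of previous Novembers.
# These counts are hypothetical, not real data.
november_shootings = {2021: 5, 2022: 3, 2023: 4}

def predict_next(history: dict[int, int]) -> float:
    """Forecast the next period's count as the mean of past counts."""
    return sum(history.values()) / len(history)

print(predict_next(november_shootings))  # 4.0
```

This is the weather-forecast analogy in the comment: a statement about aggregate rates in an area, not about any individual.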

[–] Death_Equity 14 points 23 hours ago

Predictions with hallucinations is exactly how an effective justice system works.

[–] [email protected] 2 points 16 hours ago (1 children)

So they are using it to try to decide on deployment?

If that's all they're using it for, I guess it isn't too bad, as long as it isn't accusing individuals of planning to commit a crime with zero evidence.

[–] spankmonkey 5 points 13 hours ago* (last edited 13 hours ago) (1 children)

It is probably going to be used to justify the disproportionate police attention paid to minority communities and to justify activities similar to stop and frisk.

[–] [email protected] 2 points 4 hours ago

The thing is, don't they already have crime stats? Presumably they're already using them as justification, so this won't change much.

[–] [email protected] 13 points 1 day ago

lol .... if AI is wrong .... we'll make it right

[–] [email protected] 4 points 1 day ago

That's in a punitive system. Used in a transformative/preventive manner (which it won't be), this could actually save lives and help people in need.