this post was submitted on 03 Dec 2024
696 points (98.7% liked)

Microblog Memes


A place to share screenshots of microblog posts, whether from Mastodon, Tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerilla marketing.
  4. Posters are encouraged to link to the toot or tweet etc in the description of posts.

[–] spankmonkey 40 points 1 week ago (2 children)

It isn't predicting individual crimes; it's just pattern recognition and extrapolation, like how the weather is predicted.

"There are on average 4 shootings in November in this general area so there probably will be 4 again this year." is the kind of prediction that AI is making.

[–] Death_Equity 17 points 1 week ago (1 children)

Predictions with hallucinations are exactly how an effective justice system works.

[–] rottingleaf -1 points 1 week ago (1 children)

That's also how communist attempts at building a better civilization work. They can't escape the foundation of their ideology, where one human classification of reality takes precedence over life. So they have plans: a plan for steel production, a plan for grain production, a plan for convicted criminals.

Falling behind the plan? Someone has to be arrested. Someone. There's a weird teen walking by, so let's tie him to a radiator and beat him until he signs a paper saying he stole some shit.

The plan is already overshot? Then nobody will bother even if there's a gang rape and murder right in front of the police station, with some of the policemen participating.

What I don't understand is why people want to do that again, just with clueless planners (lacking the necessary information) replaced by machines that are clueless for the same reason.

Even the USSR's problems with planning were mostly not due to insufficient computational resources (people today assume those were miserable, but remember that they were programmed by better and more qualified people than most of today's programmers), but due to the power balance in the hierarchy, which meant planning was bent to the wishes of power. In other words, plans were made to show what people in important posts wanted to see, and didn't account for what people in other important posts didn't want to share. It will be just like that with any system. Tech doesn't solve power balance by itself.

[–] [email protected] 2 points 1 week ago (2 children)

So they are using it to try to decide on deployment?

If that is all they're using it for, I guess it isn't too bad. As long as it isn't accusing individuals of planning to commit a crime with zero evidence.

[–] spankmonkey 6 points 1 week ago* (last edited 1 week ago) (1 children)

It is probably going to be used to justify the disproportionate police attention paid to minority communities and to justify activities similar to stop and frisk.

[–] [email protected] 2 points 1 week ago

The thing is, don't they already have crime stats? Presumably they're already using them as justification, so this won't change much.

[–] rottingleaf 2 points 1 week ago

No, it's bad, because ultimately it's not leading anywhere. Such tools can't be used by unqualified people who don't understand how they work (not that many qualified people understand them either; my team lead at work, for example, is enthusiastic and just doesn't seem to hear arguments against it, at least not the ones I can make with my ADHD, which means cutting detailed explanations to the bone).

If ultimately it's not applicable where people want to apply it, it shouldn't even be tested.

This is giving such applications credibility.

It's the slippery slope that some people think doesn't exist. Actually, slippery slopes exist everywhere.