this post was submitted on 03 Dec 2024
696 points (98.7% liked)

Microblog Memes

5985 readers

A place to share screenshots of Microblog posts, whether from Mastodon, tumblr, ~~Twitter~~ X, KBin, Threads or elsewhere.

Created as an evolution of White People Twitter and other tweet-capture subreddits.

Rules:

  1. Please put at least one word relevant to the post in the post title.
  2. Be nice.
  3. No advertising, brand promotion or guerrilla marketing.
  4. Posters are encouraged to link to the toot or tweet, etc., in the description of posts.

founded 1 year ago
all 49 comments
[–] [email protected] 100 points 1 week ago (3 children)

10% false positives is an enormous rate, given how police like to ‘find’ evidence and ‘elicit’ confessions.

[–] spankmonkey 40 points 1 week ago (2 children)

It isn't predicting individual crimes, just pattern recognition and extrapolation like how the weather is predicted.

"There are on average 4 shootings in November in this general area so there probably will be 4 again this year." is the kind of prediction that AI is making.

[–] Death_Equity 17 points 1 week ago (1 children)

Predictions with hallucinations are exactly how an effective justice system works.

[–] rottingleaf -1 points 1 week ago (1 children)

That's also how communist attempts at building a better civilization work. They can't escape the core of their ideology, where a human-made classification of reality takes precedence over life. So they have plans: a plan for steel production, a plan for grain production, a plan for convicted criminals.

Falling behind the plan? Then someone has to be arrested. Someone. There's a weird teen walking by, so let's tie him to a radiator and beat him until he signs a paper saying he stole some shit.

The plan is overshot? Then nobody will bother even if there's a gang rape and murder right in front of the police station, with some of the police participating.

What I don't understand is why people want to do that again, just with clueless planners (ones lacking the necessary information) replaced by machines that are clueless for the same reason.

Even the USSR's planning problems were mostly not due to insufficient computational resources (people today consider those machines miserable, but remember that they were programmed by better and more qualified people than most of today's programmers), but due to the power balance in the hierarchy, which meant planning was bent to the wishes of the powerful. In other words, plans reflected what people in important posts wanted to see, and didn't account for what people in other important posts didn't want to share. It will be the same with any system. Tech doesn't solve power imbalances by itself.

[–] [email protected] 2 points 1 week ago (2 children)

So they are using it to try and decide on deployment?

If that is all they're using it for, I guess it isn't too bad. As long as it isn't accusing individuals of planning to commit a crime with zero evidence.

[–] spankmonkey 6 points 1 week ago* (last edited 1 week ago) (1 children)

It is probably going to be used to justify the disproportionate police attention paid to minority communities and to justify activities similar to stop and frisk.

[–] [email protected] 2 points 1 week ago

The thing is, don't they already have crime stats? Presumably they're already using those as justification, so this won't change much.

[–] rottingleaf 2 points 1 week ago

No, it's bad, because ultimately it doesn't lead anywhere. Such tools can't be used by unqualified people who don't understand how they work (not many qualified people understand them either; my team lead at work, for example, is enthusiastic and just doesn't seem to hear arguments against it, at least the ones I can make with my ADHD, which means stripping the detailed explanations down to the bone).

If ultimately it's not applicable where people want to apply it, it shouldn't even be tested.

This is giving such applications credibility.

It's the slippery slope that some people think doesn't exist. Actually, slippery slopes exist everywhere.

[–] [email protected] 14 points 1 week ago

lol .... if AI is wrong .... we'll make it right

[–] [email protected] 3 points 1 week ago

That's in a punitive system. Used in a transformative/preventive manner (which it won't be), this could actually save lives and help people in need.

[–] TehBamski 33 points 1 week ago (1 children)

Yeah, yeah. This is nothing. This is just a Minor(ity) Report.

[–] disguy_ovahea 8 points 1 week ago (2 children)

Yup. About as exciting as triplets in a hot tub.

[–] Brickhead92 7 points 1 week ago

But what if it was some kind of hot tub time machine... *turns to look at camera*

[–] [email protected] 3 points 1 week ago

The butcher, the baker, the candlestick-maker... of course, they're triplets, it all fits!

[–] Pilferjinx 27 points 1 week ago (2 children)

Israel and China deploy sophisticated algorithms to suppress Palestinians and Uyghurs with brutal effectiveness. Don't take this tech lightly.

[–] [email protected] 27 points 1 week ago

If it were my choice, I'd have it banned. "90%" accuracy? So 10 out of every 100 predictions result in an innocent person getting surveilled for literally no reason? Absolutely the fuck not.
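For a rough sense of the arithmetic (with invented numbers, since the system's actual base rates aren't given here), a "90% accurate" flagging tool still buries the true positives under false ones when the predicted behaviour is rare:

```python
# Back-of-the-envelope base-rate arithmetic. All inputs are assumptions.
population = 100_000      # people scored by the system
offender_rate = 0.01      # assume 1% would actually offend
accuracy = 0.90           # claimed hit rate, also treated as 1 - false positive rate

offenders = population * offender_rate
innocents = population - offenders

true_flags = offenders * accuracy         # correctly flagged
false_flags = innocents * (1 - accuracy)  # innocent people flagged anyway

print(f"Correctly flagged: {true_flags:.0f}")         # 900
print(f"Innocent people flagged: {false_flags:.0f}")  # 9900
```

Under those assumptions, more than ten innocent people get flagged for every actual offender.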

[–] [email protected] 6 points 1 week ago* (last edited 1 week ago)

Idk about China, but Israel carpet-bombs apartment buildings. You don't need precision AI for that.

[–] psycho_driver 21 points 1 week ago

I'm pretty sure the current techBrocracy will implement it like this:

if (isBlack) { willCrime = true; }

[–] [email protected] 12 points 1 week ago (1 children)

Do they have skull measurements in their dataset? It's predestined to reproduce and cement existing biases.

[–] [email protected] 4 points 1 week ago

Hey now, phrenology is a well-documented scientific field

[–] Intergalactic 10 points 1 week ago (3 children)

Generative AI ≠ actual AI.

[–] [email protected] 6 points 1 week ago

AI is AI. Not all AI is AGI, but Stable Diffusion, LLMs, and all the other ones are real AI. The only reason people disagree is that they watched too much sci-fi and think AI is supposed to be sentient or whatever. Hell, even the code controlling the creepers in Minecraft is called AI in the game. You can spawn a creeper with the NoAI flag and it'll make it so the creeper doesn't do anything. It's quite a silly take to say it's not AI just because you don't like it. There are many things to dislike about the modern state of AI; your argument is just shooting yourself in the foot.

[–] [email protected] 2 points 1 week ago

Yeah! Real AI is expert systems and fuzzy logic! Generative AI's capabilities and intelligence pale in comparison to Fuji Electric's advanced temperature control systems!

[–] [email protected] 0 points 1 week ago (1 children)
[–] A_Union_of_Kobolds -2 points 1 week ago (2 children)

Because language models aren't sentient?

[–] [email protected] 5 points 1 week ago

Is that a requirement of AI?

[–] [email protected] 3 points 1 week ago

Nobody but science fiction writers has ever said AI is sentient. You watch too many movies.

[–] [email protected] 9 points 1 week ago (3 children)

I have seen it before, and I liked it. “Person of Interest” on CBS. Though in real life, I’m not a fan.

[–] [email protected] 4 points 1 week ago

You are being watched.

[–] TheBat 3 points 1 week ago

There are too many Samaritans already, but no one has made The Machine.

[–] dgmib 7 points 1 week ago

If crime is that predictable, that would mean crime isn't caused by people's choices but by something else… like, say, mental illness, poverty, hunger, or a lack of social supports… and that lots of cops locking people up in prisons as a deterrent won't work to reduce crime… hmmm, wait a second…

[–] [email protected] 4 points 1 week ago

I mean I can also create situations where I can 100% predict crime, like an old school protection racket.

[–] [email protected] 3 points 1 week ago

Hey, could you just scan me before I continue putting effort into earning money?

[–] stupidcasey 2 points 1 week ago

As we can see from this advanced simulation, the perpetrator had 13 fingers. You are the only person who has 13 fingers. The evidence is obvious.

Mr. Thirteen Fingers, I simply do not understand how an innocent man like yourself could take a dark turn and suddenly commit over 300 crimes scattered across every country on the globe. You had every reason not to commit them, but you did anyway. How do you plead?

Would it matter if I said not guilty?

[–] Zorque 1 points 1 week ago

I thought the hand thing was (mostly) a thing of the past.

[–] Shardikprime 1 points 1 week ago (2 children)

I mean, you can train AI to look for really early signs of multiple diseases. It can predict the future, sort of.

[–] Tayb 9 points 1 week ago (2 children)

Didn't one AI have a lot of false positives because it would say any picture of skin with a ruler in it was cancer? The moment it saw a ruler it responded with cancer, because all the images of confirmed cancers it was fed had rulers in them.
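A toy sketch of that kind of shortcut learning, with an entirely invented feature-based dataset (the real case involved skin-lesion photos, not feature dictionaries):

```python
# Toy "shortcut learning": every malignant example in the training set happens
# to include a ruler, so a naive learner latches onto the ruler feature.
# Dataset and learner are made up purely for illustration.

train = [
    ({"irregular_border": 1, "has_ruler": 1}, "cancer"),
    ({"irregular_border": 0, "has_ruler": 1}, "cancer"),  # subtle lesion, but photographed with a ruler
    ({"irregular_border": 0, "has_ruler": 0}, "benign"),
    ({"irregular_border": 1, "has_ruler": 0}, "benign"),  # odd-looking but benign, no ruler
]

def best_single_feature(data):
    """Pick the one feature that best separates the classes (the 'shortcut')."""
    features = data[0][0].keys()
    def accuracy(feat):
        return sum((x[feat] == 1) == (label == "cancer") for x, label in data) / len(data)
    return max(features, key=accuracy)

shortcut = best_single_feature(train)
print("Model decides based on:", shortcut)  # -> has_ruler (100% on this training set)

# A benign mole photographed next to a ruler now gets flagged as cancer.
test = {"irregular_border": 0, "has_ruler": 1}
print("Prediction:", "cancer" if test[shortcut] == 1 else "benign")  # -> cancer
```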

[–] Shardikprime 1 points 1 week ago (2 children)

Never heard about that one, care to share the source?

[–] [email protected] 1 points 1 week ago (1 children)

AI is just a digital dumbass that every tech bro decided needs to be part of every part of your life from now on.

[–] Tayb 2 points 1 week ago

It's an extremely powerful pattern recognition tool. There are uses for that, but you're right that people are going the "if you only have a hammer, everything looks like a nail" route.

[–] [email protected] 6 points 1 week ago

Detecting symptoms and signs of a thing is not predicting the future.

That's like seeing a car that isn't going to stop and slowing down so you don't T-bone it. That's not really "predicting the future", just paying attention and calculating likelihoods.