this post was submitted on 28 Jan 2024
302 points (99.0% liked)

For facial recognition experts and privacy advocates, the East Bay detective’s request, while dystopian, was also entirely predictable. It emphasizes the ways that, without oversight, law enforcement is able to mix and match technologies in unintended ways, using untested algorithms to single out suspects based on unknowable criteria.

[–] [email protected] 108 points 5 months ago* (last edited 5 months ago) (1 children)

Cops only like technology when they can abuse it to avoid having to do real investigative police work.

They don't care to understand the technology in any depth, and as we've seen with body cams, when they retain full control over it, the idea that it could be used to check their behavior is basically a farce.

I mean, on top of that, a lot of "forensic science" isn't science at all and is arguably a joke.

Cops like using the veneer of science and technology to act like they're doing "serious jobs" but in reality they're just a bunch of thugs trying to dominate and control.

In other words, this is just the beginning. Don't expect them to stop doing stuff like this, and further, expect them to start producing "research" that "justifies" these "investigation" methods, and expect to see those methods added to the pile of bullshit that is "fOrEnSiC sCiEnCE."

[–] [email protected] 11 points 5 months ago

TBH, tech companies aren't much different from how you described cops.

They don't usually bother to learn the tech they're using properly, and they take every shortcut possible. You can see this in the current spate of AI startups. Sure, LLMs work pretty well. But most other applications of AI are more like: "LOL, no idea how to solve the problem. I hooked it up to this black box, which I don't understand, and trained it to give me the results I want."
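
To make that "hook it up to a black box" shortcut concrete, here's a minimal, purely illustrative sketch (hypothetical data and model choices, not any particular startup's code): an off-the-shelf model fitted to whatever data is lying around, judged only by whether the final number looks good.

```python
# Illustrative only: the "black box" workflow described above --
# no domain analysis, just feed data in and train until the output looks right.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Pretend these are whatever measurements were available.
rng = np.random.default_rng(0)
X = rng.random((1000, 20))                              # features nobody has examined
y = (X[:, 0] + rng.random(1000) > 1.0).astype(int)      # labels of unknown quality

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=0)

# The "black box": a default-configuration model, no understanding required.
model = RandomForestClassifier()
model.fit(X_train, y_train)

# "It gives me the results I want" -- a single score, with no check on whether
# the model learned anything meaningful about the actual problem.
print("accuracy:", model.score(X_test, y_test))
```
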