this post was submitted on 08 Aug 2023
413 points (98.1% liked)

Technology

Detroit woman sues city after being falsely arrested while pregnant due to facial recognition technology

A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

all 26 comments
[–] [email protected] 121 points 1 year ago (3 children)

According to a recent review, 100% of the people falsely arrested via facial recognition findings have been black.

The technology needs to be legally banned from law enforcement applications, because law enforcement is not making a good faith effort to use the technology.

[–] rockSlayer 44 points 1 year ago (1 children)

We should ban patrol automation software too. They utilize historical arrest data to help automatically create patrol routes. Guess which neighborhoods have a history of disproportionate policing.
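The feedback loop being described can be sketched in a few lines. This is a toy simulation with made-up numbers, not any real patrol software: two neighborhoods have the same underlying crime rate, but one starts with more historical arrests, and patrols are allocated proportionally to arrest data.

```python
import random

random.seed(1)

# Two hypothetical neighborhoods with the SAME true crime rate,
# but neighborhood A starts with more historical arrests because
# it was patrolled more heavily in the past (the initial bias).
true_crime_rate = 0.05
arrests = {"A": 100, "B": 50}  # assumed historical arrest counts

for year in range(10):
    total = sum(arrests.values())
    for hood in arrests:
        # Patrols are allocated in proportion to past arrest data...
        patrols = 1000 * arrests[hood] / total
        # ...and arrests can only happen where patrols are, so the
        # more-patrolled neighborhood generates more arrest records.
        arrests[hood] += sum(
            1 for _ in range(int(patrols)) if random.random() < true_crime_rate
        )

print(arrests)
```

Despite identical underlying crime, the arrest gap never closes and the absolute disparity keeps growing, because the data feeds itself: more arrests → more patrols → more arrests.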

[–] [email protected] 17 points 1 year ago (1 children)

The problems with the approaches that tend to get used should cause absolute outrage. They're methods that would get anyone laughed off any college campus.

The problem is that they lend a semblance of scientific justification to confirm the biases of both police departments and many voters. Politicians look to statisticians and scientists to tell them why they’re right, not why they’re wrong.

That’s why it’s so important for these kinds of issues to make the front pages.

[–] brygphilomena 3 points 1 year ago

It's great how statistics can be used to support basically anything the author wants them to. Identifying initial biases in the data is just as important as verifying the statistics independently.

[–] lawrence 16 points 1 year ago (2 children)

100% !?

I think that facial recognition software is a bit biased.

[–] RojoSanIchiban 10 points 1 year ago (1 children)

Developer, here. Working as intended.

*issue resolved

[–] [email protected] 9 points 1 year ago

Works on my machine

[–] [email protected] 5 points 1 year ago

A similar thing happened here in the Netherlands. Algorithms were used to detect fraud, but they had a discriminatory bias and falsely accused thousands of parents of child benefits fraud. Those parents ran into huge financial problems because they had to pay back the allowances; many even had their children taken away and to this day haven't gotten them back.

The Third Rutte Cabinet did resign over this scandal, but many of those politicians, including Prime Minister Rutte, came back in other positions, because that's somehow allowed.

Wikipedia (English): https://en.m.wikipedia.org/wiki/Dutch_childcare_benefits_scandal

[–] [email protected] 31 points 1 year ago (1 children)

Facial recognition by law enforcement should be banned.

[–] [email protected] 24 points 1 year ago

This is the best summary I could come up with:


A Detroit woman is suing the city and a police detective after she was falsely arrested because of facial recognition technology while she was eight months pregnant, according to court documents.

Porcha Woodruff, 32, was getting her two children ready for school on the morning of Feb. 16 when six police officers showed up at her doorstep and presented her with an arrest warrant alleging robbery and carjacking.

"Ms. Woodruff later discovered that she was implicated as a suspect through a photo lineup shown to the victim of the robbery and carjacking, following an unreliable facial recognition match," court documents say.

When Oliver learned that a woman had returned the victim's phone to the gas station, she ran facial recognition technology on the video, which identified the woman as Woodruff, the lawsuit alleges.

On the day Woodruff was arrested, she and her fiancé urged officers to check the warrant to confirm whether the woman who committed the crime was pregnant, which they refused to do, the lawsuit alleges.

The office confirmed that facial recognition prompted police to include the plaintiff's photo in a six-pack, or array of images of potential suspects in the warrant package.


I'm a bot and I'm open source!

[–] Alexstarfire 22 points 1 year ago (2 children)

I'm going to buck the trend here and say this is less about the facial recognition software. The police used an 8-year-old photo even though they had something more recent available. Then the victim identified the woman from the lineup. The only thing the software did was put her in the lineup.

I'm very much against facial recognition, even if it's 100% accurate. It's because it will get abused. Just like any other tech that reduces privacy.

[–] [email protected] 14 points 1 year ago

Eyewitnesses are notoriously unreliable at picking people out of a lineup as well. But I can kind of understand how if two unreliable systems point to the same person, that could be seen as enough for an arrest. It shouldn't have taken nearly as long for her to be cleared of any charges, however.
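That intuition can be checked with Bayes' theorem. The numbers below are made-up assumptions for illustration, not measured figures for either system, but they show why two unreliable signals agreeing still shouldn't inspire much confidence when the pool of possible suspects is large:

```python
# All rates here are illustrative assumptions.
prior = 1 / 1000          # chance a given person is the actual culprit
hit_if_guilty = 0.9       # each system flags the culprit 90% of the time
hit_if_innocent = 0.05    # each system flags an innocent person 5% of the time

# Probability that two *independent* systems both flag the same person:
p_both_given_guilty = hit_if_guilty ** 2
p_both_given_innocent = hit_if_innocent ** 2

# Bayes' theorem: P(guilty | both agree)
posterior = (p_both_given_guilty * prior) / (
    p_both_given_guilty * prior + p_both_given_innocent * (1 - prior)
)
print(f"P(guilty | both systems agree) = {posterior:.0%}")
```

Even with agreement, the posterior comes out to roughly a one-in-four chance of guilt under these assumptions. And in this case the two signals weren't even independent: the facial recognition match is what put her photo in the lineup in the first place, so the real confidence should be lower still.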

[–] [email protected] 7 points 1 year ago

It’s sort of the “guns don’t kill people, people kill people” argument. It just gives a shitty cop cover to keep being shitty. The tools shouldn't provide that cover unless they're far more accurate.

[–] TIEPilot 12 points 1 year ago* (last edited 1 year ago)

And all the facial recognition has failed on black people... Is this a Family Guy episode?

[–] [email protected] 6 points 1 year ago

Fuck yes. Go set some precedent.