this post was submitted on 20 Oct 2023
535 points (98.2% liked)

‘It scars you for life’: Workers sue Meta claiming viewing brutal videos caused psychological trauma

More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

top 50 comments
[–] dreadedsemi 121 points 1 year ago (2 children)

Couldn't they hire from watchpeopledie, nothingtoxic, or ebaum? Those users would probably do overtime for free.

[–] [email protected] 87 points 1 year ago (2 children)

People who are completely desensitized to that kind of material probably wouldn't be very good at moderating it, really.

Also, this is a terrible job, and I'd be very worried if a company were paying and enabling people who find it fun. It's horrible, but trauma is the normal outcome.

[–] WhatAmLemmy 44 points 1 year ago (4 children)

Sounds like the perfect job for AI

[–] [email protected] 40 points 1 year ago (1 children)

I feel sorry for whichever researchers are in charge of training and fine tuning those models.... ouch

[–] [email protected] 18 points 1 year ago (1 children)

Maybe they still have the content that got removed for exactly that reason; you might be able to train an AI on just that. That way nobody needs to check it manually again, since it's already been labeled.

[–] [email protected] 13 points 1 year ago* (last edited 1 year ago) (1 children)

Exactly, and even if an uploader disagrees and requests human oversight, that's just one item that needs to be checked rather than all of them. AI may even be able to blur the most brutal and extreme parts of footage and create written transcripts of the audio. You don't need 4K resolution and audible screaming to understand that someone is being murdered or raped.
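The triage idea being described could look something like this minimal sketch. All names here are hypothetical, and the `score` function stands in for a classifier trained on previously removed content; the point is that only uncertain or disputed items land in the human queue:

```python
# Hypothetical sketch: a model scores each upload for policy violation,
# and only the uncertain middle band is queued for human review.
from dataclasses import dataclass, field

@dataclass
class TriageResult:
    auto_removed: list = field(default_factory=list)
    auto_approved: list = field(default_factory=list)
    human_queue: list = field(default_factory=list)

def triage(uploads, score, remove_above=0.95, approve_below=0.05):
    """score(item) -> probability the item violates policy (stand-in model)."""
    result = TriageResult()
    for item in uploads:
        p = score(item)
        if p >= remove_above:
            result.auto_removed.append(item)   # confident violation
        elif p <= approve_below:
            result.auto_approved.append(item)  # confident non-violation
        else:
            # Uncertain: a human reviews one (blurred, transcribed) item
            # instead of the whole firehose.
            result.human_queue.append(item)
    return result

# Toy scores standing in for a trained classifier's output.
scores = {"cat.mp4": 0.01, "fight.mp4": 0.50, "gore.mp4": 0.99}
r = triage(scores.keys(), scores.get)
print(r.human_queue)  # only "fight.mp4" needs human eyes
```

The thresholds are a tradeoff: widening the uncertain band catches more model mistakes but exposes reviewers to more material.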

[–] [email protected] 10 points 1 year ago

I am of the kind that is very wary with what should or should not be an AI's job, and you know what, in this very particular case, I think I agree.

At least as a first filter, anyway.

[–] [email protected] 4 points 1 year ago

Huge industries are emerging in this field right now, for everything from this type of social media moderation to fighting CSAM more effectively, so humans don't have to be the front line for that type of material. This is one area where I can really, really get behind AI and see a valid use case that isn't just marketing hype like so many others. I know there's some great stuff happening, just based on my own field of employment and being close to a few things in the works this year.

[–] [email protected] 20 points 1 year ago (4 children)

Honestly, I don't see an issue with it. If they can tell the difference between an image that should be moderated and one that shouldn't, they can do the job, and I seriously doubt the vast majority of people desensitized to that kind of content can't tell the difference. That's like arguing we shouldn't make graphic games or movies because people won't be able to tell the difference between them and reality. Not everyone can do every job; these people would be a perfect fit for it, and we would spare others from getting hurt.

[–] [email protected] 15 points 1 year ago

Desensitized doesn't necessarily mean somebody doesn't have reactions to something. It just means they can compartmentalize those reactions and move forward and deal with the ramifications later.

EMTs, ER doctors, and nurses are largely desensitized to graphic trauma and can press through and get the job done. But that doesn't mean they don't process those scenes later in both healthy and unhealthy ways (there are a few studies out there showing ER staff have higher rates of alcoholism and substance abuse than the general public).

Trauma is trauma, whether you're desensitized or not.

[–] [email protected] 6 points 1 year ago* (last edited 1 year ago)

It would be highly unethical but interesting research to see whether those people experience long-term consequences nevertheless, or whether being desensitized really does confer immunity.

[–] [email protected] 5 points 1 year ago

Except, you know, we're talking about people who are progressively desensitized to reality. So no, that's not comparable at all.

[–] killeronthecorner 18 points 1 year ago (1 children)

You're not thinking awful enough

[–] dreadedsemi 2 points 1 year ago

I've seen users laugh at horrific gore videos on some forums. I'm not sick, but was curious at one point and googled.

[–] [email protected] 53 points 1 year ago

Fuck that job.

[–] kayrae_42 45 points 1 year ago (1 children)

Secondary trauma is very real. I've done freelance work around focus groups for some traumatic things, and the person I worked with made sure I had proper support for it, and that I took time to process the disturbing material. I don't understand why, once they had obviously found a video that violated guidelines, moderators had to watch the entire thing if they weren't going to be given proper psychological support or processing time.

Jobs that involve this type of imagery are traumatic and should be treated as such: extra vacation time and proper psychological support, not just being told "you're doing important work" but actual trauma-processing work. But when has a large company like Meta cared about its employees?

[–] SoleInvictus 7 points 1 year ago

But when has a large company like Meta cared about its employees?

You pretty much nailed it. A company like Meta would never fail to maximize their profit, even if the result is detrimental to employees and/or users. Facebook is demonstrably detrimental to society in general, yet they don't care - gotta profit more and more for the stockholders, consequences be damned.

[–] [email protected] 41 points 1 year ago (2 children)

I've heard this is also the case for civilians and officers working for police departments who are responsible for handling evidence related to child abuse. Takes a lot of psychological support, which I'm not sure can ever be enough.

[–] Gradually_Adjusting 22 points 1 year ago (3 children)

Not every job is something humans are cut out for. This is a job AI should be taking off our plates.

[–] [email protected] 12 points 1 year ago (2 children)

But then who will give therapy to the AI?

[–] Gradually_Adjusting 10 points 1 year ago (1 children)

That's the beauty of it. Each new instance of the AI has no prior memories, so the sorry devil gets to relive its first day on the job forever.

[–] [email protected] 2 points 1 year ago (1 children)

If it turns out the AI was sentient or was believed to be, that will lead to a huge mess.

[–] Gradually_Adjusting 2 points 1 year ago

We know they are not. The only guy seriously working toward conscious AI that I know of is Jeff Hawkins.

[–] [email protected] 5 points 1 year ago* (last edited 1 year ago) (1 children)

We get a dedicated AI to monitor the wellbeing of the first AI. The moment it does something unexpected, we pull the plug.

Though we don't know what effect that may have on AI 2, so we should probably get a third AI to…

[–] PrinzMegahertz 5 points 1 year ago (2 children)

So we train our AIs to hide their mental health?

[–] [email protected] 7 points 1 year ago

It's what we do to people, and that has been working flawlessly with no side effects /s

Context: "professional autist"

[–] [email protected] 4 points 1 year ago

Oh so it's just like normal employee training.

[–] [email protected] 12 points 1 year ago

Many places like that have protocols for how long your shifts can be and how long you can do it, with constant psych support while you do it in order to reduce and mitigate the impact of the material. Meta may have been pushing their workers too hard and cutting corners.

[–] [email protected] 15 points 1 year ago

This is the best summary I could come up with:


More than 20% of the staff Meta hired to check the violent content of Facebook and Instagram are on sick leave due to psychological trauma.

More than 20% of the staff of CCC Barcelona Digital Services - owned by Telsus, the company that Meta hired to check the content of Facebook and Instagram, are on sick leave due to psychological trauma.

The images posted on the social networks they were supposed to check showed the worst of humanity: videos of murders, dismemberments, rapes and live suicides.

"He sticks a knife in its chest, rips out its heart and eats it," Francesc Feliu, lawyer for more than a dozen workers who decided to sue the company, told Euronews.

"The psychologist would listen to them and then tell them that what they were doing was extremely important for society, that they had to imagine that what they were seeing was not real but a film, and that they should go back to work," says the Spanish lawyer.

Both lawyers agree that Meta's policy of forcing employees to watch the entire video in order to explain all the reasons for censorship aggravates the trauma.


The original article contains 986 words, the summary contains 191 words. Saved 81%. I'm a bot and I'm open source!

[–] [email protected] 3 points 1 year ago

Real-world brown notes and BLITs
