this post was submitted on 28 Aug 2023
85 points (93.8% liked)

Work Reform

9823 readers
905 users here now

A place to discuss positive changes that can make work more equitable, and to vent about current practices. We are NOT against work; we just want the fruits of our labor to be recognized better.

founded 1 year ago

Emotion recognition systems are finding growing use, from monitoring customer responses to ads to scanning for ‘distressed’ women in danger.

all 29 comments
[–] AllonzeeLV 40 points 1 year ago* (last edited 1 year ago) (1 children)

Too bad it gets the emotion and not the context.

I'd love to be fired because "I hate making money for these greedy ass capitalist douchebags" pops up on a screen whenever I come in.

The idea that employers should even be allowed to ask what their employees are feeling, much less scan them to discern it, is a new low for our modern Orwellian dystopia.

[–] [email protected] 14 points 1 year ago (2 children)

The thing is though, I don't see how something like this could even work out.

Like, you hire employee 1, they get frustrated at something overnight. You fire them for being upset. Now you have to fill the seat. Employee 2 is brought on. They get told what happened to the person they replaced. They leave or are fired for having emotions and being human. This repeats ad nauseam.

[–] [email protected] 12 points 1 year ago (1 children)

Let's be real, most of us would get weeded out at the interview when they start spilling all the "we're like a family" bullshit.

[–] randon31415 2 points 1 year ago

What type of family? Found family? The kind of family that requires restraining orders for abuse? The kind that only sees each other on Christmas?

[–] AllonzeeLV 10 points 1 year ago* (last edited 1 year ago) (1 children)

I'm guessing it's going to be implemented as identifying "persistent negative attitudes" and as validation to fire anyone in non-fire-at-will locales.

It could also be used as a bullshit reason to deny raises and promotions if your "grateful" or "motivated" indexes weren't high enough.

[–] FringeTheory999 11 points 1 year ago

so, basically a tool to suss out which employees have undisclosed mental health issues that the employer can’t legally ask about. cool. cool.

[–] [email protected] 18 points 1 year ago (1 children)

This needs to be shut down. It's the most dystopian thing I have ever read.

[–] [email protected] 12 points 1 year ago* (last edited 1 year ago)

What's crazy is that this was already fully functional and in use at least 8 years ago. Idk how this has stayed out of the headlines until now. Microsoft had a working demo of this in their visitor center in 2015 and was already using it in multiple places. As soon as you enter the room it assigns you a persistent ID and estimates your height, weight, eye color, hair color, and age. Then it tracks your mood and the overall mood of the room continuously. The ID can be persistent across any number of linked locations. They don't ask for anyone's permission before using it.

[–] [email protected] 12 points 1 year ago

I DON'T NEED TO BE/ACT HAPPY TO GET MY JOB DONE

[–] [email protected] 12 points 1 year ago

I hope they keep a lot of drive space for the depression folder.

[–] [email protected] 9 points 1 year ago* (last edited 1 year ago)

If they could do that, they would probably see how God damn miserable most people are. If they used that to change things and make people not miserable, I don't see it being dangerous. But more than likely it will be more "your sadness doesn't vibe with us. You're fired."

[–] [email protected] 4 points 1 year ago

Great, now I have to walk around with a permanent poker face.

[–] [email protected] 3 points 1 year ago* (last edited 1 year ago)

Every day we get closer to a cyberpunk dystopia.