- cross-posted to:
- [email protected]
- [email protected]
- [email protected]
A part of the job, nonetheless. I can only hope they were provided counselling during their time working at Meta; if not, that would be a major oversight by the company. There are certain people suited to this job… It's not one for everyone, obviously.
To add some context, here's the key line from the article: "More than 140 moderators have been diagnosed with severe post-traumatic stress disorder caused by exposure to graphic social media content including murders, suicides, child sexual abuse and terrorism."
I couldn’t be a social media moderator; seeing those kinds of graphic images would probably make me go insane or drive me to suicide. It’s awful that these people have to do this job, and Facebook et al. should be giving them as much support as they need. Most days I think we should shut down all social media.
It’s been said before, but this is the sort of job it would be great to have AI take over from humans. Nobody should have to see such horrific things.
Excellent point. Hopefully it will, as it seems like it’s within the scope of AI capability.
deleted by creator
Wasn’t intended to be that. Good point though, thanks for the feedback.
Oh hey, look, yet again we see how workers are treated when there are no regulations to protect them and an employer can risk their health for profit.