Casey Newton @CaseyNewton — I write The Interface. Get it by email: bit.ly/2yTbZcK | [email protected] | instagram + snap: @crumbler — Feb. 25, 2019 · 2 min read

Today I want to tell you what it's like to be a content moderator for Facebook at its site in Phoenix, Arizona. It's a job that pays just $28,800 a year — but can have lasting mental health consequences for those who do it.  https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona 

Employees can be fired after making just a handful of errors a week, and those who remain live in fear of former colleagues returning to seek vengeance. One man I spoke with started bringing a gun to work to protect himself.

In stark contrast to the perks lavished on Facebook employees, team leaders micro-manage content moderators’ every bathroom break. Two Muslim employees were ordered to stop praying during their nine minutes per day of allotted “wellness time.”

Employees have been found having sex inside stairwells and a room reserved for lactating mothers, in what one employee describes as “trauma bonding.”

Moderators cope with seeing traumatic images and videos by telling dark jokes about committing suicide, then smoking weed during breaks to numb their emotions. Moderators are routinely high at work.

Employees have begun to embrace the fringe viewpoints of the videos and memes that they are supposed to moderate. The Phoenix site is home to a flat Earther and a Holocaust denier. A former employee told me that he no longer believes 9/11 was a terrorist attack.

Employees are developing PTSD-like symptoms after they leave the company, but are no longer eligible for any support from Facebook or Cognizant. "I'm fucked up," one moderator who now has PTSD and generalized anxiety disorder told me.

I also spoke with employees at the site who told me they like their jobs, despite the challenges, and feel safe and supported at work. Not everyone emerges from this work with lasting trauma.

But this call-center model — which is also used by Google, Twitter, and others — puts essential questions of speech and security in the hands of folks who are being paid as if they're doing customer service for Best Buy.

I hope you'll take the time to read my full report, and let me know what you think. I'll have more stories from my reporting on this subject all week in The Interface.  https://www.getrevue.co/profile/caseynewton 

