@dhh (DHH) — Creator of Ruby on Rails, Founder & CTO at Basecamp, NYT best-selling author, and Le Mans 24h class-winning racing driver. Jun. 19, 2019 · 1 min read

"My son died here", said the father of one of Facebook's content moderators. What an absolutely deplorable working environment 😞  https://www.theverge.com/2019/6/19/18681845/facebook-moderator-interviews-video-trauma-ptsd-cognizant-tampa 

Maybe time to admit that the mission of "connecting the world" has been an abject failure. That the world is worse off from being connected in the ways Facebook has done it.

Facebook pays moderators $28,800/year to be exposed to the most vile content in the world. Content that Facebook's fundamental model exists to distribute. It's beyond broken.

To cope with the absolute terror of moderating for Facebook, employees are allotted nine minutes of "wellness time" per day. Nine minutes. Not ten. Nine. This company and the contractors it hires are an abomination.

If the stories of what Facebook's content moderators must endure were of prisoners being forced to watch that shit for 8 hours/day, we'd rightly decry such abuse as torturous. At Facebook, it's just business.

Even reading this story of the working conditions for Facebook moderators feels like a kind of psychological abuse. I cannot even imagine actually having to watch this stuff. It's so fucking disturbing.

If the byproduct of your operations is this insanely toxic psychological waste, then your operations are simply not safe for humans, and should be shut down. Just as if you were pouring lead into the local pond.

First conceived as a personal call to action, #DeleteFacebook is something I'm starting to believe should be taken more literally. Delete all of Facebook. Not individual accounts, but the whole fucking thing.

Note how all of Facebook's responses in the piece are based around the core assumption that this work simply must happen. Rather than challenging the assumption that anyone should be exposed to this, and that a system that requires such exposure is fundamentally defunct.

Instead we get odes to maybe someday helping moderators who've suffered lasting mental distress: "Of course we should do that," he said, "[but] it's really, really hard to pull it off in a legally compliant way". That's when the "legality" of helping people suddenly matters.

Finally, kudos to @CaseyNewton and @verge for reporting this story. Tech journalism has come an awful long way from the cheerleading days of the 00s.
