Matthew Ball @ballmatthew Venture investor, strategist, essayist, that guy on Twitter. Prev. Head of Strategy @AmazonStudios, ex-Otter Media, @MediaREDEF. 🇨🇦 Mar. 17, 2019 2 min read

1/ There is this incredibly unhelpful POV, led here by the Guardian's Tech Editor, that (A) the platforms are entirely capable of instantly stopping the distribution of unsavory content, and (B) it would take only one person - one - to fix this

This is entirely disconnected from the truth

2/ Per Facebook: "In the first 24 hours [after the NZ attack] we removed 1.5 million videos of the attack globally, of which over 1.2 million were blocked at upload"
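(For context on what "blocked at upload" means in practice: platforms generally match new uploads against fingerprints of copies they've already identified. The sketch below is a rough, hypothetical illustration of that idea, not Facebook's actual system; real systems use perceptual hashes that survive re-encoding, not a plain cryptographic hash.)

```python
# Hypothetical sketch of "blocked at upload": keep fingerprints of known copies
# of the video and reject any new upload that matches. Names are illustrative.
import hashlib

known_fingerprints = set()  # fingerprints of already-identified copies

def fingerprint(video_bytes: bytes) -> str:
    # A plain SHA-256 only catches byte-identical re-uploads; real systems use
    # perceptual hashes that tolerate re-encoding, cropping, and overlays.
    return hashlib.sha256(video_bytes).hexdigest()

def handle_upload(video_bytes: bytes) -> str:
    if fingerprint(video_bytes) in known_fingerprints:
        return "blocked at upload"
    # Otherwise the copy goes live and can only be caught later,
    # by user reports, human moderators, or classifiers.
    return "published"

def mark_as_known(video_bytes: bytes) -> None:
    # Once a moderator identifies a copy, its fingerprint is added so
    # future re-uploads of that exact file are blocked automatically.
    known_fingerprints.add(fingerprint(video_bytes))
```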

3/ The company has over 10k moderators working on this. I've heard of 10x engineers. I've never heard of 10,000x content moderators. The math, even if you believe FB exaggerated, shows that manual hiring isn't the complete, scalable solution
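(To make that math concrete, here's a rough back-of-envelope using the thread's own figures; the one-minute review time is an illustrative assumption, not a Facebook number.)

```python
# Back-of-envelope math behind tweet 3: 1.5M uploads of one video in 24 hours
# vs. ~10k moderators.
uploads_24h = 1_500_000
seconds_per_day = 24 * 60 * 60

uploads_per_second = uploads_24h / seconds_per_day   # ~17 copies arriving every second
review_seconds_each = 60                             # assume one minute per human review
moderators_needed_nonstop = uploads_per_second * review_seconds_each

print(f"{uploads_per_second:.1f} uploads/sec -> "
      f"~{moderators_needed_nonstop:.0f} moderators reviewing non-stop, "
      f"for this single video alone")
# ~1,042 people working around the clock on one video, before counting every
# other video, language, and policy area the same ~10k moderators must cover.
```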

4/ And these 1.5MM uploads exclude YouTube!

One common response is that person x would have stopped video y. Traffic is more fluid than that; without video y, video z would have been the viral one

5/ The emotional problem is that success (blocking videos) is diffuse, while the failures (viral sharing of the video) are concentrated

But also consider: 1.5MM uploads show there are even more people trying to upload the video than FB could reasonably hire to stop them

6/ Just as the idea that you just need one moderator querying away misses the point, so does the (sad, but wrong) assumption that there are only a few people uploading a few of these videos

7/ Exacerbating this is the number of ethically, culturally and societally beneficial uses of related content (or even parts of the video), such as a YouTuber covering it, a clip from The Young Turks or a CBS News segment

FB doesn't want to preemptively delete content

8/ It may be reasonable to err toward more deletions rather than fewer with this kind of content. But that has its own consequences - not just for channel economics, education, and so on - and the platforms can't (and lack the right to) indiscriminately block content, en masse, by accident

9/ What happened in NZ is an intolerable tragedy. The sharing of this content exacerbates the tragedy and encourages copycats.

But the claim that "one person" can fix these uploads isn't true; it riles people further and distracts from the solutions

10/ It's true these platforms are demonstrably terrible at this, years after it became a problem, and some issues (e.g. pedophile rings on YouTube) are stunningly developed. I'm also not sure how an NZ attack video hits several million before being taken down

11/ But these conversations require specifics around infrastructure, volume in vs. out, what brute-force moderation can and can't do, and what the challenges are.

It's not about one moderator and one video.

12/ We also have different goals depending on the content involving ultra-violence. Terrorists looking to promote their cause are very different from those trying to publicly release and distribute evidence of government abuses, murders, etc.

E.g. Raqqa Is Being Slaughtered Silently, or the Arab Spring

13/ Requiring pre-moderation before release and/or verified accounts absolutely stifles (and enables the crackdown on and persecution of those behind) releases relating to state-level abuses and horrors. Algorithmic filtering of violence/blood/screams/gunshots can't distinguish purpose or intent
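(A rough, hypothetical illustration of that limitation: a signal-only classifier gives the same score to propaganda and to documentation of abuses, because it never sees intent or context. Everything below is made up for illustration, not any platform's actual pipeline.)

```python
# A violence classifier scores the signal (blood, screams, gunshots), not the purpose.
from dataclasses import dataclass

@dataclass
class Upload:
    violence_score: float   # what an audio/visual model can estimate
    uploader: str           # context the model never sees
    description: str

def signal_only_filter(u: Upload, threshold: float = 0.8) -> str:
    # The model sees only pixels and audio, so both uploads below
    # get the same verdict regardless of intent.
    return "block" if u.violence_score >= threshold else "allow"

propaganda = Upload(0.95, "anonymous_account", "attack footage shared to glorify it")
evidence   = Upload(0.95, "citizen_journalist", "footage documenting state violence")

print(signal_only_filter(propaganda))  # block
print(signal_only_filter(evidence))    # block -- the documentation of abuses is lost too
```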

14/ Context is also important

- A real gunshot sound in non-fiction vs. fiction
- A real gunshot in acceptable non-fictional content vs. unacceptable
- A real gunshot in unacceptable content editorially approved by CNN vs. a random uploader

Etc.

