Whatever you think of Facebook's stance on the Pelosi video, it's good to see them putting the relevant decision-maker in front of a real journalist to explain it. Whether out of fear, arrogance, or calculated self-interest, this is something big tech companies almost never do.
Facebook's position on this is a tough one to defend—not to say it's wrong, necessarily, it's just very nuanced—and Cooper did an excellent job picking it apart. That said, Bickert explained it clearly, and the result was an informative segment.
The truth that Bickert couldn't say is this: Facebook is still queasy about making editorial judgments (understandably), so it has tried to carve out some of the less controversial ones, like terrorism and fake accounts, while abstaining from those that are politically charged.
The Pelosi video is a tough one for them, because it's in the broad *class* of content that they consider too thorny to touch (misinformation about domestic politics), yet this *particular* video is not actually thorny—we all know it's fake. Cooper rightly pushed on that point.
This is a good example of why the hoary "tech platform vs. media company" debate is still relevant. A tech platform wants clear policies that it can apply consistently to everyone, whereas a media company makes calls case-by-case.
FB has realized the platform defense doesn't really fly anymore, but it's not remotely prepared to start adjudicating the truth of every post. Its approach to the Pelosi video—limit algorithmic reach, but don't take it down—is its compromise solution. And Cooper wasn't having it.
If I had to defend Facebook's policy, here's what I'd say: The proper locus of Facebook's editorial function is its algorithm—deciding which posts to amplify—as opposed to its content moderation process, which is about enforcing ground rules. "Free speech, but not free reach."
Yes, we know the Pelosi video is fake, and we acknowledge that leaving it up probably does damage that isn't fully mitigated by limiting its reach or posting fact-checks alongside it. But in the long run, appointing Facebook the political speech police would be worse.
But Bickert didn't say that, probably because it calls attention to the scary power that Facebook wields—a sensitive topic in a time of rising antitrust sentiment. So she went with the excuse that FB wants to let users decide for themselves what to believe.
That's a pretty slick talking point, as long as you don't think too hard about it. But Cooper was well-prepped, and exposed it as incoherent with Bickert's previous admission that the video was clearly misleading.
Bickert fell back on the distinction FB now makes between "dangerous" misinformation and plain old misinformation—which is not a crazy place to draw a line, if you're determined to draw one. But FB wants to pretend the line is clear, when it's actually fuzzy and subjective.
By asking what Facebook would do if a doctored video showed the commander-in-chief appearing drunk and unfit, Cooper illustrated how arbitrary that "dangerous misinformation" provision can seem. Bickert sensed a trap and tried to dodge, but the point was made.
The end result did not make Facebook look great, though it could have been worse. That said, I think it's important to give FB credit for showing up: It's a form of accountability that has been sorely lacking to this point, both from Facebook and the other big platforms.
You can follow @WillOremus.