03 November 2021

LET IT BE THE WAY IT IS: The Impossibility of Content Moderation at Scale

The Scale Of Content Moderation Is Unfathomable

from the it's-way-more-than-you-think dept

Sometimes it's difficult to get across to people the "scale" part when we talk about the impossibility of content moderation at scale. It's massive. And this is why, whenever there's a content moderation decision you dislike or disagree with, you have to realize that it's not personal. It wasn't done because someone doesn't like your politics. It wasn't done because of some crazy agenda. It was done because a combination of thousands of people around the globe and still-sketchy artificial intelligence is making an insane number of decisions every day. And they just keep piling up and piling up and piling up.

Evelyn Douek recently gave a (virtual) talk at Stanford on The Administrative State of Content Moderation, which is worth watching in its entirety. However, right at the beginning of her talk, she presented some stats that highlight the scale of the decision making here. Based on publicly revealed transparency reports from these companies, in just the 30 minutes allotted for her talk, Facebook would take down 615,417 pieces of content, YouTube would take down 271,440 videos, channels, and comments, and TikTok would take down 18,870 videos. And, also, the Oversight Board would receive 48 petitions to review a Facebook takedown decision.
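To make the arithmetic behind figures like these concrete, here is a minimal sketch (in Python) of how a per-30-minute or per-hour takedown rate can be derived from the aggregate totals companies publish in their transparency reports. The quarterly total below is a hypothetical placeholder chosen only so the output lands near the cited figures; it is not a number from Douek's talk or from any actual report.

```python
# Back-of-the-envelope conversion from a quarterly transparency-report total
# to average per-interval takedown rates, assuming a uniform rate over the quarter.
# NOTE: the quarterly total below is an illustrative placeholder, not a real report figure.

HOURS_PER_QUARTER = 91 * 24  # roughly 91 days in a quarter


def takedowns_per_interval(quarterly_total: int, minutes: int) -> float:
    """Average number of takedowns in a window of `minutes` minutes."""
    per_hour = quarterly_total / HOURS_PER_QUARTER
    return per_hour * (minutes / 60)


if __name__ == "__main__":
    # Hypothetical quarterly total, picked so the result is in the same ballpark
    # as the ~615,000-per-30-minutes figure cited in the talk.
    hypothetical_quarterly_total = 2_700_000_000
    print(f"per 30 minutes: {takedowns_per_interval(hypothetical_quarterly_total, 30):,.0f}")
    print(f"per hour:       {takedowns_per_interval(hypothetical_quarterly_total, 60):,.0f}")
```

Run against that placeholder total, the sketch prints roughly 618,000 takedowns per 30 minutes and about 1.24 million per hour, which is the same order of magnitude as the "over a million pieces of content every hour" point made below.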

And, as she notes, that's only the take down decisions. It does not count the "leave up" decisions, which are also made quite frequently. Facebook is not targeting you personally. It is not Mark Zuckerberg sitting there saying "take this down." The company is taking down over a million pieces of content every freaking hour. It's going to make mistakes. And some of the decisions are ones that you're going to disagree with.

And, to put that in perspective, she notes that in its entire history, the US Supreme Court has decided a grand total of approximately 246 1st Amendment cases, or somewhere around one per year. And, of course, each of those cases often involves years of debates, and arguments, and briefings, and multiple levels of appeals. And sometimes the Supreme Court still gets it totally wrong. Yet we expect Facebook -- making over a million decisions to take content down every hour -- to somehow magically get it all right?

Anyway, there's a lot more good stuff in the talk and I suggest you watch the whole thing to get a better understanding of the way content moderation actually works. It would be helpful for anyone who wants to opine on content moderation to not just understand what Douek is saying, but to really internalize it.

