Facebook explains how it reviews content after critical NYT report
The NYT claims to have accessed 1,400 pages from the rulebook used by moderators to monitor posts on Facebook and tackle issues such as extremism and hate speech in various countries. The article goes on to say that at Facebook, highly complex issues are distilled into simple yes-or-no rules, which leads to errors in moderating content properly.

For instance, moderators were mistakenly told to take down comments critical of religion in India. The article cited another example from Myanmar, where a prominent extremist group was allowed to stay on the platform for months due to what Facebook admitted was a paperwork error.

“The Facebook employees who meet to set the guidelines, mostly young engineers and lawyers, try to distill highly complex issues into simple yes-or-no rules,” the article reads. “Those moderators, at times relying on Google Translate, have mere seconds to recall countless rules and apply them to the hundreds of posts that dash across their screens each day,” it adds.

It points out that the rules governing what the site’s two billion users are allowed to say are discussed by Facebook employees every Tuesday morning “over breakfast” and are circulated to more than 7,500 moderators globally.

Facebook said in a blog post that its gathering “over breakfast” is in fact a global forum attended by “experts from around the world with deep knowledge of relevant laws, online safety, counter-terrorism, operations, public policy, communications, product, and diversity”. In addition to lawyers and engineers, the meeting, held every two weeks, also includes human rights experts.

The social media giant explained that it has close to 15,000 content reviewers around the world, who are “supplied with training and supporting resources” rather than relying on Google Translate. Facebook said it reviews content in more than 50 languages.