The Facebook files: death, abortion and hate

Status update: Six real examples from the leaked guidelines used to train Facebook moderators.

The Guardian has published a huge investigation into how Facebook polices its content. The leaked files reveal the complex, often bizarre rules about what users are “allowed” to post.

Facebook is facing fresh scrutiny over the way it moderates content on its platform after a newspaper investigation revealed guidelines for staff on how to handle posts featuring issues such as violence, hate speech, child abuse and pornography.

According to Facebook’s moderator guidelines, which were leaked to The Guardian and published on Sunday night, it is acceptable to post threats of violence on the platform as long as they are not too specific.

People often do so in “facetious and unserious ways”, the rulebook explains.

But if the threat could “credibly cause real world harm”, it must be deleted. Ditto if it is directed at a head of state or a “vulnerable” group of people — which it lists as foreigners, Zionists or homeless people.

Another set of guidelines explains that images of child abuse can be published as long as they are not sexual or “celebratory”.

Videos of violent deaths can be shown if they “create awareness”. Live streams of self-harm are allowed so as not to “censor or punish people in distress”.

The list goes on. “Handmade” art can show nudity, but not “digitally made art”. Abortions can be shown, but not if they involve nudity. Sexual references are allowed in a “humorous context”, but cannot be too explicit.

The Guardian says that it has seen “more than 100” of these training manuals, which have been written for Facebook’s 4,500 moderators.

They include a vast range of disturbing and controversial issues, from match-fixing to cannibalism.

The newspaper said that the site’s moderators are often “overwhelmed” by the number of reports they must police on the site. It quoted one source who complained that “Facebook cannot keep control of its content. It has grown too big, too quickly.”

The “Facebook Files”, as the reports have been dubbed, are just the latest controversy to hit the social network. It is now approaching two billion users. Those users watch more than 100 million hours of video on the site every day.

Anti-social network

Facebook has been widely condemned for the contents of its training manuals. “They need to throw them away and start again,” said Claire Lilley, head of child safety online at the NSPCC. Critics say the rules are confusing, often offensive, and may even put people at risk. Facebook must do better.

Have some sympathy, urge others. Working as a moderator must be a gruelling, harrowing job. The company has a lot more to learn — but it has never claimed otherwise, and it has been forced to make things up as it goes. Nothing like Facebook has ever existed before, connecting so many people so instantly. It will take time to get it right.

You Decide

  1. Read the six statements which are allowed or not allowed on Facebook in this story’s image. Do you agree with each decision?
  2. Should critics have more sympathy for Facebook?

Activities

  1. Class debate: Facebook should not censor any content.
  2. Write your own guidelines for what moderators should and should not delete from Facebook.

Some People Say...

“Facebook is the most important invention of the 21st century.”

What do you think?

Q & A

What do we know?
The scale of Facebook’s task is huge. One of the leaked documents showed that moderators received almost 51,300 “revenge porn” complaints in January this year alone. Facebook currently employs 4,500 moderators working in 40 languages, and recently said it will hire 3,000 more. Once hired, they receive two weeks of training, plus “more than 100” sets of guidelines. This is the first time that those guidelines have been revealed to the public.
What do we not know?
We do not know who leaked the files to The Guardian, or whether Facebook will change its rules in response to the criticism it has received over the last two days. We also do not know how strictly the rules are enforced, or whether any of them are “out-of-date” (Facebook says that some of them are, but not which ones).

Word Watch

Zionists
People who believe in creating and supporting a Jewish nation. In the modern era, this means supporting the state of Israel.
Child abuse
Facebook says it allows such images because they may help the child to be identified. However, this approach has been criticised by child protection experts.
Self-harming
Facebook says that in cases of live attempted suicide, it will try to alert agencies who can help. However, it will delete the video once “there’s no longer an opportunity to help.”
4,500 moderators
Facebook recently announced that it would hire 3,000 more moderators. They work all around the world, in many different languages. Some content — like child sexual abuse and terrorism — is automatically picked up by algorithms. Most other problems must be reported by users. Moderators can then decide whether to ignore, delete, or “escalate” the content (by sending it to their manager).
100 million hours
According to statistics released by Facebook in January 2016.
Two hours
In the case of the murder in Thailand, the two videos remained on Facebook for 24 hours, and were viewed over 300,000 times.
