
‘13 Reasons Why’ Posts Get Extra Attention From Facebook Moderators to Prevent Suicides, Leak Shows

Variety | 5/22/2017 | Janko Roettgers

A massive leak of content moderation guidelines shows that Facebook is often walking a fine line between free speech and the welfare of its users. Case in point: Facebook’s moderators have been advised to give any reported posts that make mention of the controversial Netflix show “13 Reasons Why” some extra attention to prevent suicides, according to an article in The Guardian.

The British newspaper got its hands on a treasure trove of internal documents meant to instruct content moderators on how to deal with anything from physical threats of violence to revenge porn and the depiction of animal abuse. Some of the documents specifically deal with self-harm and suicide, and the company apparently found it necessary to update its guidelines after Netflix premiered “13 Reasons Why” in March.

The show tells the story of a teenager who takes her own life — and Facebook was apparently so concerned that it might inspire copycats that it advised all of its moderators to immediately escalate any content related to the show to senior managers, according to The Guardian.

It’s worth noting that this doesn’t mean Facebook puts anyone who has ever posted about the show on suicide watch. Facebook moderators only come into play once other users have flagged a post for review. According to the leaked documents, moderators reviewed around 10,400 posts related to self-harm during one four-week period this year.

A very small percentage of these posts leads to Facebook contacting law enforcement. Last year, the site saw 4,531 such reports during a two-week period, and alerted law enforcement in 63 of those instances.

Facebook has frequently been criticized for not doing enough to moderate the content posted to its site, criticism that boiled over when video of a homicide was posted to the site by the perpetrator last month. In response, Facebook CEO Mark Zuckerberg admitted that the service had to do more, and announced the hire of 3,000 additional moderators, bringing the total number of staffers reviewing flagged content to 7,500.

“Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech,” a spokesperson told Variety via email. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”

The leaked documents show that the service frequently sides with free speech. For example, in some cases, Facebook opts to keep threats of violence on the site if they are not deemed credible or are merely aspirational. Moderators are also advised not to delete media depicting animal cruelty and non-sexual abuse of children, unless the post in question condones those acts.

Facebook is also walking a fine line on self-harm, advising moderators not to take down live streams of users who talk about or threaten self-harm. The reasoning, as expressed in one of the guidelines: “We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers.”
