Netflix's series 13 Reasons Why instantly became the most talked-about show of 2017. But there's been no shortage of controversy surrounding the series. Medical experts have expressed concerns that it may pose a danger to teenagers experiencing suicidal thoughts, while others have pointed out that it fails to address the underlying mental health issues of its protagonist, Hannah.
The Guardian recently obtained a large number of internal moderation documents from Facebook. The instructions cover how to handle a variety of issues, including physical threats, revenge porn, and depictions of animal abuse. But it's noteworthy that a specific TV series caused Facebook to update its guidelines.
According to the documents obtained by The Guardian, Facebook is concerned about potential "copycat" suicide attempts — and for good reason: suicide contagion has been shown to be a very real and serious phenomenon. Facebook has advised its moderators to escalate all content related to 13 Reasons Why to senior managers.
This, of course, doesn't mean that any user who posts about the series will be put on suicide watch — but moderators will thoroughly review any posts about the show that have been flagged by concerned Facebook users.
“Keeping people on Facebook safe is the most important thing we do. We work hard to make Facebook as safe as possible while enabling free speech,” a spokesperson told Variety. “This requires a lot of thought into detailed and often difficult questions, and getting it right is something we take very seriously.”
It's important to note that the leaked documents show Facebook isn't attempting to infringe on freedom of speech. For example, moderators have been advised not to remove live streams of users who threaten self-harm.
“We don’t want to censor or punish people in distress who are attempting suicide. Experts have told us what’s best for these people’s safety is to let them livestream as long as they are engaging with viewers,” according to Facebook's guidelines.