Until just a few days ago, only Facebook staff and insiders knew the details of the company's controversial moderation policies.
Now, whether Facebook likes it or not, those guidelines have been thrust into the spotlight by the Guardian, which leaked internal documents advising moderators on how to handle troubling content of all stripes. That includes one of the most sensitive kinds of content: livestreamed self-harm or suicide.
In general, criticism of Facebook's moderator guidelines focuses on the fact that the company has little business incentive to remove offensive content, since perceived censorship makes its platform less attractive to users. That conflict of interest is a valid concern when it comes to hate speech. It doesn't apply in the same way, however, to the problem of people livestreaming suicidal thoughts or behavior.