Facebook on moderation and community standards: 'we take our role extremely seriously'


Recently leaked documents gave a unique insight into just what can be posted on Facebook, and what moderators are expected to censor. In response, the company's head of global policy management, Monika Bickert, has written a lengthy statement in which she tries to explain how Facebook chooses what to censor.

While some of Facebook's policies are well known, this is not true of all of them, and not without reason. Bickert explains: "We don't always share the details of our policies, because we don't want to encourage people to find workarounds." She says that Facebook faces a difficult task in determining whether a reported post should be removed or not.

With over a billion people using Facebook every day, the site's moderators have a great deal of content to deal with, though only content that is reported gets investigated. Even so, that means millions of reports to handle each week, and policies need to be devised to "both keep people safe and enable them to share freely."

There are a lot of gray areas to work around. "For our reviewers, there is another hurdle: understanding context. It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help? In the UK, being critical of the monarchy might be acceptable. In some parts of the world it will get you a jail sentence. Laws can provide guidance, but often what’s acceptable is more about norms and expectations. New ways to tell stories and share images can bring these tensions to the surface faster than ever."

Bickert says that Facebook's standards are constantly evolving:

Our standards change over time. We are in constant dialogue with experts and local organizations, on everything from child safety to terrorism to human rights. Sometimes this means our policies can seem counterintuitive. As the Guardian reported, experts in self-harm advised us that it can be better to leave live videos of self-harm running so that people can be alerted to help, but to take them down afterwards to prevent copycats. When a girl in Georgia, USA, attempted suicide on Facebook Live two weeks ago, her friends were able to notify police, who managed to reach her in time.

We try hard to stay objective. The cases we review aren’t the easy ones: they are often in a gray area where people disagree. Art and pornography aren’t always easily distinguished, but we’ve found that digitally generated images of nudity are more likely to be pornographic than handmade ones, so our policy reflects that.

Interestingly, Facebook treats the fact that it receives requests for both more censorship and less as a sign of success. It serves "as a useful signal that we are not leaning too far in any one direction."

Bickert's post is worth reading. It doesn't offer much in the way of concrete detail, but it does illustrate just how much Facebook is affected by the criticism leveled at it.

Image credit: Ink Drop / Shutterstock

