Facebook defends content policies after leaked documents stir controversy
With close to 2 billion monthly active users, Facebook Inc. has found itself in the tough position of policing content posted by more than a quarter of the world’s population. Over the weekend, leaked documents published by The Guardian revealed just how difficult it is for moderators to decide what is and is not allowed on the social network.
Today, Facebook responded to criticisms of its decision-making process in a blog post by Monika Bickert, Facebook’s head of global policy management, who explained that choosing which content should be removed is not as simple as it seems.
“Designing policies that both keep people safe and enable them to share freely means understanding emerging social issues and the way they manifest themselves online, and being able to respond quickly to millions of reports a week from people all over the world,” Bickert said.
“For our reviewers, there is another hurdle: understanding context. It’s hard to judge the intent behind one post, or the risk implied in another. Someone posts a graphic video of a terrorist attack. Will it inspire people to emulate the violence, or speak out against it? Someone posts a joke about suicide. Are they just being themselves, or is it a cry for help?”
According to Bickert, most of the posts that come up for review “aren’t the easy ones: they are often in a grey area where people disagree.”
Bickert added that content that may be acceptable in one country might be unacceptable in another. For example, criticizing the monarchy might be perfectly fine in the U.K., but it could result in jail time elsewhere. Facebook itself must also comply with local laws regarding content, with countries such as Germany threatening the social media giant with hefty fines if it does not remove “clearly criminal” content quickly enough.
Responding to the document leak, Bickert said that Facebook does not publish all of its policies because it does not want “to encourage people to find workarounds.” She also said that despite Facebook’s efforts to clearly define allowable content, its moderators still “sometimes end up making the wrong call.”
Facebook has received criticism for censoring some content, such as when it removed an iconic Vietnam War photo of a naked child running in the street. At the same time, the company has been criticized for not removing other content quickly enough, such as multiple instances of violence that have been live-streamed through Facebook Live.
While some have criticized Facebook for seemingly inconsistent moderation, Bickert said that the company sees criticism from both directions “as a useful signal that we are not leaning too far in any one direction.”
In recent years, Facebook has begun employing artificial intelligence to streamline its moderation process, and Chief Executive Mark Zuckerberg announced earlier this year that the company will hire an additional 3,000 moderators to handle the social network’s content review.
Photo: Facebook