UPDATED 23:39 EDT / MAY 21 2017


Leaked ‘Facebook Files’ show how a social media giant struggles to moderate content

A report released on Sunday by The Guardian reveals in considerable detail how Facebook Inc. tries to keep graphic content off its platform, and, not surprisingly, it’s a Herculean task.

What the investigation unearthed in a series of reports on Facebook’s moderation protocols is likely to become a matter of debate and consternation. For instance, the leaked documents show that Facebook will allow the live-streaming of “self-harm” content, but only because blocking it could amount to ignoring a cry for help. The documents obtained by the newspaper reveal that thousands of such videos appear on Facebook weekly.

The internal manuals cover just about every one of the myriad ethical obligations Facebook faces in keeping the platform safe. The moderation guidelines address pornography, graphic violence, hate speech, racism, terrorism, child abuse, sports match-fixing and even cannibalism.

The report illustrates the difficulty and complexity of moderating content coming from Facebook’s 2 billion users. While the pressure is on Facebook and other social media sites to crack down on content deemed harmful, a source told The Guardian, “Facebook cannot keep control of its content. It has grown too big, too quickly.”

Certain content may be automatically flagged, but moderators are left with the unenviable job of deciding what is harmful and what is not. One manual on “credible violence” states that users cannot say, “Someone shoot Trump,” because he’s a head of state. Yet while that comment might be jocular in nature, the threat to “snap a bitch’s neck” is given the OK, apparently because it is not regarded as a credible threat.


Facebook acknowledges that censoring all forms of violent language, especially between friends, would deny people a way of airing their frustrations. “From this perspective language such as ‘I’m going to kill you’ or ‘Fuck off and die’ is not credible and is a violent expression of dislike and frustration,” one of the leaked manuals states.

In cases in which videos show actual deaths, such as traffic accidents, Facebook says such videos can raise awareness, although this kind of content needs to be flagged as “disturbing” and should be “hidden from minors.” Facebook has been called out as hypocritical at times regarding its moderation policy, not least when it blocked the iconic Vietnam War photo of a naked girl running through the street, an image that could be said to have raised awareness of the brutality of war.

Another difficult area is content that is plainly unpleasant, such as animal abuse or nonsexual child abuse, but whose circulation could help identify perpetrators or aid victims. Facebook has brought to the public’s attention various cases of police brutality, and while such videos are disturbing, they may be helpful in exposing social problems. The site was criticized last year when it took down the video of Philando Castile being shot by a police officer. The video was eventually restored after Facebook admitted the removal was a mistake.

The site has also been criticized for not taking violent content down fast enough, notably the video of a father killing his child in Thailand and the murder of a 74-year-old man in Cleveland this year. The consensus among both free speech advocates and those calling for more regulation is that Facebook faces a nearly impossible task in fairly and consistently moderating content from much of the world’s population.

Image: Alan Levine via Flickr
