UPDATED 00:30 EST / JULY 18 2018

Undercover investigation reveals Facebook isn’t moderating itself nearly as well as it should

An investigative documentary called “Inside Facebook: Secrets of the Social Network” has revealed that the site allows images of child abuse, hate speech, graphic violence and racist content to stay on the platform in some instances.

This information was discovered after a journalist working for Channel 4’s Dispatches program in the U.K. went undercover as a moderator at CPL Resources, a moderation contractor based in Dublin.

The investigation revealed that many sensitive images and videos remained on the platform even after being flagged and reported for abuse. Some violent videos were marked as “graphic content” by Facebook, but other horrifying material, such as a clip of a man stamping on a child, was deemed not to violate Facebook’s terms and conditions and was labeled only as “disturbing content.”

The investigation also found that Facebook’s promise to review flagged content within 24 hours was not always kept, even in cases where the content related to self-harm and possible suicide threats.

During the investigation, a fellow moderator told the investigator, “If you start censoring too much then people lose interest in the platform … It’s all about making money at the end of the day.”

In an interview with Dispatches, Richard Allan, Facebook’s vice president of public policy, said some of the content mentioned in the documentary certainly should have been taken down. However, he denied that Facebook allows that kind of content to keep circulating as a way to hold people on the site and in front of ads.

In a lengthy reply to the criticisms, the company said that it was reviewing some of its moderation practices and retraining staff. Facebook admitted that “mistakes were clearly made,” but said some content remains on the platform to raise awareness of issues such as bullying, or to help children who may be in some kind of trouble.

Nonetheless, Dispatches spoke with early Facebook investor Roger McNamee, who called some darker content the “crack cocaine of their product.”

“It’s the really extreme, really dangerous form of content that attracts the most highly engaged people on the platform,” McNamee said. “Facebook understood that it was desirable to have people spend more time on the site if you’re going to have an advertising-based business.”

Meanwhile, moderators in Dublin complained that they could not keep up with the sheer number of reports, which amount to about 7,000 a day. Facebook has admitted there is a problem with backlogs and said it is in the process of increasing its moderation staff from 10,000 people to 20,000.

Image: back catalog via Flickr
