UPDATED 20:21 EDT / NOVEMBER 13 2019

APPS

Instagram joins Facebook’s latest Community Standards Enforcement Report

Facebook Inc. released its Community Standards Enforcement Report Wednesday, revealing that the company is improving at proactively removing disturbing content from both Facebook and Instagram.

The removed content spanned a wide variety of categories, including child nudity, child sexual exploitation and terrorist propaganda, as well as illicit firearm and drug sales and posts concerning suicide and self-injury. Misinformation circulating on Instagram wasn't part of the report, even though it has been a persistent issue.

On Facebook alone, 11.6 million pieces of content relating to child nudity and child sexual exploitation were taken down. On Instagram 754,000 pieces of similarly transgressive content were taken down.

In a post, Guy Rosen, Facebook’s vice president of integrity, said spotting and removing this kind of content on both apps has vastly improved. On Facebook 99% of the removed content was “proactively detected,” meaning it was found before anyone reported it; the rest was flagged by users. On Instagram 94.6% of the removed content was proactively detected.

Some 2.5 million posts concerning suicide and self-harm were taken down from Facebook, while 845,000 pieces of similar content were removed from Instagram. Rosen said this was the first time that content has been included in the report.

“We remove content that depicts or encourages suicide or self-injury, including certain graphic imagery and real-time depictions that experts tell us might lead others to engage in similar behavior,” said Rosen. “We place a sensitivity screen over content that doesn’t violate our policies but that may be upsetting to some, including things like healed cuts or other nongraphic self-injury imagery in a context of recovery.”

When it comes to drug sales, a massive 4.4 million pieces of content were purged from Facebook, and 1.5 million posts relating to drug sales were taken down from Instagram. About 2.3 million pieces of firearm-sales content were removed from Facebook and 58,600 such posts from Instagram. Rosen added that 133,300 posts containing terrorist propaganda were removed from Instagram as well.

Rosen didn’t say how many posts were removed for hate speech, but he said that the algorithm was getting better at detecting it.

“Our automated systems have been trained on hundreds of thousands, if not millions, of different examples of violating content and common attacks,” he said, adding that the detection systems’ proactive rate for hate speech has climbed to 80%, up from 68% when Facebook issued its last report.

Photo: Marco Verch/Flickr
