UPDATED 22:43 EST / MARCH 21 2019

Following New Zealand terror attack, Facebook admits it has a problem

Facebook Inc. said it’s reviewing its livestreaming policy after the terrorist attack at a mosque in Christchurch, New Zealand, that left 50 people dead was streamed on Facebook itself.

The social media giant acknowledged that the attacker’s livestream was not taken down fast enough and that its artificial intelligence systems initially failed to flag the video. The company said its systems are designed to flag videos containing suicidal or harmful acts, but conceded that more needs to be done.

“AI has made massive progress over the years and in many areas, which has enabled us to proactively detect the vast majority of the content we remove,” Guy Rosen, Facebook’s vice president of product management, said Wednesday in a blog post. He added, unnecessarily: “But it’s not perfect.”

According to the company, the video was watched live for about two minutes by fewer than 200 people, and it was taken down after New Zealand police contacted Facebook. No one reported the video while viewing it.

Facebook said the video was viewed about 4,000 times in total, with the first user report coming 29 minutes after the video was posted, or 12 minutes after the livestream ended, meaning the broadcast itself ran for about 17 minutes. In the following 24 hours, Facebook said it removed 1.2 million copies of the video at upload, meaning they were never viewed, and took down a further 300,000 copies after they had been uploaded, for a total of 1.5 million removals.

Responding to those asking why such content can’t be taken down immediately, the company said it is working to improve its detection systems.

“AI systems are based on ‘training data,’ which means you need many thousands of examples of content in order to train a system that can detect certain types of text, imagery or video,” said Facebook. “This approach has worked very well for areas such as nudity, terrorist propaganda and also graphic violence where there is a large number of examples we can use to train our systems.”
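Facebook hasn’t published details of its classifiers, but the approach it describes maps onto standard supervised learning. The Python sketch below is illustrative only: the example texts, labels and threshold are hypothetical placeholders, and the scikit-learn pipeline stands in for whatever models Facebook actually runs at scale.

```python
# A minimal sketch of the "training data" approach described above:
# a supervised classifier learns from human-labeled examples, then
# scores new content. Dataset, labels and threshold are hypothetical.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# In production this would be many thousands of labeled posts.
texts = ["example of a policy-violating post ...",
         "ordinary post about the weather ..."]
labels = [1, 0]  # 1 = policy-violating, 0 = benign

model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(texts, labels)

# Score a new upload; anything above a tuned threshold gets flagged.
score = model.predict_proba(["text of a new upload"])[0][1]
if score > 0.9:  # illustrative threshold, not Facebook's
    print("flag for human review")
```

The constraint the quote points to sits on the data side: a pipeline like this is only as good as the number and variety of labeled examples behind it, which is exactly what a first-of-its-kind livestream lacks.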

The company said another challenge is training the algorithm to distinguish visually similar but innocuous content, such as footage from violent video games, from the real thing. Facebook also said that in the case of the Christchurch shooting, a “community of bad actors” worked on editing and re-uploading the video so it would not be detected by the company’s matching algorithms.
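Facebook hasn’t detailed the matching algorithms those re-edits defeated, but one common family of techniques for catching re-uploads is perceptual hashing. The sketch below uses a simple average hash over individual video frames, via the Pillow imaging library; the match threshold and function names are illustrative assumptions, not Facebook’s implementation.

```python
# A minimal sketch of perceptual (average) hashing, one common way to
# match re-uploads of a known video's frames. Illustrative only; this
# is not Facebook's actual matching pipeline.
from PIL import Image

def average_hash(frame_path: str, hash_size: int = 8) -> int:
    """Downscale a frame to hash_size x hash_size grayscale and build
    a bitmask of pixels brighter than the mean intensity."""
    img = Image.open(frame_path).convert("L").resize((hash_size, hash_size))
    pixels = list(img.getdata())
    mean = sum(pixels) / len(pixels)
    bits = 0
    for pixel in pixels:
        bits = (bits << 1) | (1 if pixel > mean else 0)
    return bits

def hamming_distance(hash_a: int, hash_b: int) -> int:
    """Count the bits on which two hashes differ."""
    return bin(hash_a ^ hash_b).count("1")

MATCH_THRESHOLD = 5  # illustrative value, not Facebook's

def is_probable_match(known_frame: str, uploaded_frame: str) -> bool:
    """Flag an uploaded frame whose hash sits close to a known one."""
    return hamming_distance(average_hash(known_frame),
                            average_hash(uploaded_frame)) <= MATCH_THRESHOLD
```

Because every bit of the hash derives from pixel intensities, a lightly re-encoded copy usually stays within a small Hamming distance of the original, while crops, borders, filters or overlays flip many bits at once. That is how a coordinated group of editors can push copies past any fixed threshold and force a platform to chase hundreds of thousands of variants.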
