UPDATED 21:49 EST / SEPTEMBER 17 2019

POLICY

Facebook cracks down on extremism and terrorist organizations

Facebook Inc. has made a number of changes to combat extremists and terrorism on its platform, the company announced Tuesday.

In a blog post today, Facebook said it has rolled out a number of updates throughout the year that haven’t been widely discussed, with further updates implemented this week. The changes fall under the company’s “Dangerous Individuals and Organizations” policy.

The move comes a day before Facebook, along with Google LLC and Twitter Inc., appears before the Senate Commerce Committee to discuss how the companies prevent violent content from appearing on their platforms.

The first change is a new definition of terrorist organizations because, as Facebook says, there is “no globally recognized and accepted definition of terrorist organizations.” The new definition focuses on behavior rather than ideology, the company said.

“But while our previous definition focused on acts of violence intended to achieve a political or ideological aim, our new definition more clearly delineates that attempts at violence, particularly when directed toward civilians with the intent to coerce and intimidate, also qualify,” said Facebook.

Detection techniques have also been updated. Until now, Facebook has been using machine learning technology to identify language that might be associated with terrorism. Over the last two years, the company said, its automated system has led to the removal of 26 million pieces of content related to global terrorist groups such as ISIS and al-Qaeda.

But that has now been expanded to identify language that could be related to other terrorist organizations and hate groups. Using human review and its artificial intelligence, Facebook said, more than 200 white supremacist groups have been culled from the platform.

There are still problems, though, in detecting content that needs to be removed quickly. Facebook admitted that the video of the attack in Christchurch that appeared on the platform wasn’t taken down fast enough — something the company was lambasted for.

Right now, Facebook is working with U.S. and U.K. law enforcement to obtain camera footage from their firearms training programs. The company hopes to get to a point where real violent content relating to extremism can be differentiated from fictional violent content.

The company also said it has expanded the team that cracks down on extremist content, which now includes 350 people with experience in law enforcement, counterterrorism and national security.

“Previously, the team was solely focused on counterterrorism — identifying a wide range of organizations including white supremacists, separatists and Islamist extremist jihadists as terrorists,” said the company. “Now, the team leads our efforts against all people and organizations that proclaim or are engaged in violence leading to real-world harm.”

Image: Pixabay
