UPDATED 23:16 EDT / APRIL 10 2019


Facebook announces sweeping changes to tackle misinformation

In an almost 2,000-word post published Tuesday, Facebook Inc. revealed a slew of changes that it hopes will reduce the proliferation of dubious news on the platform.

The social media company is making the changes under a strategy of “remove, reduce, and inform.” It said “problematic content” is a year-round concern, not just one that arises when elections are around the corner.

“This involves removing content that violates our policies, reducing the spread of problematic content that does not violate our policies and informing people with additional information so they can choose what to click, read or share,” wrote Guy Rosen, Facebook’s vice president of integrity, and Tessa Lyons, head of news feed integrity.

The strategy is one Facebook has been pushing for a while; the latest changes refine how the company puts it into practice.

On the “remove” front, Facebook said it’s embracing transparency by updating people on what content gets taken down via its Community Standards page. A new “Group Quality” feature is also intended to show users how to create content that won’t get flagged.

To reduce the spread of specious content, Facebook is calling in more experts and third-party fact-checkers, as well as pruning the reach of groups that have been accused of sharing such content. People are likely to see fewer of those posts anyway, because Facebook is adding a “Click-Gap” signal to its news feed algorithm that will limit how far they travel.
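The post doesn’t spell out how Click-Gap works, but the signal has been described as comparing how many clicks a domain gets from Facebook with how much attention it draws on the open web, with a large gap treated as a quality red flag. The sketch below is a minimal, purely illustrative Python heuristic along those lines; the function names, threshold and demotion logic are assumptions for illustration, not Facebook’s implementation.

# Illustrative sketch of a "click gap" style heuristic. The names, threshold
# and demotion logic are hypothetical, not Facebook's actual system.

def click_gap_score(platform_clicks: int, web_inbound_links: int) -> float:
    """Ratio of platform-driven clicks to inbound links on the open web."""
    # Add 1 to avoid division by zero for domains with no inbound links.
    return platform_clicks / (web_inbound_links + 1)

def demotion_factor(score: float, threshold: float = 100.0) -> float:
    """Map a click-gap score to a hypothetical ranking multiplier in (0, 1]."""
    if score <= threshold:
        return 1.0  # modest gap: no demotion
    return threshold / score  # the bigger the gap, the stronger the demotion

# Example: a domain with 50,000 clicks from the platform but only 40 inbound
# links elsewhere gets a small multiplier, limiting how far its posts travel.
score = click_gap_score(platform_clicks=50_000, web_inbound_links=40)
print(f"score={score:.1f}, multiplier={demotion_factor(score):.3f}")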

As for informing, Facebook said content in English and Spanish that appears in the news feed will come with “trust indicators.” More information will be added to the “Quality” tab, and Facebook’s “Verified Badge” will now appear on Messenger as well as on the main platform. Also in Messenger, people will be able to block certain content or hit a “Context Button” to get more information about a post.

“Misinformation is a complex and evolving problem, and we have much more work to do,” Henry Silverman, a Facebook operations specialist, wrote in a separate post. “With more than a billion things posted to Facebook each day, we need to find additional ways to expand our capacity.” He went on to say that with that much content, there simply aren’t enough fact-checkers in the world, and that fact-checking itself can be difficult because the truth is sometimes not clear-cut.

“Over the next few months, we’re going to build on the explorations we’ve started around this idea, consulting a wide range of academics, fact-checking experts, journalists, survey researchers and civil society organizations to understand the benefits and risks of ideas like this,” said Silverman.

Image: Facebook
