Ahead of elections, Facebook opens up a ‘war room’ to fight fake news
Facebook Inc. gave reporters a tour of its “war room” Wednesday, explaining its latest efforts to tackle election interference.
“As our teams have gotten smarter, so have the adversaries seeking to misuse our services,” Samidh Chakrabarti, Facebook’s director of product management and civic engagement, wrote in a post Thursday. With the Brazil and U.S. elections drawing near, Facebook decided it was time to dedicate one room to address misuse of the platform.
More than two dozen employees will be working in this room (pictured), supported by an additional 20,000 employees across the company working on safety and security. On any given day, Facebook said, the war room might be home to staff working on legal issues, threat intelligence, data science, software engineering, research and community operations.
“When everyone is in the same place, the teams can make decisions more quickly, reacting immediately to any threats identified by our systems, which can reduce the spread of potentially harmful content,” said Chakrabarti.
These staffers will constantly monitor election-related issues such as surges of spam, policy violations, content from foreign bad actors and attempts to prevent people from voting.
Facebook will also be keeping an eye on election news coverage in traditional media in an effort to understand what’s actually happening in politics. For instance, Facebook said it had caught a post beginning to go viral that declared Brazil’s Election Day had been moved from Oct. 7 to Oct. 8 because of national protests. That was in fact untrue.
In another example, Facebook said it found a spike in hate speech in Brazil following the first rounds of election results. That speech, said Facebook, was designed to foment violence against people from Northeast Brazil. Thanks to the war room, that was nipped in the bud.
“Our machine learning and artificial intelligence technology is now able to block or disable fake accounts more effectively – the root cause of so many issues,” said Chakrabarti. “We’ve increased transparency and accountability in our advertising. And we continue to make progress in fighting false news and misinformation.”
Chakrabarti also admitted that adversaries are continually improving in what he called an “arms race” of information. That much seems true, according to a New York Times report Wednesday, which said Brazilian voters using Facebook-owned WhatsApp were still being bombarded by fake and misleading political content.
In its effort to curb the spread of misinformation in the U.S., Facebook has also been somewhat trigger-happy. As USA Today reported Wednesday, many benign ads were taken off Facebook just because they contained the words “African-American,” “Latino,” “Hispanic,” “Mexican,” “women” and “LGBT.”
It’s also likely, even Facebook conceded, that as with past and current content moderation efforts, the company won’t get this effort right all the time either. Rob Leathern, Facebook’s director of product management, said “there are going to be mistakes” in an operation of such massive scale.
And this being Facebook, some observers were skeptical that much here was new, while others called it little more than a charm initiative.
Photo: Facebook