UPDATED 02:24 EDT / OCTOBER 09 2017


Facebook’s chief security officer takes to Twitter to defend company over ‘the algorithm’

Facebook Inc. Chief Security Officer Alex Stamos unleashed a series of tweets Saturday defending his employer against allegations that Facebook hasn’t done enough to combat the spread of fake news.

“I am seeing a ton of coverage of our recent issues driven by stereotypes of our employees and attacks against fantasy, strawman tech cos,” wrote Stamos (pictured). The issue is Facebook’s algorithm, and the criticism is that it is flawed.

Stamos’ quite candid outburst seemed to have been spurred by tweets from Washington Post contributor and Lawfare associate editor Quinta Jurecic condemning Facebook’s algorithm, which she described as being “designed poorly and irresponsibly and which could have been designed better.”

Stamos’ lengthy riposte began: “I appreciate Quinta’s work (especially on Rational Security) but this thread demonstrates a real gap between academics/journalists and SV [Silicon Valley].” He went on to say that the issue is not that simple and that people underestimate the complexity of filtering a platform on which billions of posts appear.

“Nobody is not aware of the risks,” wrote Stamos, who added that there are also risks involved in training an algorithm to spot the “truth.”

“Lots of journalists have celebrated academics who have made wild claims of how easy it is to spot fake news and propaganda. Without considering the downside of training ML [machine learning] systems to classify something as fake based upon ideologically biased training data,” Stamos wrote. The downside, he wrote, could be something akin to the Orwellian “Ministry of Truth.”
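The concern Stamos describes is a standard failure mode in supervised machine learning: a classifier learns whatever pattern its training labels encode. The toy sketch below, which assumes a hypothetical scikit-learn pipeline and invented example posts rather than anything Facebook actually runs, illustrates how labels that track ideology rather than accuracy produce a model that flags ideology rather than accuracy.

```python
# Toy illustration only (not Facebook's system): a text classifier trained on
# labels that encode an annotator's slant will reproduce that slant.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression

# Hypothetical training data: the "fake" labels happen to correlate with one
# political vocabulary rather than with factual accuracy.
posts = [
    "tax cuts will pay for themselves, experts say",
    "new study shows vaccine is safe and effective",
    "immigration is destroying the economy, sources claim",
    "city council approves new public transit funding",
]
labels = [1, 0, 1, 0]  # 1 = flagged as "fake" by biased annotators

vectorizer = TfidfVectorizer()
X = vectorizer.fit_transform(posts)
clf = LogisticRegression().fit(X, labels)

# A factually unremarkable post that merely shares vocabulary with the
# "fake"-labeled examples is likely to be flagged as well.
test = vectorizer.transform(["senator proposes tax cuts for small businesses"])
print(clf.predict(test))  # likely [1]: the model has learned the labelers' bias
```

In other words, the model becomes an arbiter of which vocabulary its trainers distrusted, which is the “Ministry of Truth” risk Stamos alludes to.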

Stamos went on to say, in a roundabout way, that critics should be careful what they wish for. To endorse free speech but ask for more moderation, he said, is a contradiction. He ended the stream of tweets with what seemed like a dash of levity. “A lot of people aren’t thinking hard about the world they are asking SV to build,” he wrote. “When the gods wish to punish us they answer our prayers. Anyway, just a Saturday morning thought on how we can better discuss this. Off to Home Depot.”

If there has been any criticism of Stamos’ remarks, it is that a possible solution is simply more human eyes on content and less reliance on a technological moderator. Facebook has already said it would add thousands of moderators after Russian-backed misinformation ads appeared on the platform.

Image: Web Summit via Flickr
