UPDATED 22:53 EDT / SEPTEMBER 22 2022


Meta showed bias when moderating Israel-Palestine conflict, says new report

An independent report commissioned by Meta Platforms Inc. has found that the company’s content moderation was biased against Palestinians during the 2021 Israel-Palestine crisis.

The report, released today, said that when the social media giant moderated its platforms during the conflict between Israel and the militant Palestinian group Hamas, it unfairly removed content posted by Palestinians, a breach of their right to free expression. The report also said that Meta, then known as Facebook, enforced its moderation policies more harshly against Arabic speakers than against Hebrew speakers.

The report, conducted by the consulting firm Business for Social Responsibility, confirmed what many people have long suspected: that Meta’s moderation in this conflict-ridden part of the world has not been even-handed. The company over-enforced its rules when it came to Palestinians and under-enforced them when it came to Israelis, or at least when it came to content in Arabic versus content in Hebrew.

“The BSR report confirms Meta’s censorship has violated the #Palestinian right to freedom of expression among other human rights through its greater over-enforcement of Arabic content compared to Hebrew, which was largely under-moderated,” tweeted the Arab Center for the Advancement of Social Media.

More than 260 Palestinians were killed during the two-week conflict, including 66 children, and more than 1,900 Palestinians were injured. At least 13 Israelis were killed, and 200 Israelis were injured. As shells rained down and buildings were damaged or destroyed, people took to social media to show what was happening, and many thousands were displaced in the chaos.

Meta was at first applauded for leaving such content up, but it was later criticized for removing posts that showed the devastation on the Palestinian side. The content was genuine, and Meta had to explain the removals. The company blamed its algorithms but also said some takedowns were the result of “human error.” BSR reported that Meta’s contractors had wrongly labeled such content as relating to terrorism.

But that wasn’t all. There were many instances in which Meta appeared to favor content written in Hebrew, and again the algorithms were blamed. The report found no evidence that anyone at Meta acted with animus toward people on the basis of race, ethnicity, language or religion, noting that the company has “employees representing different viewpoints, nationalities, races, ethnicities, and religions relevant to this conflict.”

The report highlighted areas where Meta has shown “good practice” but recommended that the company change some of its policies where conflicts are concerned. Meta said it has accepted the recommendations and will act on them.

“BSR’s report is a critically important step forward for us and our work on human rights,” said the company. “Global events are dynamic, and so the ways in which we address safety, security, and freedom of expression need to be dynamic too. Human rights assessments like these are an important way we can continue to improve our products, policies and processes.”

Photo: Dima Solomin/Unsplash
