Former Meta Engineer Sues Company Over Alleged Bias Against Palestinian Content

A former engineer at Meta Platforms, Inc. has filed a lawsuit against the social media giant, accusing it of bias in its handling of content related to the Gaza conflict and alleging wrongful termination. Ferras Hamad, a Palestinian-American engineer who worked on Meta’s machine learning team since 2021, filed the lawsuit in a California state court on Tuesday, claiming that Meta fired him for attempting to fix bugs that were causing the suppression of Palestinian Instagram posts.

Hamad’s complaint includes accusations of discrimination and wrongful termination. He claims that Meta exhibited a pattern of bias against Palestinians, which included deleting internal communications where employees mentioned the deaths of their relatives in Gaza and investigating the use of the Palestinian flag emoji by employees. In contrast, Hamad asserts that no similar investigations were launched for employees using Israeli or Ukrainian flag emojis.

Meta has yet to respond to requests for comment on these allegations.

Hamad’s lawsuit echoes long-standing criticisms from human rights groups about Meta’s content moderation practices concerning Israel and the Palestinian territories. These criticisms were highlighted in an external investigation commissioned by Meta in 2021, which pointed to inconsistencies in how content related to these regions was managed on the platform.

The recent conflict in Gaza began on October 7, when Hamas militants attacked inside Israel, resulting in the deaths of 1,200 people and the taking of over 250 hostages, according to Israeli reports. In retaliation, Israel launched a significant offensive in Gaza, which Gaza health officials report has resulted in more than 36,000 deaths and a severe humanitarian crisis.

Amid this backdrop, Meta has faced accusations of suppressing pro-Palestinian expressions on its platforms. Earlier this year, nearly 200 Meta employees expressed their concerns in an open letter to CEO Mark Zuckerberg and other company leaders, highlighting the perceived bias.

Hamad’s firing, according to his lawsuit, stemmed from a December incident involving an emergency procedure known as a SEV, or “site event,” used to address severe problems with the company’s platforms. Hamad identified procedural irregularities in the handling of a SEV related to content restrictions on Palestinian Instagram accounts, restrictions that were preventing their posts from appearing in searches and feeds. One incident involved a video posted by Palestinian photojournalist Motaz Azaiza that was misclassified as pornographic despite depicting a destroyed building in Gaza.

Hamad claims that he received mixed messages from other employees about whether he was authorized to address the SEV. Although he had previously worked on similarly sensitive issues related to Israel, Gaza, and Ukraine, and his manager eventually confirmed in writing that handling the SEV fell within his job scope, the situation changed in January when a Meta representative informed Hamad that he was under investigation. Hamad subsequently filed an internal discrimination complaint and was fired just days later.

Meta allegedly told Hamad that his termination resulted from a violation of a policy that prohibits employees from working on accounts of people they know personally. The company claimed that Hamad had a personal connection to Azaiza, the photojournalist, a claim Hamad denies.

This lawsuit not only brings to light the internal challenges and potential biases within Meta’s content moderation practices but also underscores the broader issues faced by tech companies in moderating content impartially amidst geopolitical conflicts. The outcome of Hamad’s case could have significant implications for Meta and other social media platforms, which are increasingly scrutinized for their role in managing and moderating user content globally.

As the lawsuit progresses, it remains to be seen how Meta will address these allegations and whether this case will prompt changes in its content moderation policies and practices, particularly in conflict zones like Gaza.