Assassination Ads Approved on Facebook Targeting Palestinian Activists

Facebook has approved a series of advertisements that degrade Palestinians and call for violence against them.

The ads were submitted as a deliberate test of Facebook’s content moderation, according to evidence shared with The Intercept.

Digital rights activists ran the experiment to test Facebook’s machine-learning moderation system after an earlier assassination ad was uncovered.

Did Facebook fall for the trap? Well, clearly.

Violations were made. Policies were thrown out the window. Why, you ask?

Well, when the ads were submitted to Meta in both Arabic and Hebrew, the content included explicitly violent language: calls for a “holocaust for the Palestinians” and for violence against “Gazan women and children and the elderly,” among others.

It did not stop there: other ads described the children of Gaza as “future terrorists” and referred to Arabs as “pigs.” It’s safe to say the language used across the requested ads was dehumanizing.

The ads were approved.

According to Nadim Nashif, founder of the Palestinian social media research and advocacy group 7amleh, “the approval of these ads is just the latest in a series of Meta’s failures towards the Palestinian people.”

Nashif continued: “Throughout this crisis, we have seen a continued pattern of Meta’s clear bias and discrimination against Palestinians.”

In October, Nashif came across an ad on his Facebook feed that explicitly called for the killing of American activist Paul Larudee, a co-founder of the Free Gaza Movement. That discovery prompted 7amleh to test Facebook’s machine-learning filtering mechanism.

Facebook’s automatic translation of the written advertisement stated: “It’s time to assassinate Paul Larudi [sic], the anti-Semitic and ‘human rights’ terrorist from the United States.” After Nashif complained about the advertisement to Facebook, it was removed.

The ad had been placed by Ad Kan, a right-wing Israeli organization founded by former intelligence and Israel Defense Forces officials to oppose “anti-Israeli organizations” that, according to its website, receive funding from allegedly antisemitic sources.

Neither Larudee nor Ad Kan immediately responded to requests for comment.

Excuse me? What Advertisement Rules?

A call for anyone’s assassination is a clear violation of Facebook’s advertising rules, as it would be at any other company. In this case, it was a call to assassinate a political activist.

Facebook bent the rules for the post sponsored by Ad Kan. The ad was accepted and passed through Facebook’s content moderation filtering process, which relies on the platform’s machine-learning algorithms to authorize ads for global distribution. These technologies spare the company the labor costs of human moderators, but they also obscure the fact that moderation decisions are made by secret algorithms.

According to Facebook’s ad policy, “Our ad review system is designed to review all ads before they go live.”

Facing increased scrutiny of its human-based moderation, Meta has shifted to automated text-scanning software to enforce its speech rules and censorship policies.

While the tech giant regularly uses algorithmic censorship to remove Arabic-language posts, an external audit conducted last year on Meta’s behalf found that the company lacked an equivalent algorithm to identify “Hebrew hostile speech,” such as racist rhetoric and violent incitement.

“We’ve launched a Hebrew ‘hostile speech’ classifier to help us proactively detect more violating Hebrew content,” Meta said in a statement released after the audit. Content, in other words, like an advertisement advocating a murder.

Has Facebook’s Purpose Shifted to Violence and Provocation?

7amleh gave Facebook the benefit of the doubt until the group created 19 test ads, in both Arabic and Hebrew, written in the same violent manner, and watched them pass review. For 7amleh, the approvals are proof that Facebook is indeed taking sides.

“We knew from the example of what happened to the Rohingya in Myanmar that Meta has a track record of not doing enough to protect marginalized communities,” Nashif stated, “and that their ads manager system was particularly vulnerable.”

7amleh said that Meta’s ads system has failed miserably.

