Meta Facilitates the Death of Palestinians Using Lavender

In Palestine, more precisely the West Bank and Gaza Strip, three indigenous species of Lavandula, the plant we know as lavender, grow. Just like anything else associated with Palestine, the Israelis stripped the name of its beauty when they called their system Lavender AI.

Survival of the Fittest WhatsApp Groups

Meta is once again under the spotlight for sharing data from various WhatsApp groups in Palestine to train Israeli military AI.

WhatsApp, Meta’s renowned messaging platform, is used by tens of millions of people daily.

In Palestine, specifically in Gaza, many residents rely on WhatsApp, but unlike us, their purpose is not to share jokes or make plans. They create various groups to share information and plan their next moves for survival amid Israeli military attacks, which have led to the loss of over 34,000 lives, mostly civilians, along with foreign aid workers and journalists.

Palestinians have created various WhatsApp groups focused on sourcing meals under the blockade, which has left 2.3 million people facing famine. Paramedics and doctors, meanwhile, have created WhatsApp groups to connect and to obtain essential supplies like oxygen cylinders, potentially making them targets for Israeli airstrikes.

Lavender in Palestine

Lavender, an advanced AI system Israel uses to pinpoint targets in Gaza, likely underwent training on information extracted from such WhatsApp groups, according to Paul Biggar, a software engineer and the founder of ‘Tech for Palestine.’

Lavender is programmed to target members of Hamas, but civilians end up targeted simply for being in the same WhatsApp group.

“It is also important to note that the Lavender system is a misapplication of AI,” he told TRT World.

Although Lavender can indicate that individuals share common WhatsApp groups, that does not imply their affiliation with Hamas or participation in any violent activity.

“It is a ‘pre-crime’ system and should never be used without (a) thorough investigation of all suggested targets,” Biggar stated.

He elaborates that the targets are “rubber stamped,” directly implicating Israel in the targeting of civilians.

“Each target named is a de facto civilian until proven otherwise – Lavender does not prove anything. This is just an attempt at ‘ethics washing’ – using an AI system to accomplish a desired but immoral or illegal goal, and then blaming the AI,” Biggar says.

This month, +972 Magazine revealed Israel’s utilization of Lavender to target tens of thousands of Palestinians, systematically singling them out, with “limited human oversight and a tolerant stance on casualties.”

Meta, What Are You Doing?

It all started when Meta restricted discussions and posts on Instagram about the ongoing war in Palestine after October 7th. Now, aided by WhatsApp, Lavender AI can access information on Palestinian civilians and track them, potentially leading to their deaths. If this doesn’t reveal Meta’s stance, what does?

The main question is, should we boycott Meta, just as we have done with other entities?


Inside Telecom provides you with an extensive list of content covering all aspects of the tech industry. Keep an eye on our Tech sections to stay informed and up-to-date with our daily articles.