Microsoft Claims It Didn’t Aid Israel in the Genocide. To What Extent Is It True?

On May 15, Microsoft confirmed that it provided AI tools and cloud services to Israel’s Ministry of Defense (IMOD), declaring its support for Israel amid claims that its technology was used in attacks during the war on Palestine and Lebanon.

In a detailed blog post, Microsoft acknowledged supplying IMOD with software, specialist services, Azure cloud infrastructure, and Azure AI tools, including translation services. The company emphasized, however, that internal audits and an independent review found “no evidence” that its technologies were used to cause harm.

The company explained that decisions about how its technologies are used are outside of its control, adding that “Microsoft has no visibility as to how customers use our software on their own servers or other hardware,” or, for that matter, how the government cloud supporting IMOD is operated by another provider.

According to an investigation by The Associated Press, the Israeli military’s use of Microsoft’s Azure cloud service increased nearly 200-fold after October 7, the attack that sparked the ongoing war on Gaza and Lebanon.

These comments came after The Associated Press launched an investigation into accusations that the Israeli army used Microsoft and OpenAI software to select targets during its war on Gaza and Lebanon, which began in 2023.

Does Microsoft Support Palestine or Israel?

While it denied weaponizing its AI capabilities for Israel, Microsoft admitted to offering “limited emergency support” after October 7, specifically to help identify Israeli hostages, although reality did not bear that out.

“We believe the company acted in line with its values… to save lives in a manner that respects the privacy and rights of civilians in Gaza,” the Azure-parent said.

Be that as it may, the tech giant itself presented another angle to the situation: its own workforce remains divided, to the point that Microsoft has terminated four of its employees for being pro-Palestine. The first two, Abdo Mohamed and Hossam Nasr, were dismissed in October 2024. Internal protests escalated in April 2025, when two more staff members, Ibtihal Aboussad and Vaniya Agrawal, were fired for disrupting Microsoft’s 50th anniversary celebration and accusing the company of abetting “AI-fueled warfare.”

“We’ve heard concerns from our employees and the public about media reports regarding Microsoft Azure in Israel and AI technologies being used by the Israeli military to target civilians or cause harm in Gaza,” the company said, adding that they “take these concerns seriously.”

Microsoft clarified that its relationship with IMOD is “a standard commercial agreement” and that its AI Code of Conduct applies to all customers, noting that defense agencies typically use their own specialized software for surveillance and targeting, and that Microsoft has not supplied such tools.

With over 3,000 Israeli workers engaged in projects on AI, cybersecurity, and healthcare technology, Microsoft is still highly committed to the region’s technology sector.

Despite the backlash the company has received, Microsoft stood its ground on its support for Israel, saying, “We share the profound concern over the loss of civilian life in both Israel and Gaza and have supported humanitarian assistance in both places.”

Microsoft, Israel, Palestine

The controversy reflects a global debate over how to lead and control powerful technology, especially in conflict zones, and has fueled calls to boycott Microsoft over its ties to Israel. As Microsoft and other tech firms market their services to governments and militaries, the line between innovation and accountability remains undefined.

Israel’s deployment of AI programs like “Lavender” and “Where’s Daddy?” in Gaza is a case in point. Lavender reportedly identified up to 37,000 targets for bombing, in numerous instances with minimal human supervision and with significant civilian casualties. “Where’s Daddy?” was allegedly used to strike targets when they were at home, placing their families at higher risk.

In addition, Israel’s Unit 8200 developed a ChatGPT-like AI trained on intercepted Palestinian communications to speed up arrests and surveillance, raising moral concerns about privacy and the militarization of AI. Let’s not forget the time Microsoft suspended Palestinian users’ accounts for calling their loved ones on Skype. And Microsoft still has the audacity to try to convince people that it did not aid in the genocide?

Adding to these worries, an Israeli firm, Candiru, dealt in hacking tools capable of breaching Microsoft Windows, exposing vulnerabilities in widely used software. These events point to the need for immediate regulation of AI in military contexts to prevent abuse and protect civilian lives, as well as scrutiny of the support for Israel that pushed Microsoft to publish a blog post defending itself against complicity in genocide.

