STOP Supporting Genocide

🟢 Smarter AI 🟢

🔥 NO Tech for Oppression.

No To War/Genocide

🔥 Realize. It's VERY Serious!

Our Human Rights Statement

Many major tech companies, including Microsoft and Google (Alphabet), continue to face significant scrutiny and accusations from human rights organizations, legal experts, and UN officials for their involvement in providing technology and services to the Israeli government and military during the ongoing conflict in Gaza.

These critics argue that the companies' technology is crucial in enabling Israel's military operations and its system of control over Palestinians, which some characterize as apartheid and genocide.

🔥 Major Platforms. Involved.

Actions have serious consequences.

Google (Alphabet) and Amazon (AWS) are involved in Project Nimbus, a contract to provide cloud computing and AI services to the Israeli government and military. Critics allege that Project Nimbus supports Israeli military AI programs used for targeting individuals in Gaza. Google has faced internal dissent and protests, and reportedly removed a public pledge against using its AI for weapons or surveillance. Both companies have largely declined to address the human rights concerns raised.

Meta Platforms (parent company of Facebook) has been accused by the UN, human rights organizations, and survivors of playing a "significant," even "determining," role in enabling genocide and ethnic violence against the Rohingya Muslim minority in Myanmar and against Tigrayans in Ethiopia, and it has faced legal action over its perceived complicity.

Role in the Rohingya Genocide

Multiple reports and investigations have established Meta's contribution to the atrocities committed by the Myanmar military (Tatmadaw) against the Rohingya, which the UN has classified as genocide:

  • Amplification of Hate Speech: Meta's algorithms amplified content that incited hatred and violence, even when it violated the company's own standards.

  • Inadequate Moderation: The company had insufficient Burmese-language content moderators to handle the influx of hate speech and misinformation spread by the military and ultra-nationalist Buddhist monks.

  • Willful Blindness/Negligence: Investigations by groups like Amnesty International concluded that Meta was aware of the problem for years before the 2017 crackdown but took "wholly inadequate" action.

  • Legal Action: Rohingya refugees filed a class-action lawsuit in the U.S. and the U.K., seeking over $150 billion in damages, arguing that Facebook's negligence was a "substantial cause" of the violence.

  • Company Admission: Meta has acknowledged it was "too slow to act" and did not do enough to prevent the platform from being used to "foment division and incite offline violence" in Myanmar.

Role in the Tigray Conflict and Gaza

Similar concerns have been raised regarding Meta's operations in other conflict zones:

  • Ethiopia: Meta failed to curb the spread of content advocating hatred and violence against Tigrayans, which contributed to severe offline violence.

  • Gaza: A September 2025 report by 7amleh, a digital rights group, accuses Meta of "complicity in enabling... genocidal incitement in Hebrew" during the war in Gaza while simultaneously restricting Palestinian content.

While Meta itself did not directly orchestrate the violence, expert consensus points to its business model, lack of accountability, and failure to moderate harmful content as significant factors in enabling state-sponsored persecution and genocide.

Microsoft provides Azure cloud and AI services to the Israeli military. Investigations have suggested the Israeli army used Microsoft's cloud infrastructure for mass surveillance, and an independent UN report identified Microsoft as potentially enabling Israel's actions. Microsoft denies its technology is used to target people and states that internal investigations found no such evidence. In September 2025, Microsoft confirmed it had stopped certain services to a unit within the Israeli Ministry of Defense involved in surveillance. The company has also faced internal dissent.

Legal and Ethical Context

🔥 Action | BDS

A UN report listed these companies as potentially complicit in the "genocide" in Gaza and called for accountability. The BDS Movement has targeted both Microsoft and Google/Amazon for boycotts.

♻️ Compliant. Tools.

🔥 NO to War 🔥 NO to Genocide 🔥 NO 3rd-Parties <Who Support Genocide>

⚡ Technical. Infrastructure ⚡

We build and operate AI systems for lawful, civilian, and ethical use. Our platform is designed with safeguards to prevent harm, including restrictions against military, surveillance, targeting, or repression-related applications. We do not support or participate in violence against civilians, collective punishment, or violations of international human rights and humanitarian law. Responsibility in technology lies not in the existence of tools, but in how they are governed and used; accordingly, we apply human-in-the-loop controls, abuse monitoring, and contractual prohibitions to ensure our systems are not used to cause harm. We remain committed to transparency, accountability, and the continuous evaluation of risk as part of responsible AI development.
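As an illustration of how such safeguards can sit in a request path, the sketch below shows a hypothetical policy gate for an LLM gateway: it screens each request against a prohibited-use list, records it for abuse monitoring, and routes flagged requests to human review instead of serving them. All names here (PolicyGate, PROHIBITED_USES, the category labels) are illustrative assumptions, not a description of our production system.

```python
# Minimal sketch of a policy gate for an LLM gateway (illustrative only).
# Names and categories are hypothetical; a production system would rely on
# trained classifiers, durable audit storage, and a real human-review queue.

from dataclasses import dataclass, field
from datetime import datetime, timezone

# Assumed prohibited-use categories, mirroring the statement above.
PROHIBITED_USES = {"military", "surveillance", "targeting", "repression"}


@dataclass
class Decision:
    allowed: bool
    reason: str
    needs_human_review: bool = False


@dataclass
class PolicyGate:
    audit_log: list = field(default_factory=list)

    def screen(self, user_id: str, prompt: str) -> Decision:
        """Screen a request before it reaches any model."""
        lowered = prompt.lower()
        hits = sorted(c for c in PROHIBITED_USES if c in lowered)

        # Abuse monitoring: every request is recorded for later review.
        self.audit_log.append({
            "time": datetime.now(timezone.utc).isoformat(),
            "user": user_id,
            "categories": hits,
        })

        if hits:
            # Human-in-the-loop: flagged requests are held, never silently served.
            return Decision(False, f"flagged categories: {', '.join(hits)}",
                            needs_human_review=True)
        return Decision(True, "no prohibited-use category detected")


if __name__ == "__main__":
    gate = PolicyGate()
    print(gate.screen("demo-user", "Summarize this civilian research paper."))
    print(gate.screen("demo-user", "Plan surveillance of a population."))
```

In practice, keyword matching like this is only a first filter; the point of the sketch is the control flow, in which monitoring and human review sit in front of the model rather than after the fact.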

Ask AI: Question · Compliant Subprocessors · Compliant LLM Gateway
