Algorithms and AI have turned Gaza into a laboratory of death

February 25, 2026
Sayid Marcos Tenorio

The revelations by +972 Magazine and Local Call have exposed the darkest core of the contemporary war in Gaza, in which genocide is carried out not only by bombs and missiles, but by data, algorithms and global digital platforms.

The Israeli artificial intelligence system known as Lavender has confirmed what the Palestinian resistance, Lebanon, and Iran have denounced for years: technology is an organic part of the Zionist war machine, functioning as an instrument of surveillance, target selection, and mass extermination.

The liberal rhetoric of “digital privacy” collapses in the face of the facts. Applications such as WhatsApp insist on the promise of end-to-end encryption, but conceal what is essential: metadata are worth more than messages.

“Location, contact networks, patterns of communication, and group affiliations make it possible to map the social life of an entire people. In Gaza, these data have been incorporated into military systems that turn human relationships into algorithmic criteria for death.”

Lavender assessed virtually the entire population of the Gaza Strip, comprising more than 2.3 million people, assigning automated “risk scores”. Merely being in a WhatsApp group, maintaining frequent contact with someone already marked, or displaying digital patterns considered “suspicious” was enough to be placed on execution lists.

Human supervision was deliberately minimal, reduced to seconds, with conscious acceptance of high error rates. Entire families were killed in their homes, treated as “acceptable collateral damage” in an algorithmic equation that normalises massacre.

This is not a technical deviation. It is a policy of extermination. International Humanitarian Law explicitly prohibits indiscriminate attacks and requires distinction between civilians and combatants.

Systems that automate lethal decisions, pre-accepting the death of innocents, constitute crimes against humanity and reinforce the characterisation of genocide as a technologically organised and rationalised process.

The machinery that sustains this model is global. Twenty-first century espionage no longer depends on intercepting messages, but on controlling digital ecosystems.

Private platforms function as permanent sensors of planetary social life, feeding databases accessible to intelligence services such as the Mossad and the CIA, through formal cooperation, legal pressure or the exploitation of vulnerabilities. This represents a structural convergence between big tech companies, the military-industrial complex and the imperial security apparatus.

Palestine is the laboratory. In an official statement released during the war, Hamas stated on its Telegram channel that “the occupier has turned every modern tool into a weapon against the Palestinian people, using technology to justify the killing of civilians and to conceal genocide behind technical terms” (free translation).

The denunciation is clear: Israel is not waging a war against combatants, but against Palestinian existence itself, now mediated by algorithms.

Lebanese Hezbollah has warned that this model forms part of a regional hybrid war, combining digital surveillance, technological sabotage, and selective attacks.

After the attack that occurred in Lebanon in 2024, involving the coordinated explosion of pagers used by its members, Hezbollah declared through institutional channels that “the enemy has turned civilian devices into tools of assassination, proving that its war knows no ethical or human limits” (free translation). The episode revealed a new level in the weaponisation of everyday technology.

This pattern is not isolated. International investigations have already demonstrated the recurring use of military spyware against journalists, activists, and political leaders in various countries, often through smartphones widely available on the global market.

The message is unequivocal: every connected device is a potential instrument of surveillance, control, or death when inserted into the logic of imperial power.

Leaders of the Islamic Republic of Iran have been particularly outspoken. The Iranian Supreme Leader, Ayatollah Ali Khamenei, has stated in various speeches that “the Zionist regime is a cancerous tumour that uses the most modern tools to oppress and massacre peoples”.

Iranian authorities maintain that Gaza foreshadows the future of imperial domination, in a world governed by algorithmic surveillance, selective assassinations, and “clean” wars only in rhetoric.

The Lavender case thus exposes the consolidation of a digital necropolitics. Algorithms decide who lives and who dies; corporations provide the infrastructure; intelligence services operate in the shadows; and technocratic language seeks to normalise the unacceptable. Gaza bleeds so that this model may be tested, refined, and then exported.

Denouncing this machinery is a historic task. It is not merely a matter of solidarity with the Palestinian people, although that solidarity is urgent and non-negotiable.

It is about resisting a world in which data are worth more than lives, in which technology serves colonialism, and in which genocide is presented as an “algorithmic decision”. Today it is Gaza. Tomorrow, any people who dare to resist.

Source: https://www.middleeastmonitor.com/20260217-algorithms-and-ai-have-turned-gaza-into-a-laboratory-of-death/
