“Write down!
I am an Arab
My identity card number is 50,000
I have eight children.
The ninth will come after a summer
Will you be angry?”
—Mahmoud Darwish, “ID Card”, Olive Leaves, 1964
On 29 October, a video of a young Palestinian in Gaza went viral on TikTok. He holds a clip-on microphone and greets his audience after a period of internet and communication disruption. A soundtrack of broadcast music has been added in the video's post-production. Behind him, people line up in what he describes as a queue for bread at one of the few bakeries still operating in the north of the Gaza Strip. In the video's feed, an infinite scroll of comments praises and blesses the young man, Abboud, who, despite his young age, has become an improvised and widely followed journalist in this war. Images and videos of queues at bakeries circulated throughout October as power cuts and fuel shortages prevented Palestinians from cooking at home. Some reached the international media, but most were shared and liked thousands of times by a global audience following Palestinians as they witnessed the collapse of the conditions of life in Gaza. On 4 November, Al Jazeera published a video of a bombing next to a bakery; the two videos can be geolocated a few metres apart from each other.
In the period between October and early November, Forensic Architecture documented the destruction of seventeen bakeries in the Gaza Strip.1 In some cases, such as the al-Sharq bakery, the surrounding area was bombed while people were queuing, damaging the building and injuring civilians; in others, such as the al-Jadeed bakery, the target was the bakery itself, killing several people and leaving a crater in place of the building. Visual evidence also shows the targeting of solar panels, such as those of Families Bakery, making it impossible to produce bread. These appear to be deliberate strikes, as houses and infrastructure in these areas were not targeted at the time, although much of this was later destroyed. Bakeries were already part of the pre-existing network of humanitarian aid distribution – the Nuseirat bakery had reportedly received flour from UNRWA just hours before it was bombed. With bread one of the few products still available, bakeries became an essential civilian infrastructure in the early weeks of the war. They were bombed north of Wadi Gaza and in towns in the centre of the Strip, with more than half of the attacks taking place in the two weeks following Israel's evacuation order of 13 October, which directed 1.1 million Palestinians to leave their homes in northern Gaza.
The repeated destruction of bakeries indicates a shift in the logic used by Israel to justify its actions under international humanitarian law (IHL). In previous wars, an economic approach to minimising violence underpinned military strategy: determining how many civilian casualties were acceptable in order to eliminate a target. International law, intended to moderate violence, plays a key role in calculating and managing this economy of violence.2 In the 2021 Gaza war, for example, the Israel Defense Forces (IDF) argued that warning phone calls made the strikes on high-rise buildings legal. The IDF presented these warnings as measures of proportionality, claiming that, by alerting residents to evacuate minutes before the bombs levelled their homes, harm was minimised. Within this economy, another option is available: when the ratio of civilians to militants is too high, as it is in urban space, where populations live, civilians can be reclassified as combatants to redress the imbalance. The scale of destruction of physical and social infrastructure in the current war, including the targeted killing of health workers, civil servants and members of civil society organising shelter or aid distribution, suggests the use of this strategy. By broadening who and what constitutes a military target, mass killings take place under the guise of legal and strategic necessity. Arguably, the fastest way to turn civilians into combatants and civilian infrastructure, such as bakeries, into military bases on a massive scale is through the use of AI.
In May 2024, the Israeli Air Force and the Intelligence Directorate were awarded the 2024 Israel Defense Prize for the success of their "target factory".3 According to an official government statement, these divisions were recognised for their innovative use of "advanced algorithms and AI" to identify potential military targets that appear to be civilians.4 A senior military official told the Jerusalem Post in October 2023 that, for the first time, the army's AI systems were allowing it to generate new targets faster than it could strike them.5 He noted that in the 2014 and 2021 wars, the IDF had exhausted its list of targets. Previously, finding and justifying new targets seemed to be a bottleneck in Israeli military operations. Now, AI is automating this justification process, enabling continuous airstrikes without interruption. This marks a profound shift in war and security, in which targets (or criminals) can now be identified faster than they can be eliminated or apprehended. In this sense, military and security agents are becoming processors of information, adapting their tactics to generate new information that justifies their actions and overshadowing any critical examination of the underlying premises of those actions.
Brigadier General Yossi Sariel, commander of Intelligence Unit 8200 and an architect of the "target factory", advocates for the next phase of human evolution in his self-published master's thesis, The Human-Machine Team.6 Written during his year abroad at the National Defense University in Washington, D.C., the thesis argues that armies should leverage "synergic learning between humans and machines to create supercognition".7 In it, Sariel outlines a model that generates targets by integrating classified data, such as intelligence from informants, with the identification of individuals who are members of a WhatsApp group associated with a known militant. Even easier than tracking group members, a vulnerability in WhatsApp, revealed by The Intercept in May 2024, would be enough for the Israeli government to track the metadata of WhatsApp messages – who contacted whom, when and from where – and to extrapolate from these interactions a social network that can generate targets through machine learning.8 Although the Israeli settler-colonial regime has been conducting social network analysis on Palestinians for decades through registries, databases, and the policing and arrest of the population, large-scale data processing capabilities allow this analysis to be automated and accelerated. This technological development makes it possible to create connections that recast any member of the population as a terrorist.
Following the 13 October evacuation order, anyone who stayed north of Wadi Gaza was portrayed by the IDF as a potential terrorist. This overshadowed the fact that Palestinians chose to stay, remembering their grandparents' regret at leaving their land during the Nakba in 1948. With no electricity or fuel, queues at bakeries stretched for hours. Someone standing in line at a bakery, holding a smartphone that the target factory had linked to that of an alleged Hamas militant, was enough to turn a civilian infrastructure into a military target. With the legal and strategic justification automatically laid out, the choice of whether or not to bomb that bakery is the human element in the human-machine team. At the beginning of the war, the destruction of bakeries served to push Palestinians towards the south, eventually to Rafah, as part of Israel's plan to pressure Egypt into opening its borders. The human choice, aligning with the broader political strategy, was then to bomb the bakery.
To uncover patterns and connections that might not be immediately evident within such a network, biometric checkpoints were established throughout the Gaza Strip. By 14 November, a makeshift checkpoint had been set up on Salah al-Din Street to facilitate the Israeli military's monitoring and control of the mass transfer of civilians to the south. According to testimonies collected by the UN Office for the Coordination of Humanitarian Affairs (OCHA), this site was a place of arbitrary arrests, forcible family separations, humiliation, and other forms of psychological and physical violence inflicted by the Israeli military. OCHA also reported that the checkpoint was unmanned and remotely controlled by armed soldiers nearby, and included a surveillance system: "Displaced persons are asked to show their IDs and undergo what appears to be a facial recognition scan".9 The forced transfer of the Palestinian population has thus enabled the involuntary collection of their biometric data.
In March, satellite images showed displaced Palestinians stopped at a makeshift Israeli military checkpoint at the entrance to the Al-Mawasi "safe zone". If the Israeli military uses this checkpoint in the same way as the one on Salah al-Din Street, it would mobilise the "safe zone" as a selective border to control the movement and concentration of the displaced Palestinian population. Since the beginning of 2024, two permanent checkpoints have been established, dividing the Gaza Strip in two; the IDF refers to them as "drains" (נקזים).10 These structures not only act as filters to determine who passes and who is arrested, they also serve as registers that can be cross-referenced with existing records to track the names of those who have chosen to remain in the north and, potentially, to fabricate new targets.
The aim of the "target factory" is to provide the legal justification for the mass killing of Palestinians. The constraint of minimising civilian deaths in war is resolved by the mechanism of turning civilians into targets in genocide. The question is not whether AI is good or bad for war, or whether it could be refined to be fairer. The bias in the "target factory" is a political one, pre-existing the elaboration of its data models: that international law integrates an economy of violence into the logic of violence, and that these calculations, when run millions of times, can be used to deliberately destroy the living conditions of a group. International law and the computational power of AI have combined to structure and legitimise the criminalisation of a population that chooses not to leave its homeland. But with the destruction of bakeries came the building of taboons (طابون), traditional Palestinian clay ovens; came doctors kneading bread on surgical towels in hospitals; came women in shelters making arbood (خبز غير مخمر), an unleavened bread baked in ash in the Bedouin shepherd tradition; and came farmers planting seedlings to fight hunger. These practices, deeply rooted in Palestinian forms of life and knowledge of the land, remind us that another kind of network, that of cultural and communal ties, thrives within and against computational destruction.
[1] https://gaza-aid-attacks.forensic-architecture.org/
[2] Eyal Weizman, The Least of All Possible Evils. A Short History of Humanitarian Violence (London: Verso, 2011).
[3] Itay Blumenthal, ‘“David’s Sling”, the Namer APC and the “Target Factory”: These Are the Winners of the 2024 Israel Defense Prize’ [in Hebrew], Kan, Israeli Public Broadcasting Corporation, accessed 31 July 2024, https://www.kan.org.il/content/kan-news/defense/755144/.
[4] ‘More than 12,000 Targets and a First-of-Its-Kind Collaboration: A Glimpse into the IDF’s Target Factory Operating Around the Clock’ [in Hebrew], IDF website, accessed 13 June 2024, https://www.idf.il/אתרי-יחידות/יומן-המלחמה/כל-הכתבות/הפצות/מלחמה-מטרות-שהותקפו-כוחות-צה-ל-אגף-המודיעין-חיל-האוויר-חיל-הים/; Yuval Abraham, ‘“Lavender”: The AI Machine Directing Israel’s Bombing Spree in Gaza’, +972 Magazine, 3 April 2024, https://www.972mag.com/lavender-ai-israeli-army-gaza/.
[5] ‘IDF Bombs Whole Gaza Neighborhoods to Hit Hamas Targets - Official’, The Jerusalem Post, 11 October 2023, https://www.jpost.com/israel-news/defense-news/article-767706.
[6] Harry Davies and Bethan McKernan, ‘Top Israeli Spy Chief Exposes His True Identity in Online Security Lapse’, The Guardian, 5 April 2024, sec. World news, https://www.theguardian.com/world/2024/apr/05/top-israeli-spy-chief-exposes-his-true-identity-in-online-security-lapse.
[7] Brigadier General Y.S., The Human-Machine Team. How to Create Synergy between Human & Artificial Intelligence That Will Revolutionize Our World, 2021, p. 17.
[8] Sam Biddle, ‘This Undisclosed WhatsApp Vulnerability Lets Governments See Who You Message’, The Intercept, 22 May 2024, https://theintercept.com/2024/05/22/whatsapp-security-vulnerability-meta-israel-palestine/.
[9] OCHA, ‘Hostilities in the Gaza Strip and Israel | Flash Update #48’, 24 November 2023, https://reliefweb.int/report/occupied-palestinian-territory/hostilities-gaza-strip-and-israel-flash-update-48-enarhe.
[10] ‘The IDF’s Plan to Prevent Hamas’s Return to the Northern Gaza Strip, and the Criticism: “It Will Endanger Soldiers”’ [in Hebrew], Walla News, 4 February 2024, http://news.walla.co.il/item/3641108.