The Israeli Air Force’s Use of Artificial Intelligence in Gaza: The Future of Military Decision-Making

In the ongoing Israel-Palestine conflict, the military use of Artificial Intelligence (AI) is a major development that will change the character of warfare. The integration of AI into the decision-making process of the Israel Defense Forces (IDF) indicates that military decision-making will, at the very least, be heavily informed by AI systems in the future. The use of AI by the Israeli Air Force (IAF) has made its airstrikes in Gaza faster, greater in number, more precise, and significantly more destructive.

As early as 2021, the IAF had used multiple AI systems in its military operations in Gaza. One such system, called Habsora (Hebrew for ‘the Gospel’), played a major role in the IAF’s operations. Gospel generated information for Israel’s Military Intelligence, which helped identify targets for the IAF to strike in real time. This highlights the role of AI as a “force multiplier for the entire IDF”, as recognised by a senior military officer, who stated that AI enabled the IDF to identify targets more effectively and quickly using super-cognition. This points to the changing character of warfare, which all states must prepare for. Considering the potential widespread use of AI systems in future wars, states need to focus on four key aspects, which were brought to light during the Israel-Palestine conflict.

Firstly, the use of AI has significantly increased the number of airstrikes carried out by the IAF. In previous conflicts, the IAF would run out of targets within a few weeks. Gospel, however, was able to process tremendous amounts of data and generate hundreds of new targets per day, far more than human officers could produce. Gospel used AI to digitise the battlefield and generate targets from multiple sources, including “drone footage, intercepted communications, surveillance data, open source information, and data from monitoring the movements and behavior of both individuals and large groups”. The former head of the IDF, Aviv Kochavi, stated that Gospel produced 100 targets a day in Israel’s 2021 military operations in Gaza, with “50% of them being attacked”. This drastic increase in the number of IAF airstrikes is worrying, as it indicates that the use of AI in warfare will lead to an increase in the use of force.

Secondly, the use of AI systems on the battlefield could lead to the further dehumanisation of warfare. AI has significantly increased the number of civilians and non-military sites targeted in airstrikes by the IAF. The IAF has conducted airstrikes on homes, hospitals, schools, and other important infrastructure. Gospel allows the IAF to carry out airstrikes on buildings even where no Hamas member may be present, so long as the AI system declares them to be strategic targets. In this way, the use of AI has allowed the IAF to feel justified in its military aggression. Since October 7th, Israel has killed over 40,000 Palestinians through airstrikes and other military operations. Within the first few weeks of Israel’s 2023 military operations, civilians made up 61% of the deaths from the IAF’s airstrikes in Gaza. The use of AI on the battlefield could thus lead to a gradual erosion of International Humanitarian Law (IHL).

Thirdly, and perhaps most importantly, AI has increased the likelihood of conflict escalation in Gaza. Due to their increased distance from the actual battlefield, the human operators controlling these AI systems have treated their adversaries as numbers on a screen, rather than as actual human beings. There is a great likelihood that humans will become over-reliant on these AI systems without truly understanding how they function. Ultimately, as the use of AI systems in warfare increases, humans are more likely to trust the information they generate. As Israel’s use of Gospel has shown, AI is not perfect, and can certainly be a cause of increased violence and escalation. In a world with nine nuclear-armed states, such escalation simply cannot be afforded.

Lastly, the integration of AI within future militaries will greatly increase the fog of war. Major states are now heavily investing in AI, and all trends indicate that the decision-making processes of future militaries will be heavily assisted by AI. As the speed of warfare increases drastically and the use of AI systems results in the digital dehumanisation of warfare, there will be a greater risk of miscalculation. States need to prepare for this coming reality. The global community, too, must increase its efforts to mitigate the potentially catastrophic risks associated with military AI systems before a major incident occurs. The same holds for Pakistan, given the Indian military’s increasing focus on AI. Pakistan must be ready for the changing character of warfare in the years to come.

Shayan Hassan Jamy
Shayan Hassan Jamy is a Researcher at the Center for Aerospace and Security Studies (CASS), Lahore. He can be reached at info@casslhr.com.
