AI on the Battlefield and at the Peace Table

I have a new piece at the National Security Journal (the publication arrangements of my column are way too complicated to get into right now, trust me) about the latest variant of the Su-57 Felon. This variant is notable for incorporating AI tools into flying the aircraft, fusing information, and targeting. For those familiar with how AI is being integrated into the war efforts in Russia-Ukraine and Israel-Hamas, it’s interesting stuff that may be a harbinger for how we think about the collaborative relationship between manned and unmanned aviation in the near- to medium-term future.
The idea of an AI assistant for the pilot of a single-seat fighter jet makes a great deal of sense. Dan Hampton, an F-16 pilot and the author of Viper Pilot, once described flying a fighter in combat as playing all the instruments in a rock band at the same time. Western fighter design and pilot training have focused on facilitating this process and making it easier for the pilot to understand and influence his or her environment. But in this analogy, the AI assistant could relieve some of the complexity by taking over the drums and bass while the pilot focuses on fronting the band.
The AI assistant may also be key to the export prospects of the Su-57. There is often a gulf between the domestic and export variants of Russian military equipment, a gap intended to protect sensitive Russian technology. Nevertheless, innovations often find their way, in some form, into the export versions. Less capable versions of the AI assistant could become part of an export package for the Su-57 or for other, older Russian fighter aircraft.
If Russia is willing to be sufficiently loose with its controls on the technology, the AI assistant could be of tremendous interest to second-tier air forces, which often lack the resources to train pilots to the highest standards. The assistant could help such air forces close the gap with their most advanced competitors, while also opening up the potential to accomplish complex missions that place the greatest demands on pilots.
With respect to AI in war more generally… I am increasingly coming to the conclusion that there’s no way to put the genie back into the bottle, a conclusion that applies just as well to the use of AI in academia and elsewhere. In Ukraine, AI is solving the “last 100 meters” problem that afflicts traditional remotely operated drones: the final approach to a target, when jamming is most likely to sever the link between drone and operator. David Kirichenko:
Ukraine is already locked in an AI-driven drone race against Russia, with both sides leveraging autonomous technologies to gain an edge on the battlefield. Faced with Russia’s numerical superiority, Ukraine turned to drones early in the war, forcing Moscow to follow suit. As Russia advanced its electronic warfare capabilities—jamming Ukrainian drones as a result—both sides were pushed to innovate at an accelerated pace. In the Russian-Ukrainian war, drones now cause most battlefield casualties, accounting for approximately 70 percent of total deaths and injuries.
This cat-and-mouse game led both sides to adopt fiber-optic cables to bypass jamming. Unsurprisingly, countermeasures to disrupt these cables are already in development.
Now, the next phase of drone warfare is taking shape: AI-powered targeting systems designed to operate even in heavily jammed environments, allowing drones to identify and strike targets with minimal human intervention.
What all of this means for war crimes and for civilian protection is not clear. Breathless claims that taking a human out of the loop enables war crimes need to be balanced against the fact that humans commit war crimes all the time, and it is possible that the levels of communication and surveillance that AI allows for will enhance accountability. See, for example, Emilia Probasco and Minji Jang:
While it is difficult to believe, the My Lai massacre would have been worse had it not been interrupted by a U.S. Army helicopter crew led by then-Warrant Officer Hugh Thompson, Jr. Thompson witnessed the actions of C Company soldiers while circling above the village. At several points during the massacre, Thompson landed his helicopter to help the locals in an attempt to stop the killing, challenging Calley’s orders directly.
Now imagine a future battlefield, with soldiers as emotionally charged or misguided as those under Calley’s command. But on this future battlefield, Hugh Thompson’s counterpart might not be there. Instead, a drone will likely be flying overhead.
Could that drone play the same role as Hugh Thompson did in the My Lai massacre? This is a complex question military leaders must begin to confront.
Also worth noting that AI has certain applications for peacebuilding. The State Department is using AI tools to create more effective simulations that are crucial for the training of diplomats. As significant, or perhaps more so, there are efforts underway to integrate AI into peace negotiations:
The CSIS programme is led by a unit called the Futures Lab. This team developed an AI language model using software from Scale AI, a firm based in San Francisco, and unique training data. The lab designed a tabletop strategy game called “Hetman’s Shadow” in which Russia, Ukraine and their allies hammer out deals. Data from 45 experts who played the game were fed into the model. So were media analyses of issues at stake in the Russia-Ukraine war, as well as answers provided by specialists to a questionnaire about the relative values of potential negotiation trade-offs. A database of 374 peace agreements and ceasefires was also poured in.
Thus was born, in late February, the first iteration of the Ukraine-Russia Peace Agreement Simulator. Users enter preferences for outcomes grouped under four rubrics: territory and sovereignty; security arrangements; justice and accountability; and economic conditions. The AI model then cranks out a draft agreement. The software also scores, on a scale of one to ten, the likelihood that each of its components would be satisfactory, negotiable or unacceptable to Russia, Ukraine, America and Europe. The model was provided to government negotiators from those last three territories, but a limited “dashboard” version of the software can be run online by interested members of the public.
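To make the quoted description a bit more concrete, here is a minimal sketch of how that kind of output might be represented: draft provisions grouped under the four rubrics and scored one to ten for each party. The rubric and party names come from the excerpt above; everything else (the data structure, the thresholds, the example numbers) is a hypothetical illustration, not the actual CSIS/Futures Lab tool or its interface.

```python
# Hypothetical illustration only: a toy representation of the kind of output the
# quoted article describes, not the real Ukraine-Russia Peace Agreement Simulator.

from dataclasses import dataclass

RUBRICS = [
    "territory and sovereignty",
    "security arrangements",
    "justice and accountability",
    "economic conditions",
]

PARTIES = ["Russia", "Ukraine", "America", "Europe"]


@dataclass
class ProvisionScore:
    rubric: str      # one of the four rubrics above
    provision: str   # a single clause of the draft agreement
    party: str       # whose likely reaction is being estimated
    score: int       # 1-10, as described in the excerpt

    def label(self) -> str:
        # Arbitrary illustrative thresholds, not the tool's actual cut-offs.
        if self.score >= 7:
            return "satisfactory"
        if self.score >= 4:
            return "negotiable"
        return "unacceptable"


# Example usage with made-up numbers:
example = ProvisionScore(
    rubric="security arrangements",
    provision="multinational monitoring mission along the line of contact",
    party="Ukraine",
    score=6,
)
print(example.party, example.label())  # -> Ukraine negotiable
```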
Large-scale peace negotiations, especially ones that involve territory, require a huge amount of diplomatic work and an in-depth command of geographic detail. AI can help with that, as long as it remains linked to knowledgeable, competent human negotiating teams. In any case, countries that are at war (and are sometimes haltingly trying to find their way to peace) are using the tools that they have available, and AI is available.
Photo Credit: By Anna Zvereva from Tallinn, Estonia – Sukhoi Design Bureau, 054, Sukhoi Su-57, CC BY-SA 2.0, https://commons.wikimedia.org/w/index.php?curid=87441875