Show Justice
IHL’s Data Practices and the Bureaucracy of Killing
“Colonel Powell, Ma’am, I am the pilot in command responsible for releasing the weapon. I have the right to ask for the CDE to be run again. I will not release my weapon until that happens.”
With these words, Second Lieutenant Steve Watts, a drone pilot, refused to release his weapons on a terrorist compound in Nairobi, Kenya, demanding that the ‘CDE’ – the collateral damage estimate algorithm – be recalculated, this time taking into account the new visuals produced by the drone’s sensors, which had captured a nine-year-old girl near the targeted compound.
Steve is not a real drone pilot. He is a fictional character created by screenwriter Guy Hibbert in his screenplay for the British action-thriller ‘Eye in the Sky’. Steve’s insistence on holding fire until the CDE is recalculated based on the new drone visuals is nonetheless significant. Through his actions and expressions, Steve establishes mundane data practices as a system of knowledge production through which International Humanitarian Law (IHL) actors exercise jurisdiction over people and processes, time and space. Viewed through this lens, IHL is not only a set of legal norms, rules, and principles designed to guide behaviour during armed conflicts. It is also a set of data practices that classify individuals into legal categories such as ‘combatant’ and ‘civilian’, establish levels of dangerousness, and determine – using predictive technologies – factual conditions on the ground (including how many bystanders will be killed in an attack).
Representations of IHL’s Data Practices and Its Evolving Proto-Jurisdiction
This evolving aspect of IHL’s jurisdiction – referred to by Fleur Johns, in another context, as proto-jurisdiction – relates to the exercise of jurisdiction through data collection and construction by legally authorized agents, such as Steve, the drone pilot. Collecting and interpreting data captured through various visual and predictive technologies, Steve observed a young girl, determined her legal status, and insisted that her fate be decided by a further calculation of a sophisticated collateral damage algorithm. By refusing to release his weapons until the CDE was recalculated, Steve turned to IHL’s data practices as the embodiment of legal authority and justice. Steve’s heroism is ultimately expressed through obedience to the CDE algorithm: he kills nine-year-old Alia against his own moral and legal judgment. Data practices such as these are the main force advancing the plot of ‘Eye in the Sky’, and they are presented as the pinnacle of modern IHL.
‘Eye in the Sky’ features many advanced technological capabilities, including drone imaging, facial recognition technologies, short-range surveillance cameras, and collateral damage algorithms, and showcases their centrality to the application of IHL. Yet despite their prominence in the movie’s plot, the various ways in which these technologies extend IHL’s jurisdiction and affect decision-makers are neither questioned nor critically explored. Instead, the technology is used as an Archimedean point from which the just and true nature of IHL practices can be observed. Neither the drone operators’ interpretation of the visuals nor the predictions of the algorithms are questioned, and the methodologies producing these outputs remain invisible, even when it becomes clear that they entail at least some level of uncertainty and inaccuracy.
The Conservative Trend in Just War Cinema and the Western Narrative of the Bureaucracy of Killing
‘Eye in the Sky’ is far from ‘accurate’ or ‘objective’ (see here), as claimed by IHL experts (see here and here). Instead, it adopts and advances an ideological narrative, presenting IHL’s existing data practices as a higher authority in IHL decision-making and constructing compliance with – and submission to – these data practices as the highest form of modern military heroism. By doing so, it contributes to and participates in the growing conservative cinematic trend in war movies, while masking this ideological stance as neutral and natural.
Drawing on the works of TWAIL scholars (see here, here, and here), one can see that the representations of IHL’s proto-jurisdiction in ‘Eye in the Sky’ reflect and reinforce a particular IHL narrative, one consistent with Western countries’ narrative about their existing bureaucracy of killing. For example, while Kenyan Special Forces are ready to launch a ground attack on the compound, Western decision-makers dismiss this option as inferior to their alternative – a technology-based airstrike. Though not glorifying war per se, ‘Eye in the Sky’ justifies Western countries’ data practices and focuses on Western decision-makers and their perceived dilemmas. At the same time, the construction of these dilemmas rests on several unsubstantiated assumptions, including that existing IHL data practices are just, accurate, and protective. It also marginalises African decision-makers’ expertise and preferences, and evades necessary questions concerning IHL’s legitimation of violence and domination of Third World countries and peoples (presenting Nairobi as a ‘friendly’ city in need of protection, and allowing only Western decision-makers to ‘speak IHL’).
Reshaping Core IHL Concepts through Data Practices
Similarly, ‘Eye in the Sky’ uses IHL’s data practices to reshape the concept of protection: when Colonel Powell orders her targeteer, Sergeant Mushtaq Saddiq, to ‘Do whatever you can to save this girl’s life’, she means that he should find a way to amend the algorithmic calculation so that, in the parallel realm generated by IHL data practices, Alia’s chances of survival will increase to a certain pre-determined threshold. After the CDE algorithm elevates Alia’s chances of survival to 65 percent, Colonel Powell declares that ‘We have now done everything in our power to give this girl a chance to survive’. In Powell’s eyes – reflecting Western countries’ narrative – this modification of chances and predictions through data practices is a constitutive exercise: whatever fate befalls Alia, the technology has given her a ‘chance’ to survive. ‘Everything in our power’ is thus reduced to fine-tuning the algorithmic prediction. Other, non-technology-based courses of action – cancelling the operation, using ground forces (which in this case are ready and willing), attacking the target en route to its location, or consulting with local authorities – are eliminated.
Additionally, by focusing on a deliberate manipulation of the technology-generated output by a ‘rotten apple’ – rather than on an inherent element of IHL’s data practices and the bureaucracy of killing – ‘Eye in the Sky’ constructs IHL’s data practices as a pure system of knowledge production. The fact that the original prediction, presented as ‘correct’ and ‘true’, was itself erroneous (an accurate assessment would have predicted Alia’s death with 100 percent certainty) is never considered. Other limitations of IHL’s data practices are similarly ignored. The final scenes of the movie demonstrate the temporal limitations and rigidity of these data practices: once the first strike was launched – destroying the compound, killing all but one of the terrorists, and injuring Alia – the factual basis for the legal assessment changed. The terrorists’ plans were frustrated, and the risk to Alia’s life increased (as it was now clear that she was within the range of fire and could not flee the scene, lying injured on the ground). But the virtual legal reality remained static, unmoved, unquestioned, as Colonel Powell ordered a second strike on the compound; a strike that killed the remaining living terrorist, as well as Alia.
‘Eye in the Sky’ was released a few months after British and US forces attempted to target and kill an ISIS operative, Junaid Hussain, in Northern Syria, killing instead three innocent bystanders and wounding five more; and shortly after a US aircraft misidentified and bombed the Doctors Without Borders hospital in Kunduz, Afghanistan, killing forty-two people, mostly medical staff and patients. Viewed in this context, the image of IHL’s data practices as a true representation of reality seems particularly problematic. An accurate and authentic representation of IHL’s data practices would have revealed – and challenged – their imperfections and jurisdictional implications, their political and cultural predispositions, and their legitimation of violence and domination.
This post is part of a symposium on the topic of this year’s AjV-DGIR conference on ‘Jurisdiction – Who Speaks International Law?’, to be held in person in Bonn and online via Zoom. The contributions to the symposium are shorter versions of papers that will be presented during the conference. The full programme is available here. You can register as an engaged listener (online only) here.
Shiri Krebs is an Associate Professor of Law at Deakin Law School. She is Co-Lead of the Law and Policy Theme within the Cyber Security Cooperative Research Centre (CSCRC) and an Affiliate Scholar at the Stanford Center on International Security and Cooperation (CISAC).