WORKING PAPERS

The Future Digital Battlefield and Challenges for Humanitarian Protection: A Primer

HENNING LAHMANN
APRIL 2022
TABLE OF CONTENTS
1. Introduction............................................................................................................................................................. 1
2. Elements of the Future Digital Battlefield: Technology Overview ......................................................... 3
2.1 Offensive Cyber Capabilities ....................................................................................................................... 3
2.2 Information Warfare ..................................................................................................................................... 4
2.3 Artificial Intelligence..................................................................................................................................... 5
2.3.1 Lethal Autonomous Weapons Systems ........................................................................................... 6
2.3.2 Unmanned Aerial Vehicles and Drone Swarms ................................................................ 7
2.3.3 Intelligence, Surveillance and Reconnaissance (ISR) and Fusion Architectures ................ 9
2.3.4 Targeting ................................................................................................................................................ 11
2.3.5 Cyber Warfare and AI......................................................................................................................... 12
2.3.6 Information Warfare and AI ............................................................................................................ 12
2.4 Robotics and Sensor Technologies ......................................................................................................... 13
2.5 Space Technologies ..................................................................................................................................... 14
2.6 Human Enhancement Technologies ..................................................................................................... 15
2.7 Conclusion: Convergent Effects .............................................................................................................. 16
3. Implications for Humanitarian Protection ................................................................................................. 17
3.1 Threshold Questions ................................................................................................................................... 18
3.2 Utilisation of Data and the Emergence of the Military Surveillance Paradigm ....................... 20
3.3 The Spatial and Temporal Dissolution of the Conflict Zone .......................................................... 24
3.4 States’ Positive Obligations Concerning Vulnerabilities of Digital Warfare Technologies . 25
3.5 Human Control: Questions Pertaining to Accountability and Responsibility ......................... 27
4. Concluding Remarks.......................................................................................................................................... 28
Bibliography.............................................................................................................................................................. 29
1. INTRODUCTION

Novel digital technologies are set to revolutionise the ways wars are fought. Recent situations of armed conflict, both between states and between states and non-state actors, have revealed glimpses into this future of warfare. Noteworthy examples are the sustained campaign of tactical drone strikes by Azerbaijan against Armenian forces,1 the purported deployment of autonomous armed drones in Libya,2 or the use of AI-supported intelligence, surveillance, and reconnaissance (ISR) technologies,3 including with the support of drone swarms,4 by the Israeli Defence Forces during its latest military campaign in Gaza.

The increased employment of algorithmic decision-making systems that utilise vast amounts of data generated with the help of significant advances in unmanned aerial vehicle (UAV), space,5 and sensor technologies,6 fused in real time with further digital data sources such as social media activity, online behaviour, and other “publicly available information”7 as well as mobile communications data, has begun to create ecosystems of constant military surveillance.8 The far-reaching legal and ethical implications of these developments for affected civilian populations are still in the early stages of being properly analysed and understood. At the same time, the use of cyber tools continues to make inroads among state actors, both as an element of military operations during armed conflict and as part of ongoing, low- to mid-intensity encounters between great powers during peacetime.9 Here, too, much is left to be examined to properly assess the potential

1 Crabtree J, ‘Gaza and Nagorno-Karabakh Were Glimpses of the Future of Conflict’ [2021] Foreign Policy <[Link]>; Gady F-S and Stronell A, ‘What the Nagorno-Karabakh Conflict Revealed About Future Warfighting’ (World Politics Review, 19 November 2020) <[Link]>.
2 Letter dated 8 March 2021 from the Panel of Experts on Libya established pursuant to resolution 1973 (2011) addressed to the President of the Security Council, UN Doc. S/2021/229, 8 March 2021, para. 63, <[Link]>.
3 Dar Y, ‘Israel Says It Fought World’s First “Artificial Intelligence War” Against Hamas’ The Eurasian Times (29 May 2021) <[Link]>; Ben-Yishai R, ‘How Data and AI Drove the IDF Operation in Gaza’ YNet News (29 May 2021) <[Link]>.
4 Gross JA, ‘In Apparent World First, IDF Deployed Drone Swarms in Gaza Fighting’ The Times of Israel (10 July 2021) <[Link]>.
5 Reed J, Routh A and Mariani J, ‘Information at the Edge: A Space Architecture for a Future Battle Network’ (Deloitte Insights, 16 November 2020) <[Link]>.
6 The Economist, ‘Open-Source Intelligence Challenges State Monopolies on Information’ [2021] The Economist <[Link]>: “[Sensors] may also show things which, wavelength-restricted as their own eyes are, human interpreters have yet to imagine. It is in part to guard against missing such things that satellite images are increasingly fed into machine-learning software which will see patterns humans might not pick out, or even think to look for.”
7 See Smagh NS, ‘Intelligence, Surveillance, and Reconnaissance Design for Great Power Competition’ (Congressional Research Service 2020) R46389 <[Link]>, p. 16 and 31.
8 Scoles S, ‘It’s Sentient: Meet the Classified Artificial Brain Being Developed by US Intelligence Programs’ [2019] The Verge <[Link]>; Schultz RH and Clarke RD, ‘Big Data at War: Special Operations Forces, Project Maven, and Twenty-First Century Warfare’ (Modern War Institute, 25 August 2020) <[Link]>.
9 Van der Waag-Cowling N, ‘Stepping into the Breach: Military Responses to Global Cyber Insecurity’ (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]>; Stacey E, ‘The Future of Cyber Warfare – An Interview with Greg Austin’ (Strife, 26 April 2020) <[Link]>.
1 I Working Paper: The Future Digital Battlefield and Challenges for Humanitarian Protection: A Primer
consequences for civil societies.10 Complemented by important developments in robotics and perhaps at some point in the future even extending to cybernetically enhanced human soldiers,11 the progressing digitisation of military technology is set to involve transformative shifts in the ways in which future conflicts will play out in inter-state constellations as well as in regard to conflicts between states and non-state armed groups. These novel digital technologies will inevitably shape how political and military decision-makers conceive the possibilities and constraints of contemporary warfare and thus impact policy in relation to future conflict.12 This, in turn, implies that existing international legal frameworks will come under increasing pressure to be responsive to this momentous development, in particular with a view to future humanitarian protection needs.

To date, international legal scholarship has primarily addressed issues that encompass important but limited aspects of the digitisation of conflict, most prominently the law applicable to cyber operations13 or the legal and ethical implications of the development and prospective use of lethal autonomous weapons systems (LAWS).14 Without ignoring these more specific questions, the present framing paper attempts to zoom out and expand the scope of consideration. Taking a holistic perspective, its main focus is the convergent effects of the different technological trends in the areas of AI, cyber, space, robotics, drones, and sensor systems, asking what the “future digital battlefield” might entail for the protection of affected individuals and societies as provided by existing legal frameworks.

To this end, the paper first provides an overview of the various technologies that, taken together, constitute the “future digital battlefield”, with a specific focus on those areas that have not received as much attention to date. On the basis of these partially speculative descriptions, the second part highlights a few legal subject areas that entail some of the potentially most consequential implications for future humanitarian protection in armed conflict. Rather than presenting fully formed answers to the questions posed by novel digital technologies in the military or an in-depth legal analysis of each identified challenge, the purpose of this primer is to provide an informed and critical outline of the most urgent issues concerning the possible ramifications of an ever-more digitalised battlefield in order to frame the emerging debate among scholars and political decision-makers.

To that end, in addition to original research conducted by the author, the paper draws on the findings and discussions of an online expert workshop conducted on 12 August 2021.15

10 International Committee of the Red Cross, ‘The Potential Human Cost of Cyber Operations’ (2019) <[Link]>; Geiß R and Lahmann H, ‘Protecting Societies – Anchoring A New Protection Dimension In International Law In Times Of Increased Cyber Threats’ (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>.
11 See Shereshevsky Y, ‘Are All Soldiers Created Equal? – On the Equal Application of the Law to Enhanced Soldiers’ (2021) 61 Virginia Journal of International Law 271.
12 See Dorsey J and Amaral N, ‘Military Drones in Europe: Ensuring Transparency and Accountability’ (Chatham House 2021) <[Link]>, p. 7.
13 See only Schmitt MN (ed), Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (2017); Moynihan H, ‘The Application of International Law to State Cyberattacks: Sovereignty and Non-Intervention’ (2019) <[Link]>; Delerue F, Cyber Operations and International Law (2020).
14 See only Human Rights Watch, ‘Losing Humanity. The Case against Killer Robots’ (Human Rights Watch 2021) <[Link]>; International Committee of the Red Cross, ‘Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centred Approach’ (2019) <[Link]>.
15 The workshop participants were Greg Austin, Anja Dahlmann, Kristen E. Eichensehr, Lindsay Freeman, Arthur Holland Michel, Asaf Lubin, Kubo Mačák, Rebecca Mignot-Mahdavi, Nema Milaninia, Khadidja Nemar, Giacomo Persi Paoli, Sasha Radin, Chiara Redaelli, Yahli Shereshevsky, Talita de Souza Dias, and Noëlle van der Waag-Cowling.

2. ELEMENTS OF THE FUTURE DIGITAL BATTLEFIELD: TECHNOLOGY OVERVIEW

The following section describes some of the determining individual elements that jointly characterise the “future digital battlefield”. To the extent that some of these technologies have previously been described in more detail elsewhere, they will only be addressed briefly. What is important to note at the outset is that whatever these technologies signify and imply in isolation, it is crucial to emphasise and analyse the convergent effects of the employment of these different technologies, and what these effects imply for the continuing ability of existing legal frameworks to regulate or mitigate possible adverse consequences for humanitarian protection.

2.1 OFFENSIVE CYBER CAPABILITIES

There is broad consensus today that the digital transformation of society over the past three decades has also ushered in a new era of conflict with entirely new methods to carry out military operations. Most recently, the dawn of the age of cyber warfare has perhaps been the most obvious and widely discussed aspect of this paradigm shift. The author has addressed the issue of the use of cyber means in the military in more detail elsewhere,16 but as an integral component of the emergent digitalisation of the battlefield, it merits brief mention here. The ways in which offensive cyber capabilities could support future military operations are manifold. While the employment of such tools might at times aim at complementing kinetic resources, a potentially more momentous shift, both from a perspective of military strategy and from the humanitarian protection angle, is the significant expansion of possibilities to directly impact adversarial states without any use of kinetic force at all.

During the course of more traditional military operations, cyber capabilities might be leveraged for destructive or disruptive effects as part of what has recently been dubbed “all-domain manoeuvre warfare”,17 i.e. the creation of “decision advantage enhanced through (…) cyberspace to enable operations in the Ground, Air, and Maritime Domains to deter and defeat” an adversary.18 This might include the manipulation or disabling of the opponent’s weapons systems by way of inserting malware,19 digitally attacking adversarial ISR systems or stored intelligence data in order to thwart reconnaissance activities, or more generally disrupting the adversary’s digital infrastructures, as first seen in the armed conflict between Russia and Georgia in 2008.20 One real-world example to mention in this context is the extensive cyber operations by the U.S. and its allies in the fight against ISIS, not only digitally attacking the terrorist group’s

16 See Geiß R and Lahmann H, ‘Protecting Societies – Anchoring A New Protection Dimension In International Law In Times Of Increased Cyber Threats’ (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>; Geiß R and Lahmann H, ‘Protection of Data in Armed Conflict’ (2021) 97 International Law Studies 556.
17 See Gady F-S, ‘What Does AI Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 2020) <[Link]>.
18 See <[Link]>.
19 See Gady F-S, ‘What Does AI Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 2020) <[Link]>.
20 See Lawson E, ‘Into the Ether: Considering the Impact of the Electromagnetic Environment and Cyberspace on the Operating Environment’ in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link]>, 56.
communication networks and devices that were used for propaganda campaigns and recruiting efforts21 but even disrupting active drone operations by the organisation.22

Perhaps even more far-reaching is the prospect of employing offensive cyber capabilities to replace traditional military operations that rely on some sort of kinetic force. Some such operations might be launched with the intention to minimise risks of large-scale damage at the targeted adversarial objects and thus to decrease escalation risks. Operation Olympic Games, better known as the Stuxnet malware, deployed by the U.S. and Israel to sabotage Iranian uranium enrichment facilities in Natanz, may serve as an example of a cyber operation that, while intentionally causing physical damage, was arguably less destructive than had the militaries and intelligence agencies of the two states attempted to achieve the same effects with kinetic weapons launched from fighter jets or drones.23 At the same time, over the past decade we have seen instances of disruptive military cyber operations targeting critical infrastructures in other states, such as the electrical grid,24 or against civilian assets that had serious, probably unintended effects in a large number of countries.25 Such cases have been on the rise, perhaps even slowly starting to supersede the targeting of more traditional military objects,26 leading to the observation that the conduct of conflict is slowly shifting towards the coercion and control of civilian populations in adversarial states instead of attempting to defeat the opposing military forces.27 Recent assessments have pointed out the many risks for civilian persons and objects resulting from such operations.28 In light of current capacities and strategies, it has been suggested that when it comes to the use of offensive military cyber technologies in conflict situations, the overall picture resembles that of air warfare in 1914 – which implies that more large-scale, increasingly sophisticated operations with more destructive effects are to be expected in the coming one to two decades.29

2.2 INFORMATION WARFARE

Propaganda and other efforts to obtain informational advantage over the opponent have always been part and parcel of military operations in and beyond armed conflict.

21 See Work J, ‘The American Way of Cyber Warfare and the Case of ISIS’ (Atlantic Council, 17 September 2019) <[Link]>; Temple-Raston D, ‘How the U.S. Hacked ISIS’ (NPR, 26 September 2019) <[Link]>; Bronk C and Anderson GS, ‘Encounter Battle: Engaging ISIL in Cyberspace’ (2017) 2 The Cyber Defense Review 93.
22 Warrell H, ‘UK Targeted ISIS Drones and Online Servers in Cyber Attack’ Financial Times (7 February 2021) <[Link]>.
23 See Zetter K, ‘NATO Researchers: Stuxnet Attack on Iran Was Illegal “Act of Force”’ (Wired, 25 March 2013) <[Link]>.
24 See Park D and Walstrom M, ‘Cyberattack on Critical Infrastructure: Russia and the Ukrainian Power Grid Attacks’ (The Henry M. Jackson School of International Studies, 11 October 2017) <[Link]>.
25 See Greenberg A, ‘The Untold Story of NotPetya, the Most Devastating Cyberattack in History’ [2018] Wired <[Link]>.
26 Lawson E and Mačák K, ‘Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict’ (International Committee of the Red Cross 2021), 34.
27 van der Waag-Cowling N, ‘Stepping into the Breach: Military Responses to Global Cyber Insecurity’ (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]>.
28 See Lawson E and Mačák K, ‘Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict’ (International Committee of the Red Cross 2021); van der Waag-Cowling N, ‘Stepping into the Breach: Military Responses to Global Cyber Insecurity’ (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]>.
29 Expert assessment during workshop.
However, the internet and other networked digital technologies have vastly expanded the possibilities to manipulate the information ecosystem to the detriment of the opponent, with a wide range of potential ramifications for humanitarian protection, as already explored in detail in a previous paper.30 Leveraging social media and other channels of digital communication, militaries are now able to carry out complex information operations to deceive, influence, or coerce members of adversarial armed forces. More importantly, in a similar way as certain cyber capabilities have started to be used, the spread of false and otherwise misleading information may be employed to directly impact the civilian population in another country for strategic gain, achieving political outcomes that hitherto required the unleashing of kinetic force. This is not at all necessarily deescalatory: in certain circumstances, the deployment of such tools can result in heightened tension or even intra-communal violence among the target population.31

2.3 ARTIFICIAL INTELLIGENCE

The most fundamental shifts in military activities on both the strategic and operational level as part of the larger “future digital battlefield” are expected to be enabled by the continuing progress of technologies that utilise machine-learning algorithms (ML) and other forms of what is commonly described as artificial intelligence (AI). While there is no generally recognised definition of AI,32 in its most general sense AI can be understood as “a ‘constellation’ of processes and technologies enabling computers to complement or replace specific tasks otherwise performed by humans, such as making decisions or solving problems”.33 One distinction often made in this context is between “general AI”, a supposedly highly intelligent form of processing capable of fulfilling a great number of different tasks that approaches human-level cognitive abilities – a technology that does not yet, and indeed might well never, exist – and “narrow artificial intelligence”, which refers to computer systems that are able to “perform programmed tasks (human-developed algorithms) in specific domains”.34 As a sub-category of narrow AI, “machine learning” describes the currently prevalent method of training algorithms. Systems relying on this technology are trained on vast amounts of data that allow them to build their own models to effect certain outcomes – i.e. to make predictions – instead of operating on the processing of pre-programmed rules, as the previous AI paradigm had envisioned. This means that the output depends on a number of variant and interdependent factors, such as the type of learning process and the resulting model, which is a function of the data with which the algorithm is fed. One of the inherent features of this approach is that a human operator has only limited insight into the exact mechanism of learning, which makes the outcome of the operation unpredictable at least to some degree, depending on the circumstances of a given situation and environment.35

As an all-purpose technology not unlike

30 Geiß R and Lahmann H, ‘Protecting the Global Information Space in Times of Armed Conflict’ (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>.
31 International Committee of the Red Cross, ‘International Humanitarian Law and the Challenges of Contemporary Armed Conflicts’ (International Committee of the Red Cross 2019), 28-29.
32 Lewis DA, ‘Legal Reviews of Weapons, Means and Methods of Warfare Involving Artificial Intelligence: 16 Elements to Consider’ (ICRC Humanitarian Law & Policy, 21 March 2019) <[Link]>.
33 Report of the Special Rapporteur on the Promotion and Protection of the Right to Freedom of Opinion and Expression, UN Doc. A/73/348 (29 August 2018), p. 3.
34 Id., 4.
35 International Committee of the Red Cross, ‘Autonomy, Artificial Intelligence and Robotics: Technical Aspects of Human Control’ (2019), p. 14-15.
electricity or network communications, it makes little sense to approach AI as one uniform development that is responsive to one-size-fits-all regulatory or policy approaches. Therefore, the following sections attempt to outline the “AI revolution” in the military by highlighting some of the individual subject areas where the technology promises to prove the most momentous and far-reaching in light of humanitarian concerns. This means that uses of AI that will become of increasing significance for the internal organisation of armed forces, such as employing ML algorithms to optimise maintenance cycles for military equipment, are not addressed. Furthermore, it is important to note that not all of the subsections are, strictly speaking, analytically separate. Autonomous unmanned aerial vehicles (UAVs) (2.3.2) may have the capability to be employed as lethal autonomous weapons systems (LAWS) (2.3.1) or be used for intelligence, surveillance, and reconnaissance (ISR) tasks (2.3.3) or targeting (2.3.4). Battlefield command & control as well as AI-supported targeting will depend on ISR and may utilise autonomous cyber tools (2.3.5), and so on.

2.3.1 LETHAL AUTONOMOUS WEAPONS SYSTEMS

The proliferation of AI systems in the military has to date been most elaborately discussed in the context of the development and possible deployment of so-called lethal autonomous weapons systems. According to the International Committee of the Red Cross, a LAWS is a system “that has autonomy in its ‘critical functions’, meaning a weapon that can select (i.e. search for or detect, identify, track) and attack (i.e. intercept, use force against, neutralise, damage or destroy) targets without human intervention”.36 While quite a few states are developing such autonomous weapons or have at least expressed the intent to do so in the future,37 to date, real-world cases of the actual deployment of such systems remain few and far between. In March 2021, a report by the Panel of Experts on Libya addressed to the UN Security Council drew considerable attention by pointing to the purported use of the Turkish-built autonomous lethal drones “STM Kargu-2” in a combat situation between warring factions of the Libyan civil war. According to the report, the unmanned aerial vehicle had been “programmed to attack targets without requiring data connectivity between the operator and the munition: in effect, a true ‘fire, forget and find’ capability”.38 At the same time, other experts cast doubt on the significance of the finding, cautioning that the report had in fact not made clear whether the drone had acted in a truly autonomous fashion when engaging its target.39

Generally speaking, observers have pointed out that in the context of LAWS, “autonomy” will remain a relative concept no matter the actual AI capabilities of the system. As noted by Paul Scharre, no weapon “will be ‘fully autonomous’ in the sense of being able to perform all possible military tasks on its own. Even a system operating in a communications-denied environment will still be bounded in terms of what it is allowed to do. Humans will

36 International Committee of the Red Cross, ‘Autonomous Weapon Systems: Is It Morally Acceptable for a Machine to Make Life and Death Decisions?’ (ICRC, 13 April 2015) <[Link]>.
37 In a recent report, Human Rights Watch listed China, Israel, Russia, South Korea, the United Kingdom, and the United States as investing “heavily” in the development, and Australia, Turkey, as well as other countries as “making investments”, see Wareham M, ‘Stopping Killer Robots. Country Positions on Banning Fully Autonomous Weapons and Retaining Human Control’ (Human Rights Watch 2020) <[Link]>.
38 UN Security Council, ‘Letter Dated 8 March 2021 from the Panel of Experts on Libya Established Pursuant to Resolution 1973 (2011) Addressed to the President of the Security Council’ (UN Security Council 2021) S/2021/229 <[Link]>, para. 63.
39 See Cramer M, ‘A.I. Drone May Have Acted on Its Own in Attacking Fighters, U.N. Says’ The New York Times (3 June 2021) <[Link]>.
still set the parameters for operation and will deploy military systems, choosing the mission they are to perform."40 Despite this reservation, the subject of LAWS, and in particular the question of how much "meaningful human control" must be retained for such a system to be ethically and legally justifiable, remains one of the most hotly debated issues in the context of the use of AI in military applications and the future digital battlefield more generally, both in academic environments and among states. Since 2017, with a mandate from the Meeting of High Contracting Parties to the Convention on Certain Conventional Weapons, a Group of Governmental Experts (GGE) has been attempting to grapple with "questions related to emerging technologies in the area of lethal autonomous weapons systems". In 2019, the GGE published "11 Guiding Principles on LAWS".41

2.3.2 UNMANNED AERIAL VEHICLES AND DRONE SWARMS

While the public debate on AI in the military has been focusing on the issue of LAWS, the technology might be more useful in the short to mid term – and indeed become ubiquitous soon – in regard to applications that have nothing to do with autonomously engaging targets. One of the most important subject areas in this context is the employment of machine-learning algorithms in unmanned aerial vehicles, where they can be used for a variety of tasks such as autonomous navigation or surveillance activities that require only minimum human intervention.42 Generally speaking, drones have become more and more important for military strategy over the past decade, not only as a preferred tool to conduct the "war on terror" but also more recently in the international armed conflict between Azerbaijan and Armenia.43 Potentially even more momentous is the development of robotic swarms that may consist of a large number of UAVs or other (e.g. land- or sea-based) systems.44 Although not yet operational, the technology is set to add a further unprecedented layer of complexity to the future conduct of armed conflict. At its most basic conception, swarms may be defined as "multi-robot systems within which robots coordinate their actions to work collectively towards the execution of a goal".45 Crucially, swarming properly understood implies that the entity taken as a whole is more than the sum of its parts; not only would the individual robots that make up the swarm not be capable of fulfilling the assigned tasks on their own, some complex effects of operating swarms would be inconceivable without the intricate – and to some degree unforeseeable – dynamics that emerge when the individual robots coordinate with each other autonomously, trying to find the

40 Scharre P, 'Between a Roomba and a Terminator: What Is Autonomy?' (War on the Rocks, 18 February 2015) <[Link]and-a-terminator-what-is-autonomy/>.
41 See United Nations Office for Disarmament Affairs, Background on LAWS in the CCW, <[Link]certain-conventional-weapons/background-on-laws-in-the-ccw/>.
42 See Dorsey J and Amaral N, 'Military Drones in Europe: Ensuring Transparency and Accountability' (Chatham House 2021) <[Link]04/[Link]>, p. 15-16.
43 See Crabtree J, 'Gaza and Nagorno-Karabakh Were Glimpses of the Future of Conflict' [2021] Foreign Policy <[Link]karabakh-future-conflict-drones/>; Gady F-S and Stronell A, 'What the Nagorno-Karabakh Conflict Revealed About Future Warfighting' (World Politics Review, 19 November 2020) <[Link]t-the-nagorno-karabakh-conflict-revealed-about-future-warfighting>.
44 See e.g. Royal Marines, 'Drone Swarms Support Commando Forces Trials in a First for the UK's Armed Forces' (Royal Navy, 17 July 2021) <[Link]activity/news/2021/july/17/210715-autonomous-advance-force-4>.
45 Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 24.

most efficient way to complete a mission by sharing resources and flexibly dividing tasks. This "swarm intelligence" may allow for increased coordination and speed in conflict situations.46 The possible uses of swarms for the military are manifold and include "intelligence, surveillance and reconnaissance operations; perimeter surveillance and protection; distributed attacks; overwhelming enemy air defences; force protection; deception; search and rescue operations; countering swarms; and dull, dirty and dangerous tasks".47

Swarms pose intriguing questions about the feasibility and modelling of human control. While every single unit within a swarm acts autonomously, i.e. according to its own algorithmic setup, the swarm is itself an autonomous entity, encompassing the totality of the decentralised, autonomous decisions of each robotic entity. Therefore, human operators can only meaningfully exercise control over the entire swarm but not its constituent parts. In light of its inherent complexity that may result in so-called "emergent behaviours" – i.e. behaviour that only occurs after the testing phase during actual missions – however, some experts have raised doubts as to the predictability and thus controllability of robotic swarming behaviour.48 Others have argued that appropriate "design and modelling approaches" might nonetheless be capable of enabling proper human control.49

Aside from the issue of the possibility of human control, the very architecture of decentralised, autonomous robotic swarms requires a highly reliable communications infrastructure in order to enable both coordination between the individual autonomous entities and human command and control. This necessarily means that swarms are inherently vulnerable to outside interference by means of "jamming, spoofing, hacking, hijacking, manipulation or other electronic warfare attacks".50 From a legal perspective, this implies an increased responsibility for armed forces employing drone swarms to secure such systems and ensure the ability to intervene in case a swarm starts behaving erratically and potentially dangerously for protected persons and objects, which will be addressed in more detail in Part 3 below.

As mentioned, despite some announcements to the contrary,51 swarming technology properly understood has not yet been realised. While there have been reports that during its military campaign in Gaza in May 2021, the Israel Defence Forces became the first military to deploy drone swarms for purposes of ISR in a combat scenario,52 it is questionable whether the large groups of small UAVs were in fact displaying true swarming capabilities by autonomously communicating and coordinating with each other.53
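The notion of a swarm "sharing resources and flexibly dividing tasks" can be made concrete with a deliberately simple sketch. The following is purely illustrative – the unit names, positions, and tasks are invented, and a real swarm would negotiate assignments in a distributed fashion over contested communication links rather than in a single loop:

```python
# Toy model of swarm task division: repeatedly match the closest
# available (unit, task) pair until units or tasks run out. If a unit
# is lost, re-running the same rule re-divides the remaining work
# without any central re-tasking. Positions are 1-D for brevity.

def allocate(units, tasks):
    """Greedy matching of units to tasks by distance."""
    free_units, open_tasks = dict(units), dict(tasks)
    assignment = {}
    while free_units and open_tasks:
        unit, task = min(
            ((u, t) for u in free_units for t in open_tasks),
            key=lambda p: abs(free_units[p[0]] - open_tasks[p[1]]),
        )
        assignment[unit] = task
        del free_units[unit], open_tasks[task]
    return assignment

units = {"uav1": 0.0, "uav2": 5.0, "uav3": 9.0}
tasks = {"scan_A": 1.0, "scan_B": 6.0, "scan_C": 8.0}

plan = allocate(units, tasks)
# if uav2 is jammed or lost, the survivors simply re-allocate;
# the uncovered task would be queued for a later pass
degraded = allocate({u: p for u, p in units.items() if u != "uav2"}, tasks)
```

Even at this toy scale, the control problem discussed above is visible: the division of labour is produced by a rule rather than by an operator's order, and it changes automatically whenever the swarm's composition changes.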

46 Scharre P, 'Unleash the Swarm: The Future of Warfare' (War on the Rocks, 4 March 2015) <[Link]the-future-of-warfare/>.
47 Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 1.
48 See Holland Michel A, 'Known Unknowns: Data Issues and Military Autonomous Systems' (United Nations Institute for Disarmament Research 2021), p. 19.
49 Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 55.
50 Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 54.
51 Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 54.
52 See Makewar A, 'Israel Used First-Ever AI-Guided Combat Drone Swarm in Gaza Attacks' (6 July 2021) <[Link]guided-combat-drone-swarm-gaza-attacks/19940/>; Dunhill J, 'First "AI War": Israel Used World's First AI-Guided Swarm Of Combat Drones In Gaza Attacks' IFL Science (2 July 2021) <[Link]war-israel-used-worlds-first-aiguided-swarm-of-combat-drones-in-gaza-attacks/>.
53 See Gross JA, 'In Apparent World First, IDF Deployed Drone Swarms in Gaza Fighting' The Times of Israel (10 July 2021) <[Link]first-idf-deployed-drone-swarms-in-gaza-fighting/>; Dunhill J, 'First "AI War": Israel Used World's First AI-Guided

2.3.3 INTELLIGENCE, SURVEILLANCE AND RECONNAISSANCE (ISR) AND FUSION ARCHITECTURES

When military decision-makers contemplate the omnibus concept of the "future digital battlefield", a cornerstone of any evolving strategy is the large-scale use of AI for purposes of intelligence, surveillance, and reconnaissance activities, an area that is particularly suitable to take advantage of the capabilities of machine-learning technologies. Since the onset of the global "war on terror" prompted the sweeping expansion of intelligence activities on information and telecommunications network infrastructures to carry out wide-reaching and constant surveillance that became the defining feature of transnational counterterrorism,54 the amount of data recording the behaviour of individuals gathered by states' security apparatuses has grown exponentially, with the result that no human can realistically take stock of, let alone analyse, the available information.55

Today, the vast amounts of data obtained from citizens' online communication or social media activities56 are complemented by and combined with visual or audiovisual feeds gained from a variety of sensors installed on satellites in geostationary or low Earth orbit57 or UAV platforms that autonomously operate in conflict zones or elsewhere, able to cover a wide span of territory,58 as well as the rapidly growing number of internet-of-things (IoT) devices that effectively act as remote sensors. As an expert recently noted succinctly, "pretty much everything is going to be connected; all things are potential sources of information".59 To provide one example, through drone surveillance activities conducted as part of its global counterterrorism operations, in 2017 alone, U.S. Central Command reportedly collected 700,000 hours, or 80 years, of full-motion video material.60 Naturally, to meaningfully examine such amounts of raw data is beyond any human analyst's capabilities;61 only AI systems using machine-learning algorithms can feasibly parse through such massive troves of big data and look for conspicuous behavioural patterns or trends that may support more efficient and faster decision-making in conflict settings, both on an operational and a strategic level, and increase situational awareness on the battlefield.62
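The scale of the triage problem can be illustrated with a minimal, invented example: a filter that scores new observations against a learned baseline and surfaces only statistical outliers for human review. This is a toy stand-in for the far more complex pattern-of-life analytics described above, not a depiction of any fielded system:

```python
# Minimal illustrative triage filter: compare new hourly activity counts
# against a baseline and flag only those that deviate strongly, so that
# human analysts review a handful of outliers instead of the raw stream.
from statistics import mean, stdev

def z_score(value, baseline):
    """How many standard deviations `value` lies from the baseline mean."""
    return (value - mean(baseline)) / stdev(baseline)

baseline = [100, 98, 103, 97, 101, 99, 102, 100, 96, 104]  # normal hours
incoming = {"06:00": 101, "07:00": 99, "08:00": 250}       # new readings

flagged = [hour for hour, count in incoming.items()
           if abs(z_score(count, baseline)) > 3.0]
```

Fielded systems learn much richer baselines from video, signals, and movement data, but the division of labour is the same: the algorithm narrows vast data streams down to the few items a human analyst can actually inspect.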

Swarm Of Combat Drones In Gaza Attacks’ IFL Science (2 Operations Forces, Project Maven, and Twenty-First
July 2021) <[Link] Century Warfare’ (Modern War Institute, 25 August 2020)
war-israel-used-worlds-first-aiguided-swarm-of-combat- <[Link]
drones-in-gaza-attacks/>. forces-project-maven-and-twenty-first-century-warfare/>.
54 59
Margulies J, ‘9/11 Forever’ [2021] The Boston Review Bayley J, ‘Transforming ISR Capabilities through AI,
<[Link] Machine Learning and Big Data: Insights from Dr. Thomas
911-forever>; Bhuta N and Mignot-Mahdavi R, ‘Dangerous Killion, Chief Scientist, NATO’ (Defence IQ, 30 July 2018)
Proportions: Means and Ends in Non-Finite War’ (2021) <[Link]
Asser Research Paper 2021-01, p. 22. technology/news/transforming-isr-capabilities-through-ai-
55
machine-learning-and-big-data>.
See Frisk A, ‘What Is Project Maven? The Pentagon AI
60
Project Google Employees Want out Of’ (Global News, 5 Schultz RH and Clarke RD, ‘Big Data at War: Special
April 2018) <[Link] Operations Forces, Project Maven, and Twenty-First
pentagon-ai-project-maven/>. Century Warfare’ (Modern War Institute, 25 August 2020)
56
<[Link]
Smagh NS, ‘Intelligence, Surveillance, and
forces-project-maven-and-twenty-first-century-warfare/>.
Reconnaissance Design for Great Power Competition’
61
(Congressional Research Service 2020) R46389 Id.
<[Link] p. 16. 62
See Konaev M, ‘With AI, We’ll See Faster Fights, But
57
Stacey E, ‘Future Warfighting in the 2030s: An Interview Longer Wars’ (War on the Rocks, 29 October 2019)
with Franz-Stefan Gady’ (Strife, 9 September 2020) <[Link]
<[Link] faster-fights-but-longer-wars/>; Bayley J, ‘Transforming ISR
warfighting-in-the-2030s-an-interview-with-franz-stefan- Capabilities through AI, Machine Learning and Big Data:
gady/>. Insights from Dr. Thomas Killion, Chief Scientist, NATO’
58
(Defence IQ, 30 July 2018)
Schultz RH and Clarke RD, ‘Big Data at War: Special

The cutting edge of these advanced AI-supported ISR systems consists of so-called platform-independent fusion architectures that harvest data gathered by means of the latest high-fidelity sensor technologies across various platforms (air, space, ground assets) and numerous further sources (including social media activity, phone records, public administrative data about individuals or groups, and other publicly available open-source intelligence (OSINT) datasets), amalgamating vast swathes of unstructured data.63 Experts expect this new generation of ISR to revolutionise military command and control, enabling the implementation of "battlefield management systems" that provide military commanders with an autonomously and dynamically analysed and prioritised, comprehensive operating picture in the field, with the potential to be accessible to all deployed personnel and thus to make decision-making during military operations both faster and more reliable. Ongoing projects that attempt to build large-scale systems capable of integrating these different types of data streams and providing real-time AI-based analysis include "Project Maven" (aka "Algorithmic Warfare Cross-Function Team"), launched in 2017,64 which was recently incorporated into the "Advanced Battle Management System (ABMS)" of the U.S. Air Force as part of the development of a comprehensive "Joint All Domain Command and Control (JADC2)".65 It is conceived as a "network-of-networks that aims to link 'every sensor to every shooter' across air, land, sea, space and cyber".66 According to recent reports, the EU has started to consider funding the development of such technologies as well.67 While more advanced fusion architectures are still in the planning stage,68 at least "Project Maven" has reportedly already supported U.S. counterterrorism missions in the Middle East.69 So far, the system is limited to the ability to assist human operators to process the large quantities of incoming data, lacking the sophistication to provide autonomously generated deductions that adequately take account of the broader context.70 Another long-term U.S.-launched endeavour in this area is "Sentient",71 an "artificial brain" mainly utilising geospatial data gathered from satellites and other sources to detect pattern anomalies that can predict "model

<[Link] Ensuring Transparency and Accountability’ (Chatham


technology/news/transforming-isr-capabilities-through-ai- House 2021)
machine-learning-and-big-data>; Gady F-S, ‘What Does AI <[Link]
Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 04/[Link]>,
2020) <[Link] p. 15-16.
manoeuvre-warfare>. 68
Stacey E, ‘Future Warfighting in the 2030s: An Interview
63
Holland Michel A, ‘There Are Spying Eyes Everywhere – with Franz-Stefan Gady’ (Strife, 9 September 2020)
And Now They Share a Brain’ [2021] Wired <[Link]
<[Link] warfighting-in-the-2030s-an-interview-with-franz-stefan-
everywhere-and-now-they-share-a-brain/>. gady/>.
64 69
Frisk A, ‘What Is Project Maven? The Pentagon AI Project Gady F-S, ‘What Does AI Mean for the Future of
Google Employees Want out Of’ (Global News, 5 April 2018) Manoeuvre Warfare?’ (IISS, 5 May 2020)
<[Link] <[Link]
project-maven/>. manoeuvre-warfare>.
65 70
Barnett J, ‘Latest ABMS Tests Break New Barriers on AI and Horowitz MC and others, ‘Artificial Intelligence and
Edge Cloud Capabilities’ (FedScoop, 18 March 2021) International Security’ (Center for a New American Security
<[Link] 2018)
cybersecurity/>. <[Link]
66 [Link]>, p. 9.
Barnett J, ‘Air Force Moving Project Maven into Advanced
Battle Management System Portfolio’ (FedScoop, 10 August 71
National Reconnaissance Office, ‘NRO Key Talking Points:
2020) <[Link] Sentient’ (September 2016)
forces-advanced-battle-management-system/>. <[Link]
67
Dorsey J and Amaral N, ‘Military Drones in Europe: orAll/051719/[Link]>.

adversaries' potential courses of action".72 Despite the early stage of development of such architectures, experts predict that in the future, the fusion approach will lead to AI-enabled recommender systems technology that is able to "propose courses of action based on real-time analysis of the battlespace (as opposed to past behavior which in complex systems may not predict future behavior)".73

Reporting on the recent campaign launched by the Israel Defence Forces against Palestinian armed groups based in Gaza suggests that it may have been the first armed conflict in which one side directly benefited from the employment of AI-supported ISR that successfully fused data from different sources such as signals intelligence, visual intelligence, human intelligence, and geospatial intelligence in order to generate recommendations for targets such as rocket launchpads or groups of combatants in real time, and even to send out warnings of possible attacks against IDF units to tablets provided to commanders in the field.74 However, it bears noting that this information has not been confirmed by sources independent of the IDF itself.75 Either way, at least spokespeople for the Israeli military went as far as calling the conflict the "first AI war".76

2.3.4 TARGETING

Naturally, the development of both LAWS and fusion architectures for ISR underlines the relevance that military decision-makers envision for AI systems when it comes to the process of targeting during military operations. Having machine-learning algorithms take decisive steps as part of the "kill chain", i.e. steps that directly lead to engaging a military objective, goes beyond "mere" ISR, even if the latter might lead up to a targeting decision, while falling short of actually using force autonomously. This type of task can involve different algorithmic activities by the employed system, depending on the technology and the situation. It was recently reported that the latest iteration of the U.S. Air Force's ABMS is now capable of "directly aid[ing] in zeroing in on a target", which was considered a serious breakthrough.77 Similarly noteworthy are accounts of the assassination of an Iranian nuclear scientist near Tehran in November 2020, in which Israeli intelligence apparently employed a facial recognition system to identify its target immediately prior to the strike, and a remotely controlled machine gun that used an AI system to compensate for the delay in satellite communication between weapon and human operator.78 In both scenarios, it is obvious that

72 Scoles S, 'It's Sentient: Meet the Classified Artificial Brain Being Developed by US Intelligence Programs' [2019] The Verge <[Link]national-reconnaissance-office-spy-satellites-artificial-intelligence-ai>.
73 Konaev M, 'With AI, We'll See Faster Fights, But Longer Wars' (War on the Rocks, 29 October 2019) <[Link]faster-fights-but-longer-wars/>.
74 See Ahronheim A, 'Israel's Operation against Hamas Was the World's First AI War' The Jerusalem Post (27 May 2021) <[Link]news/guardian-of-the-walls-the-first-ai-war-669371>; Ben-Yishai R, 'How Data and AI Drove the IDF Operation in Gaza' YNet News (29 May 2021) <[Link]>; Dar Y, 'Israel Says It Fought World's First "Artificial Intelligence War" Against Hamas' The Eurasian Times (29 May 2021) <[Link]worlds-first-artificial-intelligence-war-against-hamas/>; Kumon T, 'The First AI Conflict? Israel's Gaza Operation Gives Glimpse of Future' Nikkei Asia (28 June 2021) <[Link]relations/The-first-AI-conflict-Israel-s-Gaza-operation-gives-glimpse-of-future>.
75 See Crabtree J, 'Gaza and Nagorno-Karabakh Were Glimpses of the Future of Conflict' [2021] Foreign Policy <[Link]karabakh-future-conflict-drones/>.
76 Gross JA, 'IDF Intelligence Hails Tactical Win in Gaza, Can't Say How Long Calm Will Last' The Times of Israel (27 May 2021) <[Link]tactical-win-over-hamas-but-cant-say-how-long-calm-will-last/>.
77 Barnett J, 'Latest ABMS Tests Break New Barriers on AI and Edge Cloud Capabilities' (FedScoop, 18 March 2021) <[Link]cybersecurity/>.
78 See Bergman R and Fassihi F, 'The Scientist and the A.I.-Assisted, Remote-Control Killing Machine' The New York Times (18 September 2021)

even though the ultimate act of pulling the trigger, and thus the final decision to employ force, remains with a human – and a lot depends on the original mission design and the prior planning of the operation – the AI system was charged with carrying out a substantial proportion of the critical process.

2.3.5 CYBER WARFARE AND AI

The application of machine-learning algorithms and other types of AI to cyber operations has already begun, and it presents obvious advantages for military conduct in cyberspace.79 For instance, the employment of machine learning greatly increases the chances of discovering vulnerabilities in code that the AI software could then exploit autonomously, which potentially leads to greater efficiency and speed of offensive cyber operations. To be sure, the same methods might be used to develop stronger defensive systems that are capable of automatically fending off malware or other adversarial intrusions into networks.80 Furthermore, machine-learning algorithms can be very useful to scan and surveil the online activities of vast numbers of individuals81 or to autonomously "prepare the digital battlefield" by planting malware in an adversary's networks that might be activated remotely in case a conflict breaks out.82 In this context, it is important to note that both the increased employment of machine-learning algorithms and the need to guarantee stable communication links between the different systems and components that constitute the "digital battlefield" greatly increase the attack surface, rendering the entire ecosystem much more susceptible to adversarial cyber conduct.83 The implications of this will be addressed again in Part 3.

2.3.6 INFORMATION WARFARE AND AI

Finally, the employment of artificial intelligence has already proven to greatly increase the potential impact of disinformation campaigns and other types of information warfare,84 enabling any such efforts to become "more efficient, scalable, and widespread".85
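The "scalable" quality cuts both ways: machine-generated campaigns tend to push large volumes of near-duplicate content, which in turn leaves statistical fingerprints. The following sketch of a crude detection heuristic, using entirely invented account names and messages, illustrates the point; operational detection pipelines are, of course, vastly more sophisticated:

```python
# Toy heuristic for spotting coordinated amplification: normalise each
# post into a word-level fingerprint and flag any fingerprint pushed by
# several distinct accounts. All account names and messages are invented.
from collections import defaultdict

def fingerprint(text):
    # aggressive normalisation collapses superficial rewordings
    return " ".join(sorted(text.lower().replace("!", "").split()))

posts = [
    ("acct_1", "The convoy was attacked first!"),
    ("acct_2", "attacked first was the convoy"),
    ("acct_3", "The convoy was ATTACKED first"),
    ("acct_4", "Aid deliveries resumed this morning"),
]

by_fingerprint = defaultdict(set)
for account, text in posts:
    by_fingerprint[fingerprint(text)].add(account)

# flag any message fingerprint promoted by three or more accounts
coordinated = {fp: accounts for fp, accounts in by_fingerprint.items()
               if len(accounts) >= 3}
```

The same asymmetry noted in the text applies here: generating convincing variants at scale has become trivial, while reliably attributing and countering them remains hard.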

83 See generally Herpig S, 'Securing Artificial Intelligence. Part 1: The Attack Surface of Machine Learning and Its Implications' (Stiftung Neue Verantwortung 2019) <[Link][Link]/sites/default/files/[Link]>; Ekelhof M and Persi Paoli G, 'Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems' (United Nations Institute for Disarmament Research 2020), p. 54; Holland Michel A, 'Known Unknowns: Data Issues and Military Autonomous Systems' (United Nations Institute for Disarmament Research 2021), p. 7; Lawson E and Mačák K, 'Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict' (International Committee of the Red Cross 2021), p. 32.
78 (cont.) <[Link][Link]>.
79 See Thornton R and Miron M, 'The Advent of the "Third Revolution in Military Affairs"; Is the UK Now Facing the Threat of AI-Enabled Cyber Warfare(?)' (Defence-In-Depth, 21 July 2020) <[Link]advent-of-the-third-revolution-in-military-affairs-is-the-uk-now-facing-the-threat-of-ai-enabled-cyber-warfare/>; van der Waag-Cowling N, 'Stepping into the Breach: Military Responses to Global Cyber Insecurity' (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]policy/2021/06/17/military-cyber-insecurity/>.
80 See International Committee of the Red Cross, 'International Humanitarian Law and the Challenges of Contemporary Armed Conflicts' (International Committee of the Red Cross 2019), p. 31.
81 See Smagh NS, 'Intelligence, Surveillance, and Reconnaissance Design for Great Power Competition' (Congressional Research Service 2020) R46389 <[Link] p. 16.
82 See Buchanan B and Cunningham FS, 'Preparing the Cyber Battlefield: Assessing a Novel Escalation Risk in a Sino-American Crisis' (2020) 3 Texas National Security Review 54.
84 See International Committee of the Red Cross, 'International Humanitarian Law and the Challenges of Contemporary Armed Conflicts' (International Committee of the Red Cross 2019), p. 31.
85 Horowitz MC and others, 'Artificial Intelligence and International Security' (Center for a New American Security 2018) <[Link][Link]>, p. 5-6.

Possible use cases include the automatic generation of text that can easily be distributed digitally to disseminate false or misleading information,86 the creation of deep-fake audiovisual content, the detection of rifts in a target population's social fabric to maximise a campaign's impact, the deployment of bots to artificially amplify subversive messaging directed at a target population and to pursue automated agenda setting – in particular with more sophisticated bots that are programmed to credibly mimic real people's behaviour87 – or the automatic calibration of micro-targeting methods in order to tailor content to receptive audiences. Machine-learning algorithms might automatically scrape social media to compile massive sets of users' personal behavioural data that can then be analysed to better understand local populations for the purpose of enabling algorithmic predictions as to the type of content that should be used for an operation.88 Potentially even more far-reaching, according to reports, recent advances in natural language processing could even "leverage sentiment analysis to target specific ideological audiences".89

2.4 ROBOTICS AND SENSOR TECHNOLOGIES

The digital revolution of military activities could not fulfil its momentous potential without corresponding advances in robotics and sensor technologies. While sensors are required to produce much of the data that can then be analysed and exploited for further ends in the theatre of conflict – that is, for the "digital" battlefield to emerge in the first place – many applications that utilise that data depend on complex robotic systems to execute the tasks that follow from data analysis. In the legal and policy literature, however, there is a general tendency to consider the implications of advanced robotics as a sub-category of the larger topic of autonomous weapons systems – even though the majority of robots will carry out tasks that are merely automated and do not require any degree of autonomy as properly understood.90

While the processing power behind sensors mounted on UAVs, planes, ships, submarines, satellites, or ground-based vehicles remains crucial to properly take advantage of the tons of data produced by current ISR architectures, recent progress in sensor technology itself plays an important part in the development of the "digital battlefield". Aside from more established sensors such as radar, sonar, video, infrared, and passive RF detection,91 examples of the latest available technologies include infrared search and track (IRST) sensors that work with super-cooled lenses "to search for and classify incredibly faint heat sources at long range"92 or ground-based

86 Villasenor J, 'How to Deal with AI-Enabled Disinformation' (Brookings, 23 November 2020) <[Link]enabled-disinformation/>.
87 Horowitz MC and others, 'Artificial Intelligence and International Security' (Center for a New American Security 2018) <[Link][Link]>, p. 5-6.
88 Jensen BM, Whyte C and Cuomo S, 'Algorithms at War: The Promise, Peril, and Limits of Artificial Intelligence' (2020) 22 International Studies Review 526, 532-33; Horowitz MC and others, 'Artificial Intelligence and International Security' (Center for a New American Security 2018) <[Link][Link]>, p. 5-6.
89 Horowitz MC and others, 'Artificial Intelligence and International Security' (Center for a New American Security 2018) <[Link][Link]>, p. 5-6.
90 See e.g. Winkler JD and others, 'Reflections on the Future of Warfare and Implications for Personnel Policies of the U.S. Department of Defense' (RAND Corporation 2019), p. 14-16.
91 Zheng DE and Carter WA, 'Leveraging the Internet of Things for a More Efficient and Effective Military' (Center for Strategic & International Studies 2015) <[Link][Link]/s3fs-public/legacyfiles/files/publication/150915ZhengLeveragi[Link]>, 14.
92 Bronk J, 'Technological Trends' in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link]

multi-static passive radars that are able to detect "echoes in the background electromagnetic 'noise' of mobile-phone, television and radio transmissions (among others) to track aircraft without needing a primary radar emitter",93 which might soon pose a problem to fighter jets using current stealth technologies. Other recent and noteworthy developments are lasers capable of identifying individuals over long distances by measuring their heartbeat, the remote utilisation of Bluetooth signals emitted by phones and other devices, or computer vision systems that can detect suspicious movements.94

To be sure, these developments in sensor technologies and the latest breakthroughs in artificial intelligence and machine learning as well as computers' increasing processing powers are directly interrelated. As pointed out by Bronk, "the extremely faint nature of the signals which are being tracked and huge number of false-positive readings and background clutter of one sort or another means that their practicality as operational tools is linked directly to the post-processing hardware and software available to refine the raw sensor data into a usable picture".95 The technological progress of the different elements goes hand in hand and is deeply interdependent.

2.5 SPACE TECHNOLOGIES

A further essential aspect of the ongoing digitalisation of the battlefield is the increasing relevance of space assets as the backbone of the information and communications infrastructures that are needed to implement the conception of a constantly interconnected, responsive, and cross-domain "battle network architecture" which links members of the armed forces with sensors, data processing systems, and autonomously operating machines.96 By itself, the utilisation of satellites for communication, ISR, missile warning, and positioning, navigation, and timing (PNT) is nothing new, having been around for decades.97 Especially the U.S., China, and Russia have long-established infrastructures in space for these purposes.98

However, building a networked, digital combat infrastructure that relies on the continuous gathering, processing, and transmission of large amounts of data, for example for AI-supported command and control or remotely controlled means of warfare such as armed drones, puts new emphasis on the sophistication and complexity of space assets.99 According to experts, a number of different types of military offensive cyber operations also depend on satellite-supported networks.100 For this reason, several states have begun to develop and already deploy a new generation of space

[Link]>, p. 61-62.
93 Bronk J, 'Technological Trends' in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link][Link]>, p. 61-62.
94 Holland Michel A, 'There Are Spying Eyes Everywhere – And Now They Share a Brain' [2021] Wired <[Link]everywhere-and-now-they-share-a-brain/>.
95 Bronk J, 'Technological Trends' in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link][Link]>, 62.
96 See Reed J, Routh A and Mariani J, 'Information at the Edge: A Space Architecture for a Future Battle Network' (Deloitte Insights, 16 November 2020) <[Link]c-sector/[Link]>.
97 Defense Intelligence Agency, 'Challenges to Security in Space' (2019) <[Link]y%20Power%20Publications/SpaceThreatV14020119sm.pdf>, p. 8.
98 See id.
99 See International Committee of the Red Cross, 'International Humanitarian Law and the Challenges of Contemporary Armed Conflicts' (International Committee of the Red Cross 2019), p. 32.
100 Stacey E, 'Future Warfighting in the 2030s: An Interview with Franz-Stefan Gady' (Strife, 9 September 2020) <[Link]warfighting-in-the-2030s-an-interview-with-franz-stefan-gady/>.

technologies, among them smaller and more expendable low earth orbit satellites that can be launched into space at much lower cost and in much higher numbers, eventually forming networks of hundreds or even thousands of individual objects able to deliver internet from space, transmit new types of high-resolution earth imagery or even real-time video directly to members of the armed forces during ongoing missions, or support various AI applications in military systems.101 On that account, it seems certain that the importance of space architectures will only grow as the digitalisation of the armed forces progresses further, becoming indispensable, from the perspective of military and political decision-makers, to the actual realisation of the potential of the "digital battlefield". In turn, this development has crucial implications for the required reliability and resilience of space assets against both kinetic attacks and adversarial cyber operations targeting orbiting platforms, communication links, or ground stations – especially in view of the fact that, at least to date, few of these objects exclusively serve military purposes; most are instead of a dual-use nature, providing essential services for the functioning of civilian societies as well.102

2.6 HUMAN ENHANCEMENT TECHNOLOGIES

Further down the line, the all-encompassing digitalisation of the battlefield might not even spare human soldiers themselves. Among the variety of concepts for applying the emergent field of "human enhancement technologies" to members of the armed forces is cybernetics. Broadly speaking, the technology comes down to the development of brain-computer interfaces (BCI) through brain implants or electrodes placed on the scalp or skull, ultimately enabling "seamless two-way interactions between soldiers and machines as well as between humans".103 In the assessment of experts, such cybernetically enhanced individuals could remotely operate UAVs or weapons systems without the need to use joysticks or other instruments, and with potentially increased situational awareness and oversight, while reducing the complexities presented by current stationary user interfaces.104 However, at this point, persistent questions regarding the feasibility and long-term effects of such a technology, for example regarding the reversibility of the implantation of electrodes, render the introduction of BCI in the military unrealistic at least before the year 2030.105 Although BCI might to some extent be considered the logical endpoint of the "digital battlefield", there remains some reluctance, not least in the population at large, in light of the far-reaching ethical implications of such a technology.106

101 Reed J, Routh A and Mariani J, 'Information at the Edge: A Space Architecture for a Future Battle Network' (Deloitte Insights, 16 November 2020) <[Link]>; Stacey E, 'Future Warfighting in the 2030s: An Interview with Franz-Stefan Gady' (Strife, 9 September 2020) <[Link]>.

102 See International Committee of the Red Cross, 'The Potential Human Cost of the Use of Weapons in Outer Space and the Protection Afforded by International Humanitarian Law. Position Paper Submitted by the International Committee of the Red Cross to the Secretary-General of the United Nations on the Issues Outlined in General Assembly Resolution 75/36' (2021), p. 1-2.

103 Shereshevsky Y, 'Are All Soldiers Created Equal? – On the Equal Application of the Law to Enhanced Soldiers' (2021) 61 Virginia Journal of International Law 271, 279.

104 Emanuel P and others, 'Cyborg Soldier 2050: Human/Machine Fusion and the Implications for the Future of the DOD', p. 7.

105 See Bronk J, 'Technological Trends' in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link]>, p. 61.

106 Id., 9-10.
2.7 CONCLUSION: CONVERGENT EFFECTS

There can be no doubt that the described possibilities enabled by the digital revolution of military conduct will to a large extent determine future strategies in armed conflict, i.e. how and when states will deploy their armed forces, and to what ends. While much remains uncertain and will depend on concrete technological breakthroughs in the coming decade, a few overall trends may be predicted that result from convergent effects of the "future digital battlefield".

In the estimation of experts, first, one may observe two interdependent trends that are directly related to the increasing digitalisation: on the one hand, the development by states of anti-access and area-denial capabilities that aim at preventing an adversary from entering a physical or digital space; on the other, an opposing focus on creating opportunities in space and time to penetrate those areas physically or digitally by way of multi-domain operations that leverage the potentials of digital infrastructures. In such a scenario, time becomes a significant commodity, the efficient use of which can be facilitated by the widespread employment of novel digital technologies, in particular those that run on artificial intelligence. Machine-learning algorithms may both help with the quick analysis of incoming data streams from a multitude of sensors across the conflict zone and enable commanders to make faster decisions.107 Whereas this leads to a compression of time, the same technologies allow for much greater remoteness – cyber operations launched against far-away adversaries; remotely controlled UAVs operating on different continents – leading to an expansion of space.

In turn, increased speed makes the further employment of AI indispensable, up to the delegation of critical combat functions such as command-and-control or even decisions concerning the use of force against adversaries. Soon, such autonomous functionalities will become independent of any one specific platform, instead being distributed through complex system-of-systems architectures.108 This development is not least directly correlated with the vastly increased amount of data that is produced by pervasive and constant, digitally enabled intelligence and surveillance activities, both online and, thanks to more sophisticated sensors, in the "physical" domain. As an expert remarked, "[w]e have to depend to some degree on AI and big data, analytic tools, machine learning as mechanisms to allow us to deal with that flood of data in the future and inform decision-making using those tools as part of the process". In other words, the increasing digitalisation of warfare begets the need for ever more digitalisation.109 At some point, far-reaching reliance on and trust in AI-supported assets becomes virtually inevitable as the only way to handle the enormous technological complexity of digitally interconnected military operations, whether decision-makers are actually comfortable with that development or not.110

Furthermore, the digital revolution of military conduct might lead to an increase in less lethal operations – for example by resorting to covert cyber operations that sabotage adversarial systems without the need to employ kinetic force,

107 Horowitz MC and others, 'Artificial Intelligence and International Security' (Center for a New American Security 2018) <[Link]>, p. 9.

108 See Sauer F, 'Autonomy in Weapons Systems: Playing Catch up with Technology' (ICRC Humanitarian Law & Policy, 29 September 2021) <[Link]>.

109 Bayley J, 'Transforming ISR Capabilities through AI, Machine Learning and Big Data: Insights from Dr. Thomas Killion, Chief Scientist, NATO' (Defence IQ, 30 July 2018) <[Link]>.

110 Bronk J, 'Technological Trends' in Peter Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019) <[Link]>, p. 63.
or by launching information operations that achieve military goals by coercing an enemy state – or, while using lethal force, at least to better protection of civilians and civilian objects through the use of weapons that are more precise thanks to AI-supported ISR and command-and-control.111 At the same time, these trends do not necessarily imply a reduction of the potential harm caused by military conduct. As discussed elsewhere, these same technologies, while perhaps less lethal, provide states with entirely novel tools to exert pressure on adversaries, with potentially pervasive and persistent negative systemic effects on affected civilian populations.112 This outlook has led one expert to predict "the predominance of persistent, low-intensity irregular conflict" in the future.113 Instead of battles between armies, what we will see are wars "aimed at the control or coercion of large civilian populations, against whom the violence is now directed" by means of digital tools in cyberspace or the information ecosystem.114

3. IMPLICATIONS FOR HUMANITARIAN PROTECTION

As hinted at in the previous section, the continuing digital revolution of military affairs comes with great potential to make the conduct of armed conflict more efficient and powerful in the future. At the same time, the entirely novel modes of operation that digital technologies enable, especially with the widespread use of AI, have far-reaching and to date not completely understood implications for humanitarian protection, that is, the safeguarding of the rights of protected persons and objects that may be affected by the conduct of warfare in the digital age. Some of the issues concern, for example, the normative reach of established rules of international law. Others cast doubt on the applicability of traditional legal regimes or relate to certain factual risks in connection with the large-scale deployment of digital technologies on the battlefield. The following sections present an outline of some of the most pressing subject areas from a legal perspective. The purpose of this framing exercise is to sketch out the larger questions posed by the "future legal battlefield", without necessarily presenting satisfying answers or providing detailed legal analysis. In total, five broad topics have been identified that merit closer scrutiny from the academic community as well as political and military decision-makers as the digital transformation of the armed forces advances in the coming decade: (1) legal thresholds and the application of

111 This was claimed by the Israel Defence Forces in view of its widespread employment of AI-supported ISR in its military campaign in Gaza in May 2021, see Crabtree J, 'Gaza and Nagorno-Karabakh Were Glimpses of the Future of Conflict' [2021] Foreign Policy <[Link]>.

112 See Geiß R and Lahmann H, 'Protecting Societies – Anchoring a New Protection Dimension in International Law in Times of Increased Cyber Threats' (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>; Geiß R and Lahmann H, 'Protecting the Global Information Space in Times of Armed Conflict' (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>.

113 Van der Waag-Cowling N, 'Stepping into the Breach: Military Responses to Global Cyber Insecurity' (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]>.

114 Id.
existing humanitarian regimes; (2) the further entrenchment of the "military surveillance paradigm"; (3) the continuing dissolution of the spatial and temporal limits of the conflict zone; (4) states' positive obligations concerning the vulnerabilities of digital military technologies; and (5) questions pertaining to human control, accountability, and responsibility. Each subject matter will be addressed in turn.

3.1 THRESHOLD QUESTIONS

The principal legal framework providing humanitarian protection and seeking to limit the negative effects of warfare is the body of international humanitarian law (IHL), also known as the law of armed conflict (LOAC), consisting mainly of the four Geneva Conventions of 1949, the two Additional Protocols of 1977, and corresponding customary IHL. However, according to common Article 2 of the Geneva Conventions, the application of these rules is contingent on the existence of an armed conflict, which is generally considered to presuppose some degree of violence, understood as "the resort to armed force" according to a landmark ruling of the International Criminal Tribunal for the Former Yugoslavia.115

In situations where any of the novel digitally enhanced military equipment is deployed during the course of such an armed struggle between states, or even between a state and a non-state actor as part of a non-international armed conflict, there can thus be no doubt that such activity would be covered by the rules of IHL. However, the proliferation of the above-described digital technologies greatly expands the military toolbox, providing means of conduct that might affect civilian populations or other protected persons and objects without reaching the stated threshold, thus rendering IHL inapplicable to such conduct. The assumption that the future of conflict will continue to become more and more "digital" suggests that the structural entanglement with the civilian sphere will only increase further. In this regard, two broad issue areas can be identified where this threshold issue will mainly play out.

First, as indicated above, the increasing resort to adversarial cyber operations and (dis)information activities demonstrates how novel digital technologies facilitate "persistent, low-intensity irregular conflict"116 that, rather than directly engaging an adversary's armed forces, mainly consists of a strategy of influencing or even coercing its civilian population. This increasing dissolution of the boundaries between the military and the civilian sphere is concerning insofar as conflicts are "fought" beyond the reach of the protective scope of IHL,117 even though there can be no doubt that such conduct has potentially far-reaching ramifications for affected civilian societies and can indeed cause immense harm without the need to ever employ kinetic force at all.118

Second, considering the temporal dimension of the "digital battlefield", certain conduct in the context of the digitalisation of military affairs might be potentially harmful to legally protected persons and objects long before any type of "conflict" as properly understood even begins. For one, certain offensive cyber activities that amount to a "preparation of the battlefield", such

115 ICTY, Tadić Decision on the Defence Motion for Interlocutory Appeal on Jurisdiction, 1995, para. 70.

116 Van der Waag-Cowling N, 'Stepping into the Breach: Military Responses to Global Cyber Insecurity' (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]>.

117 Lawson E and Mačák K, 'Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict' (International Committee of the Red Cross 2021), p. 34.

118 See on this in more detail already Geiß R and Lahmann H, 'Protecting Societies – Anchoring a New Protection Dimension in International Law in Times of Increased Cyber Threats' (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>; Geiß R and Lahmann H, 'Protecting the Global Information Space in Times of Armed Conflict' (Geneva Academy of International Humanitarian Law and Human Rights 2021) <[Link]>.
as the U.S. Cyber Command doctrines of "persistent engagement" and "defend forward",119 are military operations that intentionally remain below the threshold of armed conflict.120 But this does not mean that they do not have the potential to cause harm to civilian infrastructures, either by accident or on purpose, and to negatively affect the overall security and safety of infiltrated networks that might be necessary to control an electric grid, a water treatment facility, or other critical civil infrastructure. Furthermore, the pervasive and constant intelligence and surveillance activities made possible by digital technologies – be it on the internet or by means of satellites, UAVs, or closed-circuit television – affect targeted demographics without it being apparent what type of legal protection might be engaged at all.121

These considerations lead to the conclusion that a default resort to IHL in search of humanitarian protection against such military conduct might be conceptually misguided. Clearly, the matter needs to be framed more comprehensively and not simply through the traditional lens of the laws of armed conflict. However, the problem is that it is not always clear which legal framework is capable of stepping in. One obvious candidate, especially as far as AI-enabled ISR activities in "peacetime" are concerned, is of course international human rights law (IHRL). But despite ramped-up efforts to apply a functional approach to the issue of the jurisdictional scope of human rights treaties,122 to what extent rights such as privacy or general considerations concerning data protection are applicable extraterritorially when it comes to the activities of intelligence agencies remains an open question for the time being.123 The recent decision by the Federal Constitutional Court of Germany to extend the reach of constitutionally guaranteed rights to privacy against the surveillance practices of the Federal Intelligence Service to non-German data subjects located outside of Germany124 has widely been lauded as pointing in the right direction in this regard.125 However, it remains to be seen whether this progressive approach will be adopted more broadly and beyond the context of an individual state.

When it comes to adversarial military cyber and information activities below the threshold of armed conflict, international legal academia has become embroiled in extensive debates surrounding the application and substance of state-centred notions such as the principle or rule of sovereignty or the principle of non-intervention.126 These ongoing discussions, which involve the increasingly active participation of state representatives publicly expressing official positions, for example through the GGE and OEWG processes

119 See Fischerkeller MP and Harknett RJ, 'Persistent Engagement, Agreed Competition, and Cyberspace Interaction Dynamics and Escalation' [2019] The Cyber Defense Review 267.

120 Lawson E and Mačák K, 'Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict' (International Committee of the Red Cross 2021), p. 34.

121 Whether IHL is at all concerned with issues of privacy and data protection will be addressed in the subsequent section.

122 See with regard to the right to life UN Human Rights Committee, 'General Comment No. 36 (2018) on Article 6 of the International Covenant on Civil and Political Rights, on the Right to Life' (2018) CCPR/C/GC/36.

123 See for a progressive and far-reaching approach already Milanovic M, 'Human Rights Treaties and Foreign Surveillance: Privacy in the Digital Age' (2015) 56 Harvard International Law Journal 81.

124 1 BvR 2835/17 (Federal Constitutional Court), judgment of 19 May 2020.

125 See e.g. Çalı B, 'Has "Control over Rights Doctrine" for Extra-Territorial Jurisdiction Come of Age? Karlsruhe, Too, Has Spoken, Now It's Strasbourg's Turn' (EJIL: Talk!, 21 July 2020) <[Link]>; Miller RA, 'The German Constitutional Court Nixes Foreign Surveillance' (Lawfare, 27 May 2020) <[Link]>; Reinke B, 'Rights Reaching beyond Borders' (Verfassungsblog, 30 May 2020) <[Link]>.

126 See only Moynihan H, 'The Application of International Law to State Cyberattacks: Sovereignty and Non-Intervention' (2019) <[Link]>.
at UN level, need not be reiterated here. Suffice it to add that, at least regarding offensive yet low-intensity activities such as those under the "persistent engagement" umbrella, there is not yet any sense of emerging consensus on whether these types of digital military activities meet any limits under international law at all.127

3.2 UTILISATION OF DATA AND THE EMERGENCE OF THE MILITARY SURVEILLANCE PARADIGM

By now it is no more than a truism to state that the digital transformation of society runs on data: none of the major shifts in the digitalisation of the economy, of politics, and of the ways in which people interact with each other in their daily lives would be conceivable without the constant collection and processing of vast amounts of data, whether of the personal or non-personal variety. Naturally, the same holds true for the "future digital battlefield"; both the conduct of military cyber operations and especially any application that employs machine-learning algorithms and other forms of AI are contingent on the availability of data and its real-time collection and analysis.128 However, the implications of this development for the future of humanitarian protection are still not well understood. So far, scholarly debates have mainly revolved around the question of how to legally protect the availability and integrity of data against adversarial cyber operations, most prominently by zooming in on the issue of whether data can be considered an "object" for the purpose of the rules of targeting in IHL.129 The confidentiality of (personal) data, on the other hand, has so far by and large remained below the radar.130

What has been happening in military and security affairs over the past 20 years in relation to the utilisation and exploitation of personal data – gathered through CCTV cameras in cities, by way of monitoring online behaviour, by logging location and other sensitive data produced by smartphones and other networked devices, and more recently through footage shot by cameras mounted on drones in various contexts – has run parallel to the emergence of what has famously been dubbed the "surveillance paradigm" of the digital economy.131 As recently observed by a federal court in the U.S. in a case concerning Facebook's business practices, the social media company "monetizes its personal social networking monopoly principally by selling surveillance-based advertising. Facebook collects data on users both on its platform and across the internet and exploits this deep trove of data about users' activities, interests, and affiliations to sell behavioral advertisements".132 In the security realm, the same principles began to take

127 For a detailed discussion of the legal qualification of "persistent engagement" and "defend forward" and the status of the "rule of sovereignty" see Lahmann H, 'On the Politics and Ideologies of the Sovereignty Discourse in Cyberspace' [forthcoming 2021] Duke Journal of Comparative & International Law.

128 Chahal H, Fedasiuk R and Flynn C, 'Messier Than Oil: Assessing Data Advantage in Military AI' (Center for Security and Emerging Technology, July 2020) <[Link]>.

129 See only Mačák K, 'Military Objectives 2.0: The Case for Interpreting Computer Data as Objects under International Humanitarian Law' (2015) 48 Israel Law Review 55.

130 But see Geiß R and Lahmann H, 'Protection of Data in Armed Conflict' (2021) 97 International Law Studies 556; Lubin A, 'The Rights to Privacy and Data Protection Under International Humanitarian Law and Human Rights Law' in Robert Kolb, Gloria Gaggioli and Pavle Kilibarda (eds), Research Handbook on Human Rights and Humanitarian Law: Further Reflections and Perspectives (2021) <[Link]>.

131 Zuboff S, The Age of Surveillance Capitalism (2019); Zuboff S, 'Be the Friction – Our Response to the New Lords of the Ring' Frankfurter Allgemeine Zeitung (25 June 2013) <[Link]>; Cavoukian A, 'Global Privacy and Security, by Design: Turning the "Privacy vs. Security" Paradigm on Its Head' (2017) 7 Health Technologies 329.

132 Federal Trade Commission v Facebook, Inc [2021] Federal District Court for the District of Columbia 1:20-cv-03590-JEB, para. 3.
hold after the terrorist attacks of September 11, 2001, when political decision-makers concluded that the abstract and vague terrorist threat warranted the establishment of vast surveillance architectures that would utilise the wealth of private data generated by novel digital technologies,133 a development that must be considered in the broader context of the turn towards automated profiling to make decisions about individuals.134 In this regard, the most far-reaching shift in the last few years has been the development and increasing employment of fusion technologies as described above (see section 2.3.3). Whereas in the early stages of the ramped-up surveillance apparatus individuals could still reasonably expect at least a degree of default privacy, due to the fact that it remained difficult to correlate intelligence gathered through different digital sources, fusion technologies render it increasingly impossible to hide "in the spaces between each data point".135

It is crucial to note that from the technological perspective of machine-learning principles, the "surveillance paradigm" is an imperative. The way to train and test machine-learning algorithms is to feed them with large amounts of relevant data that the system uses to autonomously build statistical models to make predictions about future events. Although there have been efforts more recently to develop approaches to ML that do not depend on vast quantities of data, for example so-called "one-shot" learning,136 it has been pointed out that for the time being, "the only consistently and demonstrably reliable method to ensure that machine learning systems are validated against the widest possible degree of variance in data is to increase the size of the data sets on which they are trained and tested".137 Using as much data as possible is the only way to "identify edge cases and develop fail-safe mechanisms to prevent catastrophic outcomes".138 The insight that larger, and more diverse, datasets lead to better outcomes of algorithmic processes139 is particularly relevant when looking at AI-supported military equipment used for ISR or targeting, including but not limited to lethal autonomous weapons systems. Here, the data used both to train the algorithm and during actual missions are almost by default personal and directly relate to human beings, be it to pick out an individual with facial recognition software or to identify a suspicious "pattern of life" that may point to a terrorist who will then be targeted by an armed UAV.140 The more such "pattern of life" analysis is handed over to machine-learning algorithms, the more the success of such operations is directly contingent on constant and pervasive multi-source surveillance of the population in the target area. In turn, the resulting "sensory overload" leads to a flood of data that can then only be handled by automating the process of analysis141 – a mutually reinforcing cycle.
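The statistical point made in the preceding paragraph – that a learned model becomes more reliable as the volume of labelled training data grows – can be illustrated with a deliberately simplified sketch. The toy "nearest-centroid" classifier below is trained on synthetic data; all names, numbers, and the classification task itself are hypothetical and serve purely as an illustration of the underlying principle, not as a depiction of any actual military system:

```python
# Illustrative sketch only (synthetic data, hypothetical task): a toy
# nearest-centroid classifier whose reliability depends on how much
# labelled training data it has seen.
import random
import statistics

def make_samples(n, rng):
    """Draw n labelled points from two overlapping Gaussian classes."""
    samples = []
    for _ in range(n):
        label = rng.choice([0, 1])
        value = rng.gauss(0.0 if label == 0 else 2.0, 1.0)
        samples.append((value, label))
    return samples

def train(samples):
    """'Training' here is just estimating each class's centroid (mean)."""
    by_label = {}
    for value, label in samples:
        by_label.setdefault(label, []).append(value)
    return {label: statistics.fmean(values) for label, values in by_label.items()}

def accuracy(model, test_set):
    """Classify each test point by its nearest centroid and score the result."""
    correct = 0
    for value, label in test_set:
        predicted = min(model, key=lambda lab: abs(value - model[lab]))
        correct += predicted == label
    return correct / len(test_set)

rng = random.Random(0)  # fixed seed for reproducibility
test_set = make_samples(2000, rng)
for n_train in (5, 50, 5000):
    model = train(make_samples(n_train, rng))
    print(f"{n_train:>5} training samples -> accuracy {accuracy(model, test_set):.3f}")
```

Because the model's only "knowledge" is the estimated centroid of each class, its accuracy on held-out samples tends to improve as more training data pins those estimates down – the statistical core of the observation, quoted above, that larger and more diverse datasets lead to better outcomes.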

133 See Evans JC, 'Hijacking Civil Liberties: The USA PATRIOT Act of 2001' (2002) 33 Loyola University Chicago Law Journal 933, 962 et seq.

134 See Kaltheuner F and Bietti E, 'Data Is Power: Towards Additional Guidance on Profiling and Automated Decision-Making in the GDPR' (2018) 2 Journal of Information Rights, Policy and Practice <[Link]>, p. 5.

135 Holland Michel A, 'There Are Spying Eyes Everywhere – And Now They Share a Brain' [2021] Wired <[Link]>.

136 See Flournoy MA, Haines A and Chefitz G, 'Building Trust through Testing' (2020) <[Link]>, p. 9.

137 Holland Michel A, 'Known Unknowns: Data Issues and Military Autonomous Systems' (United Nations Institute for Disarmament Research 2021), p. 27.

138 Flournoy MA, Haines A and Chefitz G, 'Building Trust through Testing' (2020) <[Link]>, p. 9.

139 Taori R and others, 'Measuring Robustness to Natural Distribution Shifts in Image Classification', 34th Conference on Neural Information Processing Systems (2020) <[Link]>, p. 2; Flournoy MA, Haines A and Chefitz G, 'Building Trust through Testing' (2020) <[Link]>, p. 9.

140 See Franz N, 'Targeted Killing and Pattern-of-Life Analysis: Weaponised Media' (2017) 39 Media, Culture & Society 111, 114.

141 See Corrin A, 'Sensory Overload: Military Is Dealing with

Looking at this from a legal angle, as soon as would imply that IHL prescribes such practices.
such algorithmic decision-making systems are As indicated above, these data are not
deployed during situations of armed conflict in necessarily exclusively personal. It will be as
order to support or take targeting decisions, useful for an AI-supported targeting system or an
applicable international humanitarian law might indeed even prescribe such all-encompassing and highly intrusive data collection measures. Article 57(2)(a)(i) Additional Protocol I provides that “those who plan or decide upon an attack shall do everything feasible to verify that the objectives to be attacked are neither civilians nor civilian objects and are not subject to special protection but are military objectives (…) and that it is not prohibited by the provisions of this Protocol to attack them”. Noting that this obligation comprises measures by intelligence agencies to properly analyse and verify targets prior to engagement,142 Asaf Lubin has argued that the principle of precautions in attack as stipulated by Article 57(2) AP I dictates the establishment of a “reasonable intelligence agency” that is able to reliably verify the identity of targets prior to a strike.143 If reliability in a machine-learning system can only – if at all – be achieved with unfettered collection of data relevant for the (geographic) area of deployment, then this […]

[…] UAV tasked with an ISR mission to be able to “recognise” a tank and to be able to distinguish it from a school bus. However, while personal data are perhaps less relevant in regard to near-peer conflicts that play out on the “digital battlefield”, they are very much a defining feature of the “personalised warfare” of post-9/11 counterterrorism operations, in the context of which individuals instead of states have become “imminent threats”. It is in this respect that the “military surveillance paradigm” has really manifested itself, enabled and reinforced by the development of novel digital technologies.144 In regions where this type of conduct is mainly being carried out, operating militaries may claim that their strikes have become more precise, with fewer civilians ending up as “collateral damage”.145 However, it is easy to see how the paradigm can turn into a sophisticated yet sinister form of population control in affected areas, with constant multi-source surveillance creating a situation of “perpetual policing”146 in which the resident civilian population is aware

a Data Deluge’ (Defense Systems, 4 February 2010) <[Link][Link]>.

142 Lubin A, ‘The Rights to Privacy and Data Protection Under International Humanitarian Law and Human Rights Law’ in Robert Kolb, Gloria Gaggioli and Pavle Kilibarda (eds), Research Handbook on Human Rights and Humanitarian Law: Further Reflections and Perspectives (2021) <[Link]061>, p. 25, referring to Sandoz Y, Swinarski C and Zimmermann B, Commentary on the Additional Protocols of 8 June 1977 to the Geneva Conventions of 12 August 1949 (1987), p. 681.

143 See Lubin A, ‘The Reasonable Intelligence Agency’ (2021) 47 The Yale Journal of International Law <[Link]700>.

144 See Bhuta N and Mignot-Mahdavi R, ‘Dangerous Proportions: Means and Ends in Non-Finite War’ (2021) Asser Research Paper 2021-01, p. 20-22.

145 According to Israeli media outlets, this is precisely what happened during the latest IDF campaign in Gaza, thanks to the widespread use of machine-learning systems, see Gross JA, ‘IDF Intelligence Hails Tactical Win in Gaza, Can’t Say How Long Calm Will Last’ The Times of Israel (27 May 2021) <[Link]win-over-hamas-but-cant-say-how-long-calm-will-last/>: “These advanced capabilities were used to sift through the unimaginably massive amounts of data that Military Intelligence intercepts and collects from Gaza — telephone calls, text messages, surveillance camera footage, satellite images and a huge array of various sensors — in order to turn them into usable intelligence information: where will a specific Hamas commander be located at a specific time, for instance. To give a sense of scale of the amount of data being collected, the IDF said it estimates that any given point in the Gaza Strip was photographed at least 10 times each day during the conflict. (…) This allowed Military Intelligence to not only kill several dozen top operatives from Hamas and the Palestinian Islamic Jihad, the second-most significant terror group in the Strip, but also to do so with a smaller number of civilian casualties.”

146 Franz N, ‘Targeted Killing and Pattern-of-Life Analysis: Weaponised Media’ (2017) 39 Media, Culture & Society 111, 112-114.

22 I Working Paper: The Future Digital Battlefield and Challenges for Humanitarian Protection: A Primer
that any deviation from vaguely understood “normal behaviour” might result in a lethal drone strike, because some employed algorithm in an ISR or targeting system flagged said behaviour as likely terrorist activity.147

What emerges, then, is a genuine conflict of interests that appears to be largely unresolved: the rules of IHL seem to require, at least to some extent, the collection of large quantities of personal data in situations in which a machine-learning system is used to support a decision-making process that leads up to the employment of lethal force. This must encourage the sweeping up of data from all available sources to improve target identification. At the same time, for affected populations this implies that the right to privacy is effectively suspended. The ensuing question then becomes whether and how this fundamental human right can be meaningfully realised at all. For one, it is highly doubtful whether IHL provides for any type of data protection safeguarding an individual’s privacy, at least as far as the conduct of hostilities is concerned.148 As indicated in the previous section, this is perhaps of lesser relevance, as most data collection will be conducted during peacetime anyway. However, the application of international human rights law, which in principle would be able to introduce certain procedural safeguards against the limitless collection and processing of sensitive personal data, faces numerous legal obstacles even if the issue of extraterritorial jurisdiction can be overcome, as recently suggested by the German Federal Constitutional Court.149 Most importantly, most data protection regimes are subject to so-called national security exclusions that, as noted by Lubin, “would seem to block the relevance of much of the data protection legal framework to AI applications developed for and utilized in armed conflict, as well as any processing conducted by security and intelligence agencies”.150 Aside from that, it has been observed that current data protection regimes, most importantly the European General Data Protection Regulation, struggle to adequately account for the real challenges posed by algorithmic decision-making systems.151 For the time being, then, the proliferation of AI systems in ISR and targeting as part of the “digital battlefield” seems not to face many legal hurdles, which means that the “military surveillance paradigm” will continue to prevail.

147 How sloppy pattern recognition can end up killing civilians has been demonstrated many times over the course of the “war on terror”, whether machine-learning systems had supported the decision to use force or not; see as a particularly striking example the botched drone strike against a putative ISIS-K member in Kabul on 29 August 2021, Aikins M, ‘Times Investigation: In U.S. Drone Strike, Evidence Suggests No ISIS Bomb’ The New York Times (10 September 2021) <[Link][Link]>.

148 The situation differs in relation to, for example, the treatment of prisoners of war, which is not the subject of this paper.

149 See 1 BvR 2835/17 (Federal Constitutional Court); to be sure, the case concerned the application of German Basic Law and not IHRL.

150 Lubin A, ‘Big Data and the Future of Belligerency: Applying the Rights to Privacy and Data Protection to Wartime Artificial Intelligence’ in Robin Geiß and Henning Lahmann (eds), Research Handbook on Warfare and Artificial Intelligence (2022) <[Link]19195>, p. 9; the author points out that the EU draft proposal for AI regulation explicitly excludes systems developed for military purposes, see Proposal for the Regulation of the European Parliament and the Council Laying Down Harmonised Rules on Artificial Intelligence (Artificial Intelligence Act) and Amending Certain Legislative Acts, COM/2021/206 final (Apr. 21, 2021).

151 Dreyer S and Schulz W, ‘The GDPR and Algorithmic Decision-Making’ (Völkerrechtsblog, 3 June 2019) <[Link]algorithmic-decision-making/>; Dreyer S and Schulz W, ‘The General Data Protection Regulation and Automated Decision-Making: Will It Deliver?’ (Bertelsmann Stiftung 2019) <[Link][Link]/fileadmin/files/BSt/Publikationen/GrauePublikationen/[Link]>.

3.3 THE SPATIAL AND TEMPORAL DISSOLUTION OF THE CONFLICT ZONE

Closely related to the previous two sections, a further consequence of the digital transformation of warfare is a creeping dissolution of the spatial and temporal boundaries of (armed) conflict. For one, this development can partly be interpreted as a conceptual extension of the global “war on terror”, which – as mentioned above – led to an individualisation of warfare: digital technologies make permanent and globally unconstrained surveillance of persons possible, and the use of force against targets is largely decided on the basis of certain characteristics that designate individuals as imminent threats to the security of the acting power. This results in a constant anticipation of violence, which in turn prompts the state to act preemptively against the targeted individual.152 In this sense, global surveillance practices, and the increasing fusion of intelligence from various different sources, create their own sense of infinite and ultimately unsolvable insecurity. As the past two decades have shown, this constant state of quasi-conflict leads to an increase in civilian casualties, mostly due to “over the horizon”153 drone strikes and other types of remote warfare. But perhaps even more significantly, this strategy directly affects the well-being of the civilian populations in countries and areas where these missions are mainly carried out. With the increasing proliferation of AI-supported ISR and targeting technologies, there is no reason to believe that this type of low-key, deterritorialised and perpetual conflict will abate in the coming decades, as its execution will only become easier and thus further entrenched.

The negative effects stemming from constant surveillance might be less severe or even non-existent outside the context of the “war on terror” or other conflicts between states and transnational armed groups, i.e. as far as state-on-state conflict is concerned. But even here, the spatial and temporal dissolution of the conflict zone has increasingly begun to manifest as a consequence of novel digital means of military conduct, mainly due to a spread of offensive cyber activities that enable a persistent presence in adversarial networks, either for the purpose of intelligence gathering – which might involve the copying of sensitive and personal data of other states’ civilian populations – or in order to “prepare the battlefield” as described above. These activities, too, occur during what is, from a legal perspective, appropriately conceived as “peacetime”, yet they further blur the boundaries, resulting in a constant state of quasi- or almost-conflict between states. Although such activity is mostly directed against governmental or official assets, it is important to note that it has potential repercussions for civilian populations as well, for instance as a result of spying activities that affect personal data or of network intrusions that accidentally damage critical civilian infrastructure, especially if the adversarial state employs indiscriminate offensive cyber tools such as self-propagating malware.154 Moreover, the awareness that other states might constantly be present in one’s own networks easily creates the impression of imminent danger and heightened vulnerability – not least given the fact that many new digital weapons technologies enable states to launch attacks faster – which gives states further incentive to respond in kind and collect more intelligence through offensive cyber conduct. This, in turn, might give rise to a feedback loop of an ever-greater perception that the adversary presents a constant threat, a situation that bears

152 Bhuta N and Mignot-Mahdavi R, ‘Dangerous Proportions: Means and Ends in Non-Finite War’ (2021) Asser Research Paper 2021-01, 20-22.

153 Szymanski S and Marchman M, ‘“Over-the-Horizon Operations” in Afghanistan’ (Articles of War, 8 September 2021) <[Link]operations-afghanistan/>.

154 See e.g. the NotPetya malware, a cyber operation that caused immense damage in a number of countries without reaching the “armed conflict” threshold, Greenberg A, ‘The Untold Story of NotPetya, the Most Devastating Cyberattack in History’ [2018] Wired <[Link]ukraine-russia-code-crashed-the-world/>.

considerable risks of unintended escalation.

As explained in section 3.1, and as evidenced by the ongoing war on terror, the current international legal order has not proved appropriately responsive to this type of perpetual, low-intensity warfare. Absent the applicability of international humanitarian law, some experts have argued for a more robust interpretation of peacetime rules that might be able to capture some of the novel kinds of military activities, especially in cyberspace. But as mentioned, to date it remains highly contested whether notions such as “the rule of sovereignty”,155 the principle of non-intervention, or indeed international human rights law are capable of stepping in to provide adequate legal protection against some of the more reckless of such offensive cyber operations.

3.4 STATES’ POSITIVE OBLIGATIONS CONCERNING VULNERABILITIES OF DIGITAL WARFARE TECHNOLOGIES

Observers have repeatedly pointed out that one of the most critical issues in the context of the digitalisation of the armed forces, especially with regard to AI-supported equipment, is the virtually inevitable introduction of considerable cyber vulnerabilities, with potentially far-reaching consequences.156 It is beyond question that no code is ever written without flaws, and the more complex the software, the more likely it becomes that adversaries will be able to hack into and potentially sabotage these digital systems. When it comes to machine-learning algorithms, the malicious exploitation of such vulnerabilities can lead to unforeseeable and ultimately devastating consequences.157 The ensuing risks are a function of the degree of complexity of the system.

For example, if soldiers in the field rely on a “battle management network” that employs fusion technologies to gather and streamline critical information about the current mission, and an adversary gains access to the data streams through an offensive cyber operation, the latter might be in a position to alter the data in a way that results in a misleading picture of the tactical situation,158 potentially putting civilians present in the theatre of conflict in harm’s way. A “spoofing” attack that replaces a machine-learning system’s incoming data feed with a fake one might lead an autonomous vehicle astray and cause it to act erroneously,159 which can likewise result in harm to civilians or civilian objects. Such manipulation might even take place as early as the algorithm’s training stage, by way of “data poisoning”: the injection of directed, corrupted disinformation into the datasets used to train the machine-learning system.160 Attacking AI-supported ISR capabilities in this way might produce a flood of false and untrustworthy intelligence reports that inhibit a military commander’s ability to make informed decisions during combat.161
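The training-stage threat can be made concrete with a deliberately simplified, purely illustrative sketch (our own construction, not drawn from the sources cited here; the one-dimensional “classifier”, the data distributions and the poisoning rate are all assumptions for the sake of exposition): a trivial model is trained once on clean data and once on a dataset into which an adversary has injected mislabelled samples, and the poisoned model’s decision boundary shifts enough to misclassify a substantial share of genuine targets.

```python
# Toy illustration of "data poisoning" (hypothetical and one-dimensional;
# real military ML systems are vastly more complex, but the failure mode
# is the same in kind): mislabelled training samples injected by an
# adversary shift the learned decision boundary.
import random

random.seed(0)

def train_threshold(samples):
    """Learn a 1-D decision rule: the midpoint between the two class means."""
    xs0 = [x for x, y in samples if y == 0]
    xs1 = [x for x, y in samples if y == 1]
    return (sum(xs0) / len(xs0) + sum(xs1) / len(xs1)) / 2

def accuracy(threshold, samples):
    """Fraction of samples classified correctly by 'x > threshold => class 1'."""
    return sum((x > threshold) == (y == 1) for x, y in samples) / len(samples)

# Clean training data: class 0 clusters near 0, class 1 near 8.
clean = [(random.gauss(0, 1), 0) for _ in range(100)] + \
        [(random.gauss(8, 1), 1) for _ in range(100)]

# Poisoning: the adversary injects 50 far-out samples deliberately
# mislabelled as class 0, dragging the class-0 mean -- and with it the
# learned threshold -- upward.
poisoned = clean + [(random.gauss(30, 1), 0) for _ in range(50)]

test_set = [(random.gauss(0, 1), 0) for _ in range(100)] + \
           [(random.gauss(8, 1), 1) for _ in range(100)]

t_clean = train_threshold(clean)
t_poisoned = train_threshold(poisoned)
print(f"clean:    threshold {t_clean:5.2f}, test accuracy {accuracy(t_clean, test_set):.2f}")
print(f"poisoned: threshold {t_poisoned:5.2f}, test accuracy {accuracy(t_poisoned, test_set):.2f}")
```

The point is not the particular numbers but that the corruption is invisible at deployment time: the poisoned model simply misclassifies, and nothing in its outward behaviour signals that the training data, rather than the code, was the attack surface.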
155 See Schmitt MN (ed), Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (2017), rule 4.

156 See Lawson E and Mačák K, ‘Avoiding Civilian Harm from Military Cyber Operations During Armed Conflict’ (International Committee of the Red Cross 2021), p. 32.

157 See Herpig S, ‘Securing Artificial Intelligence. Part 1: The Attack Surface of Machine Learning and Its Implications’ (Stiftung Neue Verantwortung 2019) <[Link][Link]/sites/default/files/[Link]>.

158 Zheng DE and Carter WA, ‘Leveraging the Internet of Things for a More Efficient and Effective Military’ (Center for Strategic & International Studies 2015) <[Link][Link]/s3fs-public/legacyfiles/files/publication/150915ZhengLeveragi[Link]>, p. 20.

159 Holland Michel A, ‘Known Unknowns: Data Issues and Military Autonomous Systems’ (United Nations Institute for Disarmament Research 2021), p. 7.

160 Herpig S, ‘Securing Artificial Intelligence. Part 1: The Attack Surface of Machine Learning and Its Implications’ (Stiftung Neue Verantwortung 2019) <[Link][Link]/sites/default/files/[Link]>, p. 16.

161 Gady F-S, ‘What Does AI Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 2020) <[Link]manoeuvre-warfare>.

These risks from vulnerabilities further increase when immensely complex AI infrastructures like autonomous swarms are involved, which depend on highly sophisticated and stable communication networks that are inherently vulnerable to “jamming, spoofing, hacking, hijacking, manipulation or other electronic warfare attacks”.162 As reported by Gady, military decision-makers seem well aware of these potentially extremely consequential vulnerabilities and do not expect the situation to change fundamentally any time soon.163

These examples demonstrate that the increasing dependency on AI and other digital systems creates real risks not simply for the functioning of these battlefield infrastructures by way of adversarial cyber conduct aiming at disabling or neutralising them, which in itself would not raise any specific legal issues. Much more important for the context at hand is the very real possibility that machine-learning systems might be manipulated so that their behaviour is altered in unpredictable ways, in worst-case scenarios resulting in the erroneous targeting of civilians or other protected persons or objects. In light of the rapidly increasing digitalisation of military assets and ever greater reliance on AI-supported systems, this poses a lasting and serious problem for the future of humanitarian protection.

States that employ these digital technologies in military systems have positive legal obligations to prevent the causation of harm to civilians and other protected persons and objects due to malfunction or erroneous behaviour resulting from an adversarial cyber operation against them. Different rules in international law exist as a basis for this type of obligation. For one, both the duty to test new weapons pursuant to Article 36 AP I and the principle of precautions in attack pursuant to Article 57 AP I contain provisions that address the risk of harm to civilians emanating from employed military systems. According to Article 36 AP I, a state is under an obligation to determine whether the employment of a new weapon, means or method of warfare would, in some or all circumstances, be prohibited by Additional Protocol I or any other applicable rule of international law. Experts have repeatedly advocated the thorough application of this rule as a way to deal with the uncertainties of AI technologies in military assets.164 However, the utility of such review mechanisms is arguably limited. When it comes to AI, it is already questionable whether it is ever possible to test a machine-learning system “in all possible scenarios and with all ranges of inputs”.165 It seems even more far-fetched to expect a review process to reveal all possible vulnerabilities in the system’s source code that at some point in the future might be discovered and subsequently exploited by an adversary. The same holds true for the obligation stemming from Article 57(1) AP I to take constant care in the conduct of military operations to spare the civilian population, civilians and civilian objects. Even though the notion of “military operations” is broad enough to encompass all kinds of uses of AI in military applications – beyond the employment for the purpose of engaging targets, which as an “attack” is more specifically regulated in Article 57(2) AP I – it again cannot reasonably be expected of a state to accurately predict all the ways a machine-learning system might malfunction and harm civilians as the result of an adversarial cyber operation against the system.
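The quoted impossibility of testing “in all possible scenarios and with all ranges of inputs” is easy to make quantitative. The following back-of-the-envelope calculation is our own illustration (the input format is merely an assumed, common one, not tied to any particular system): it counts the distinct inputs a small image classifier could in principle receive.

```python
# Why exhaustive testing of a machine-learning system is infeasible:
# enumerate the input space of a hypothetical image classifier with an
# assumed but typical input format (224x224 pixels, 3 colour channels,
# 8 bits per channel).
import math

width, height, channels, bits = 224, 224, 3, 8
# Each pixel channel takes 2**bits values, so the number of distinct
# inputs is (2**bits)**(width*height*channels); we work in log10 to
# keep the number printable.
log10_inputs = width * height * channels * bits * math.log10(2)

print(f"distinct possible inputs ~ 10^{log10_inputs:,.0f}")
# For scale: the observable universe contains roughly 10^80 atoms.
print(f"orders of magnitude beyond 10^80: {log10_inputs - 80:,.0f}")
```

Even sampling a vanishingly small fraction of such a space is out of reach, which is why an Article 36 review of a learning system can at best probe representative scenarios, never the full input range.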

162 Ekelhof M and Persi Paoli G, ‘Swarm Robotics: Technical and Operational Overview of the Next Generation of Autonomous Systems’ (United Nations Institute for Disarmament Research 2020), p. 54.

163 See Gady F-S, ‘What Does AI Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 2020) <[Link]manoeuvre-warfare>.

164 See Boulanin V and Verbruggen M, ‘Article 36 Reviews: Dealing with the Challenges Posed by Emerging Technologies’ (Stockholm International Peace Research Institute 2017) <[Link]12/[Link]>.

165 Flournoy MA, Haines A and Chefitz G, ‘Building Trust through Testing’ (2020) <[Link]content/uploads/[Link]>, p. 8.

Positive obligations can furthermore be found in peacetime international law, most pertinently in international human rights law under the right to life, provided the issue of extraterritorial application can be overcome.166 This, too, could potentially be interpreted as amounting to a duty to ensure that employed machine-learning systems cannot be manipulated in such a way as to render their behaviour uncontrollable and unpredictable, endangering the life of affected individuals. But whether or not such an obligation based on IHRL is accepted in principle, the applicable standard cannot be assumed to go beyond an obligation of observing due diligence, which would arguably not capture all possible, hardly detectable vulnerabilities in the system’s software. The risk that an adversarial cyber attack leads to unpredictable malfunctioning of an AI system is, for the time being at least, virtually ineradicable.

3.5 HUMAN CONTROL: QUESTIONS PERTAINING TO ACCOUNTABILITY AND RESPONSIBILITY

Finally, urgent questions pertaining to the issue of (meaningful) human control over AI-supported military applications, and a possible “accountability gap” resulting from certain features of these technologies – most significantly an inherent lack of predictability regarding the outcomes of the dynamic processes of machine-learning algorithms167 – have so far mostly been discussed more narrowly in the context of lethal autonomous weapons systems.168 However, there is no reason not to expect these issues to resurface more broadly, for example when machine-learning systems support ISR or similar types of applications, given that the amount of analysed data and the inherent opaqueness of the algorithmic processes will often render effective human oversight and control very difficult. The fundamental question of how to achieve and guarantee meaningful human control is therefore no less important in contexts beyond LAWS – as explicitly acknowledged, for example, by the ICRC.169

The (otherwise persuasive) assertion that, because any decision to employ an AI-supported system must ultimately have been made by an individual, human accountability always remains intact170 can perhaps solve the majority, but likely not all, of the cases concerning unintended harm caused by an algorithm. Especially when it comes to the support of a human decision through the automatic processing and analysis of vast datasets, the ways in which actual human control takes a back seat may be subtle and perhaps barely detectable. While in such scenarios it might seem straightforward to assign accountability to the human operator who relied on the (faulty) automated analysis to take a critical decision, the uncomfortable truth may be that at some point, with ever-increasing amounts of data, humans simply do not retain the cognitive capabilities necessary to assess and evaluate the outcomes of an algorithmic process and can only put their trust in the reliability of the machine or abstain

166 See only Milanovic M and Schmitt MN, ‘Cyber Attacks and Cyber (Mis)Information Operations During a Pandemic’ (2020) 11 Journal of National Security Law & Policy 247, 281-282.

167 See Holland Michel A, ‘Known Unknowns: Data Issues and Military Autonomous Systems’ (United Nations Institute for Disarmament Research 2021), p. 17-18.

168 See only Verdriesen I, Santoni de Sio F and Dignum V, ‘Accountability and Control Over Autonomous Weapon Systems: A Framework for Comprehensive Human Oversight’ (2021) 31 Mind and Machines 137; Chengeta T, ‘Accountability Gap: Autonomous Weapon Systems and Modes of Responsibility in International Law’ (2016) 45 Denver Journal of International Law & Policy 1.

169 International Committee of the Red Cross, ‘Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centred Approach’ (2019) <[Link]and-machine-learning-armed-conflict-human-centred-approach>.

170 See Bayley J, ‘Transforming ISR Capabilities through AI, Machine Learning and Big Data: Insights from Dr. Thomas Killion, Chief Scientist, NATO’ (Defence IQ, 30 July 2018) <[Link]technology/news/transforming-isr-capabilities-through-ai-machine-learning-and-big-data>.

from using it at all.171 Here, the fundamental question remains how to re-establish the possibility of human control in the first place, a question that perhaps cannot be answered by taking a shortcut to default operator accountability. Therefore, much is left to discuss concerning this most critical aspect of the use of AI in the military.

4. CONCLUDING REMARKS

As this framing paper has demonstrated, while the development of the “future digital battlefield” might provide states with hitherto inconceivable opportunities to carry out highly sophisticated, effective, and potentially less lethal and destructive military operations, this in no way means that these operations come without serious risks for civilian populations. The third section has highlighted a few of the intricate legal questions in regard to future humanitarian protection that must urgently be asked as the digitalisation of warfare proceeds at a rapid pace. So far, few issues can be said to have been resolved, and sincere debates must continue, not least among states, over how to ensure that the future of military operations does not turn into a complete dystopia. To that end, the paper may serve as a guideline for future legal, ethical, and political-science research that focuses on the convergent effects of the digital transformation rather than on disparate subject matters such as disinformation campaigns or lethal autonomous weapons systems. Above all else, sections 3.1 to 3.3 have exposed the pressing need to tackle the primary issue of what legal regime is supposed to govern a wide variety of prospective military activities that involve potentially profound ramifications for affected civilian populations. With the emergence of the digital battlefield, the clear-cut distinction between war and peace that is at the root of international humanitarian law is fast becoming obsolete once again, and the broader system of international law must prove responsive to this development so as to remain relevant for the regulation of states’ conduct of warfare.

171 See Herpig S, ‘Securing Artificial Intelligence. Part 1: The Attack Surface of Machine Learning and Its Implications’ (Stiftung Neue Verantwortung 2019) <[Link][Link]/sites/default/files/[Link]>, p. 35: “If a human follows through with a decision based on an analysis provided by machine learning, how much transparency about this analysis is needed and where will this require unconditional trust that the analysis is correct and was not interfered with?”

BIBLIOGRAPHY
 Afina Y, ‘Rage Against the Algorithm: The Risks of Overestimating Military Artificial
Intelligence’ (Chatham House, 27 August 2020) <[Link]
against-algorithm-risks-overestimating-military-artificial-intelligence>
 Ahronheim A, ‘Israel’s Operation against Hamas Was the World’s First AI War’ The Jerusalem Post
(27 May 2021) <[Link]
first-ai-war-669371>
 Aikins M, ‘Times Investigation: In U.S. Drone Strike, Evidence Suggests No ISIS Bomb’ The New
York Times (10 September 2021) <[Link]
[Link]>
 Barnett J, ‘Air Force Moving Project Maven into Advanced Battle Management System Portfolio’
(FedScoop, 10 August 2020) <[Link]
battle-management-system/>
 ——, ‘Latest ABMS Tests Break New Barriers on AI and Edge Cloud Capabilities’ (FedScoop, 18
March 2021) <[Link]
 Barrie D and Childs N, ‘Air Power’s Future: Combat Aircrew Not yet Surplus to Requirements’
(IISS Military Balance Blog, 24 July 2020) <[Link]
balance/2020/07/air-power-future-autonomous-platforms>
 Bayley J, ‘Transforming ISR Capabilities through AI, Machine Learning and Big Data: Insights
from Dr. Thomas Killion, Chief Scientist, NATO’ (Defence IQ, 30 July 2018)
<[Link]
ai-machine-learning-and-big-data>
 Ben-Yishai R, ‘How Data and AI Drove the IDF Operation in Gaza’ YNet News (29 May 2021)
<[Link]
 Bergman R and Fassihi F, ‘The Scientist and the A.I.-Assisted, Remote-Control Killing Machine’
The New York Times (18 September 2021)
<[Link]
[Link]?referringSource=articleShare>
 Bhuta N and Mignot-Mahdavi R, ‘Dangerous Proportions: Means and Ends in Non-Finite War’
(2021) Asser Research Paper 2021-01
 Boulanin V and Verbruggen M, ‘Article 36 Reviews: Dealing with the Challenges Posed by
Emerging Technologies’ (Stockholm International Peace Research Institute 2017)
<[Link]
 Bronk C and Anderson GS, ‘Encounter Battle: Engaging ISIL in Cyberspace’ (2017) 2 The Cyber
Defense Review 93
 Bronk J, ‘Technological Trends’ in Peter Roberts (ed), The Future Conflict Operating Environment Out
to 2030 (RUSI 2019)
<[Link]
 Buchanan B and Cunningham FS, ‘Preparing the Cyber Battlefield: Assessing a Novel Escalation
Risk in a Sino-American Crisis’ (2020) 3 Texas National Security Review 54
 Çalı B, ‘Has “Control over Rights Doctrine” for Extra-Territorial Jurisdiction Come of Age?
Karlsruhe, Too, Has Spoken, Now It’s Strasbourg’s Turn’ (EJIL: Talk!, 21 July 2020)
<[Link]
come-of-age-karlsruhe-too-has-spoken-now-its-strasbourgs-turn/>
 Cavoukian A, ‘Global Privacy and Security, by Design: Turning the “Privacy vs. Security”
Paradigm on Its Head’ (2017) 7 Health Technologies 329
 CBS News, ‘Israel Claims 200 Attacks Predicted, Prevented with Data Tech’ CBS News (12 June
2018) <[Link]
privacy-civil-liberties/>
 Chahal H, Fedasiuk R and Flynn C, ‘Messier than Oil: Assessing Data Advantage in Military AI’
(Center for Security and Emerging Technology 2020)
 Chengeta T, ‘Accountability Gap: Autonomous Weapon Systems and Modes of Responsibility in
International Law’ (2016) 45 Denver Journal of International Law & Policy 1
 Corrin A, ‘Sensory Overload: Military Is Dealing with a Data Deluge’ (Defense Systems, 4 February
2010) <[Link]
[Link]>
 Crabtree J, ‘Gaza and Nagorno-Karabakh Were Glimpses of the Future of Conflict’ [2021] Foreign
Policy <[Link]
 Cramer M, ‘A.I. Drone May Have Acted on Its Own in Attacking Fighters, U.N. Says’ The New York
Times (3 June 2021) <[Link]
 Dar Y, ‘Israel Says It Fought World’s First “Artificial Intelligence War” Against Hamas’ The
Eurasian Times (29 May 2021) <[Link]
artificial-intelligence-war-against-hamas/>
 Defense Intelligence Agency, ‘Challenges to Security in Space’ (2019)
<[Link]
Threat_V14_020119_sm.pdf>
 Delerue F, Cyber Operations and International Law (2020)
 Dorsey J and Amaral N, ‘Military Drones in Europe: Ensuring Transparency and Accountability’
(Chatham House 2021) <[Link]
[Link]>
 ——, ‘Transparency, Accountability and Legitimacy—Chatham House Report on Military
Drones in Europe, Part I’ (Opinio Juris, 21 May 2021)
<[Link]
report-on-military-drones-in-europe-part-i/>
 Dreyer S and Schulz W, ‘The General Data Protection Regulation and Automated Decision-
Making: Will It Deliver?’ (Bertelsmann Stiftung 2019) <[Link]
[Link]/fileadmin/files/BSt/Publikationen/GrauePublikationen/[Link]>
 ——, ‘The GDPR and Algorithmic Decision-Making’ (Völkerrechtsblog, 3 June 2019)
<[Link]

 Dunhill J, ‘First “AI War”: Israel Used World’s First AI-Guided Swarm Of Combat Drones In Gaza
Attacks’ IFL Science (2 July 2021) <[Link]
worlds-first-aiguided-swarm-of-combat-drones-in-gaza-attacks/>
 Ekelhof M and Persi Paoli G, ‘Swarm Robotics: Technical and Operational Overview of the Next
Generation of Autonomous Systems’ (United Nations Institute for Disarmament Research 2020)
 Emanuel P and others, ‘Cyborg Soldier 2050: Human/Machine Fusion and the Implications for
the Future of the DOD’
 Evans JC, ‘Hijacking Civil Liberties: The USA PATRIOT Act of 2001’ (2002) 33 Loyola University
Chicago Law Journal 933
 Fischerkeller MP and Harknett RJ, ‘Persistent Engagement, Agreed Competition, and Cyberspace
Interaction Dynamics and Escalation’ [2019] The Cyber Defense Review 267
 Flournoy MA, Haines A and Chefitz G, ‘Building Trust through Testing’ (2020)
<[Link]
 Franz N, ‘Targeted Killing and Pattern-of-Life Analysis: Weaponised Media’ (2017) 39 Media,
Culture & Society 111
 Frisk A, ‘What Is Project Maven? The Pentagon AI Project Google Employees Want out Of’ (Global
News, 5 April 2018) <[Link]
 Gady F-S, ‘What Does AI Mean for the Future of Manoeuvre Warfare?’ (IISS, 5 May 2020)
<[Link]
 Gady F-S and Stronell A, ‘What the Nagorno-Karabakh Conflict Revealed About Future
Warfighting’ (World Politics Review, 19 November 2020)
<[Link]
revealed-about-future-warfighting>
 Geiß R and Lahmann H, ‘Protection of Data in Armed Conflict’ (2021) 97 International Law
Studies 556
 ——, ‘Protecting Societies - Anchoring A New Protection Dimension In International Law In
Times Of Increased Cyber Threats’ (Geneva Academy of International Humanitarian Law and
Human Rights 2021) <[Link]
papers/Protecting%20Societies%20-%[Link]>
 ——, ‘Protecting the Global Information Space in Times of Armed Conflict’ (Geneva Academy of
International Humanitarian Law and Human Rights 2021) <[Link]
[Link]/joomlatools-files/docman-files/working-
papers/Protecting%20the%20Global%20information%20space%20in%20times%20of%20arm
ed%[Link]>
 Greenberg A, ‘The Untold Story of NotPetya, the Most Devastating Cyberattack in History’ [2018]
Wired <[Link]
world/>
 Greenwald G, ‘Inside the Mind of NSA Chief Gen Keith Alexander’ The Guardian (15 September
2013) <[Link]
star-trek>
 Gross JA, ‘IDF Intelligence Hails Tactical Win in Gaza, Can’t Say How Long Calm Will Last’ The
Times of Israel (27 May 2021) <[Link]
hamas-but-cant-say-how-long-calm-will-last/>
 ——, ‘In Apparent World First, IDF Deployed Drone Swarms in Gaza Fighting’ The Times of Israel
(10 July 2021) <[Link]
swarms-in-gaza-fighting/>
 Hammes TX, ‘The Future of Warfare: Small, Many, Smart vs. Few & Exquisite?’ (War on the Rocks,
16 July 2014) <[Link]
few-exquisite/>
 Harris M, ‘Phantom Warships Are Courting Chaos in Conflict Zones’ [2021] Wired
<[Link]
crimea/?utm_source=pocket_mylist>
 Herpig S, ‘Securing Artificial Intelligence. Part 1: The Attack Surface of Machine Learning and Its
Implications’ (Stiftung Neue Verantwortung 2019) <[Link]
[Link]/sites/default/files/securing_artificial_intelligence.pdf>
 Hickman PL, ‘The Future of Warfare Will Continue to Be Human’ (War on the Rocks, 12 May 2020)
<[Link]
 Hoffman S, ‘The U.S.-China Data Fight Is Only Getting Started’ [2021] Foreign Policy
<[Link]
policies/?utm_source=pocket_mylist>
 Hoffman S and Attrill N, ‘Supply Chains and the Global Data Collection Ecosystem’ (Australian
Strategic Policy Institute 2021) Policy Brief 45/2021 <[Link]
[Link]/ad-aspi/2021-
06/Supply%[Link]?VersionId=56J_tt8xYXYvsMuhriQt5dSsr92ADaZH>
 Holland Michel A, ‘Known Unknowns: Data Issues and Military Autonomous Systems’ (United
Nations Institute for Disarmament Research 2021)
 ——, ‘There Are Spying Eyes Everywhere – And Now They Share a Brain’ [2021] Wired
<[Link]
 Horowitz MC and others, ‘Artificial Intelligence and International Security’ (Center for a New
American Security 2018) <[Link]
International-Security-July-2018_Final.pdf>
 International Committee of the Red Cross, ‘Autonomous Weapon Systems: Is It Morally
Acceptable for a Machine to Make Life and Death Decisions?’ (ICRC, 13 April 2015)
<[Link]
 ——, ‘Autonomy, Artificial Intelligence and Robotics: Technical Aspects of Human Control’
(2019)
 ——, ‘International Humanitarian Law and the Challenges of Contemporary Armed Conflicts’
(International Committee of the Red Cross 2019)
 ——, ‘The Potential Human Cost of Cyber Operations’ (2019)
<[Link]
 ——, ‘Artificial Intelligence and Machine Learning in Armed Conflict: A Human-Centred
Approach’ (2019) <[Link]
learning-armed-conflict-human-centred-approach>
 ——, ‘The Potential Human Cost of the Use of Weapons in Outer Space and the Protection
Afforded by International Humanitarian Law. Position Paper Submitted by the International
Committee of the Red Cross to the Secretary-General of the United Nations on the Issues Outlined
in General Assembly Resolution 75/36’ (2021)
 Jensen BM, Whyte C and Cuomo S, ‘Algorithms at War: The Promise, Peril, and Limits of Artificial
Intelligence’ (2020) 22 International Studies Review 526
 Kaltheuner F and Bietti E, ‘Data Is Power: Towards Additional Guidance on Profiling and
Automated Decision-Making in the GDPR’ (2018) 2 Journal of Information Rights, Policy and
Practice <[Link]
 Konaev M, ‘With AI, We’ll See Faster Fights, But Longer Wars’ (War on the Rocks, 29 October 2019)
<[Link]
 Kumon T, ‘The First AI Conflict? Israel’s Gaza Operation Gives Glimpse of Future’ Nikkei Asia (28
June 2021) <[Link]
Gaza-operation-gives-glimpse-of-future>
 Lahmann H, ‘On the Politics and Ideologies of the Sovereignty Discourse in Cyberspace’ [2021]
Duke Journal of Comparative & International Law
 Lawson E, ‘Into the Ether: Considering the Impact of the Electromagnetic Environment and
Cyberspace on the Operating Environment’ in Peter Roberts (ed), The Future Conflict Operating
Environment Out to 2030 (RUSI 2019)
<[Link]
 Lawson E and Mačák K, ‘Avoiding Civilian Harm from Military Cyber Operations During Armed
Conflict’ (International Committee of the Red Cross 2021)
 Lewis DA, ‘Legal Reviews of Weapons, Means and Methods of Warfare Involving Artificial
Intelligence: 16 Elements to Consider’ (ICRC Humanitarian Law & Policy, 21 March 2019)
<[Link]
warfare-artificial-intelligence-16-elements-consider/>
 Lubin A, ‘The Reasonable Intelligence Agency’ (2021) 47 The Yale Journal of International Law
<[Link]
 ——, ‘The Rights to Privacy and Data Protection Under International Humanitarian Law and
Human Rights Law’ in Robert Kolb, Gloria Gaggioli and Pavle Kilibarda (eds), Research Handbook
on Human Rights and Humanitarian Law: Further Reflections and Perspectives (2021)
<[Link]
 ——, ‘Big Data and the Future of Belligerency: Applying the Rights to Privacy and Data Protection
to Wartime Artificial Intelligence’ in Robin Geiß and Henning Lahmann (eds), Research Handbook
on Warfare and Artificial Intelligence (2022)
<[Link]
 Makewar A, ‘Israel Used First-Ever AI-Guided Combat Drone Swarm in Gaza Attacks’ (6 July 2021)
<[Link]
attacks/19940/>
 Margulies J, ‘9/11 Forever’ [2021] The Boston Review <[Link]
security/joseph-margulies-911-forever>
 Margulies J and Azmy B, ‘The Humanity of Michael Ratner, The Fabrications of Samuel Moyn’
(Just Security, 13 September 2021) <[Link]
ratner-the-fabrications-of-samuel-moyn/?utm_source=pocket_mylist>
 Mégret F, ‘Are There “Inherently Sovereign Functions” in International Law?’ (2021) 115
American Journal of International Law 452
 Milanovic M, ‘Human Rights Treaties and Foreign Surveillance: Privacy in the Digital Age’ (2015)
56 Harvard International Law Journal 81
 Milanovic M and Schmitt MN, ‘Cyber Attacks and Cyber (Mis)Information Operations During a
Pandemic’ (2020) 11 Journal of National Security Law & Policy 247
 Miller RA, ‘The German Constitutional Court Nixes Foreign Surveillance’ (Lawfare, 27 May 2020)
<[Link]
 Moynihan H, ‘The Application of International Law to State Cyberattacks: Sovereignty and Non-
Intervention’ (2019)
<[Link]
[Link]>
 National Reconnaissance Office, ‘NRO Key Talking Points: Sentient’ (September 2016)
<[Link]
00108_C05112983.pdf>
 Park D and Walstrom M, ‘Cyberattack on Critical Infrastructure: Russia and the Ukrainian Power
Grid Attacks’ (The Henry M. Jackson School of International Studies, 11 October 2017)
<[Link]
grid-attacks/>
 Pethokoukis J, ‘How AI Is like That Other General Purpose Technology, Electricity’ (AEIdeas, 25
November 2019) <[Link]
technology-electricity/>
 Reed J, Routh A and Mariani J, ‘Information at the Edge: A Space Architecture for a Future Battle
Network’ (Deloitte Insights, 16 November 2020)
<[Link]
[Link]>
 Reinke B, ‘Rights Reaching beyond Borders’ (Verfassungsblog, 30 May 2020)
<[Link]
 Royal Marines, ‘Drone Swarms Support Commando Forces Trials in a First for the UK’s Armed
Forces’ (Royal Navy, 17 July 2021) <[Link]
activity/news/2021/july/17/210715-autonomous-advance-force-4>
 Sandoz Y, Swinarski C and Zimmermann B, Commentary on the Additional Protocols of 8 June 1977
to the Geneva Conventions of 12 August 1949 (1987)
 Sauer F, ‘Autonomy in Weapons Systems: Playing Catch up with Technology’ (ICRC
Humanitarian Law & Policy, 29 September 2021) <[Link]
policy/2021/09/29/autonomous-weapons-systems-technology/>
 Scharre P, ‘Between a Roomba and a Terminator: What Is Autonomy?’ (War on the Rocks, 18
February 2015) <[Link]
is-autonomy/>
 ——, ‘Robots at War and the Quality of Quantity’ (War on the Rocks, 26 February 2015)
<[Link]
 ——, ‘Unleash the Swarm: The Future of Warfare’ (War on the Rocks, 4 March 2015)
<[Link]
 Schmitt MN (ed), Tallinn Manual 2.0 on the International Law Applicable to Cyber Operations (2017)
 Schultz RH and Clarke RD, ‘Big Data at War: Special Operations Forces, Project Maven, and
Twenty-First Century Warfare’ (Modern War Institute, 25 August 2020)
<[Link]
first-century-warfare/>
 Scoles S, ‘It’s Sentient: Meet the Classified Artificial Brain Being Developed by US Intelligence
Programs’ [2019] The Verge <[Link]
reconnaissance-office-spy-satellites-artificial-intelligence-ai>
 Shereshevsky Y, ‘Are All Soldiers Created Equal? – On the Equal Application of the Law to
Enhanced Soldiers’ (2021) 61 Virginia Journal of International Law 271
 Smagh NS, ‘Intelligence, Surveillance, and Reconnaissance Design for Great Power Competition’
(Congressional Research Service 2020) R46389 <[Link]
 Stacey E, ‘The Future of Cyber Warfare – An Interview with Greg Austin’ (Strife, 26 April 2020)
<[Link]
austin/>
 ——, ‘Future Warfighting in the 2030s: An Interview with Franz-Stefan Gady’ (Strife, 9 September
2020) <[Link]
with-franz-stefan-gady/>
 Stickings A, ‘Space, Strategic Advantage and Control of the Military High Ground’ in Peter
Roberts (ed), The Future Conflict Operating Environment Out to 2030 (RUSI 2019)
<[Link]
 Stumborg M, ‘See You in a Month: AI’s Long Data Tail’ (War on the Rocks, 17 October 2019)
<[Link]
 Szymanski S and Marchman M, ‘“Over-the-Horizon Operations” in Afghanistan’ (Articles of War,
8 September 2021) <[Link]
 Taori R and others, ‘Measuring Robustness to Natural Distribution Shifts in Image Classification’,
34th Conference on Neural Information Processing Systems (2020) <[Link]
[Link]/paper/2020/file/[Link]>
 Temple-Raston D, ‘How the U.S. Hacked ISIS’ (NPR, 26 September 2019)
<[Link]
 The Economist, ‘Open-Source Intelligence Challenges State Monopolies on Information’ [2021]
The Economist <[Link]
challenges-state-monopolies-on-information>
 Thornton R and Miron M, ‘The Advent of the “Third Revolution in Military Affairs”: Is the UK
Now Facing the Threat of AI-Enabled Cyber Warfare?’ (Defence-In-Depth, 21 July 2020)
<[Link]
the-uk-now-facing-the-threat-of-ai-enabled-cyber-warfare/>
 Toomey P, ‘Caught In the Internet: For the NSA, Phones Were Only the Beginning’ [2015] Foreign
Affairs <[Link]
 UN Human Rights Committee, ‘General Comment No. 36 (2018) on Article 6 of the International
Covenant on Civil and Political Rights, on the Right to Life’ (2018) CCPR/C/GC/36
 UN Security Council, ‘Letter Dated 8 March 2021 from the Panel of Experts on Libya Established
Pursuant to Resolution 1973 (2011) Addressed to the President of the Security Council’ (UN
Security Council 2021) S/2021/229 <[Link]
 van der Waag-Cowling N, ‘Stepping into the Breach: Military Responses to Global Cyber
Insecurity’ (ICRC Humanitarian Law & Policy, 17 June 2021) <[Link]
policy/2021/06/17/military-cyber-insecurity/>
 Verdiesen I, Santoni de Sio F and Dignum V, ‘Accountability and Control Over Autonomous
Weapon Systems: A Framework for Comprehensive Human Oversight’ (2021) 31 Minds and
Machines 137
 Vergun D, ‘Experts Predict Artificial Intelligence Will Transform Warfare’ (DoD News, 5 June
2020) <[Link]
artificial-intelligence-will-transform-warfare/>
 Villasenor J, ‘How to Deal with AI-Enabled Disinformation’ (Brookings, 23 November 2020)
<[Link]
 Walch K, ‘Is AI Overhyped?’ [2020] Forbes
<[Link]
 Wareham M, ‘Stopping Killer Robots. Country Positions on Banning Fully Autonomous
Weapons and Retaining Human Control’ (Human Rights Watch 2020)
<[Link]
fully-autonomous-weapons-and#>
 Warrell H, ‘UK Targeted ISIS Drones and Online Servers in Cyber Attack’ Financial Times (7
February 2021) <[Link]
 Winkler JD and others, ‘Reflections on the Future of Warfare and Implications for Personnel
Policies of the U.S. Department of Defense’ (RAND Corporation 2019)
 Work J, ‘The American Way of Cyber Warfare and the Case of ISIS’ (Atlantic Council, 17 September
2019) <[Link]
warfare-and-the-case-of-isis/>
 Zetter K, ‘NATO Researchers: Stuxnet Attack on Iran Was Illegal “Act of Force”’ (Wired, 25 March
2013) <[Link]
 Zheng DE and Carter WA, ‘Leveraging the Internet of Things for a More Efficient and Effective
Military’ (Center for Strategic & International Studies 2015) <[Link]
[Link]/s3fs-
public/legacy_files/files/publication/150915_Zheng_LeveragingInternet_WEB.pdf>
 Zimmermann A, ‘Stop Building Bad AI’ [2021] Boston Review <[Link]
nature/annette-zimmermann-stop-building-bad-ai>
 Zuboff S, ‘Be the Friction - Our Response to the New Lords of the Ring’ Frankfurter Allgemeine
Zeitung (25 June 2013) <[Link]
[Link]>
 ——, The Age of Surveillance Capitalism (2019)
 Federal Trade Commission v Facebook, Inc [2021] Federal District Court for the District of Columbia
1:20-cv-03590-JEB
 1 BvR 2835/17 (German Federal Constitutional Court, Judgment of 19 May 2020)

The Geneva Academy of International Humanitarian Law and Human Rights
The Geneva Academy provides post-graduate education, conducts academic legal research and policy studies, and organizes
training courses and expert meetings. We concentrate on branches of international law that relate to situations of armed
conflict, protracted violence, and protection of human rights.

The Geneva Academy of International Humanitarian Law and Human Rights
Villa Moynier
Rue de Lausanne 120B
CP 1063 - 1211 Geneva 1 - Switzerland
Phone: +41 (22) 908 44 83
Email: info@[Link]
[Link]

© The Geneva Academy of International Humanitarian Law and Human Rights
This work is licensed for use under a Creative Commons Attribution-Non-Commercial-Share Alike 4.0 International License (CC BY-NC-ND 4.0).
