Drones and Responsibility

Legal, Philosophical and Socio-Technical Perspectives on Remotely Controlled Weapons

Ezio Di Nucci, Filippo Santoni de Sio

About This Book

How does the use of military drones affect the legal, political, and moral responsibility of the different actors involved in their deployment and design? This volume offers a fresh contribution to the ethics of drone warfare by providing, for the first time, a systematic interdisciplinary discussion of the different responsibility issues raised by military drones. The book discusses four main sets of questions. First, from a legal point of view, we analyse the ways in which the use of drones complicates the attribution of criminal responsibility to individuals for war crimes, and what adjustments may be required in international criminal law and in military practices to avoid 'responsibility gaps' in warfare. Second, from a moral and political perspective, the volume looks at the conditions under which the use of military drones by states is impermissible, permissible, or even obligatory, and at the responsibilities of a state, in its use of drones, towards both its citizens and potential targets. Third, from a socio-technical perspective, what kinds of new human-machine interaction might (and should) drones bring, and which new kinds of shared agency and responsibility? Finally, we ask how the use of drones changes our conception of agency and responsibility. The book will be of interest to scholars and students in (military) ethics and to those in law, politics and the military involved in the design, deployment and evaluation of military drones.


Part I Drones and Legal Responsibility

DOI: 10.4324/9781315578187-2

1 Autonomous Drones and Individual Criminal Responsibility

Dan Saxon
DOI: 10.4324/9781315578187-3

1 Introduction

This chapter describes how the introduction of autonomous drones to the battlespace may affect efforts to hold individuals responsible for violations of the laws of war and gross violations of International Human Rights Law, such as crimes against humanity. 1 It reviews theories of criminal responsibility that may be applicable when autonomous drones are employed to perpetrate crimes. I argue that the use of kinetic autonomous drones need not produce unacceptable ‘accountability gaps.’ However, an ‘accountability dichotomy’ may develop between states with the resources to field autonomous drones that can communicate and record all relevant military data and less sophisticated states and armed groups that lack access to this technology. Moreover, as the speed and military advantage of autonomous technology increase, a danger exists that drone operators and their commanders will gradually cede their human control and supervision over such weapons. In that scenario, proving individual criminal responsibility for incidents involving autonomous drones will be more challenging. For the purposes of this chapter, the term ‘autonomous drones’ refers to airborne weapon platforms that have the capability to make and execute targeting decisions without human supervision (U.S. Department of Defense 2012: 13–14).

1.1 Autonomous Drones and Challenges for Commanders

Under international law, the existence of individual criminal responsibility often turns on the intent or mens rea of the accused. Deployment of an autonomous weapon system, however, involves a far more complex mental process than, for example, firing a pistol at the enemy or deploying a unit of human-operated drones in the battlespace. The soldier who aims her pistol at a human target (1) will understand the capability of that weapon to injure or kill the target and (2) will ‘know,’ to the best of her ability, whether the target is an enemy combatant or a civilian. Similarly, the General who deploys a fleet of drones into a battlespace must consider, inter alia, (1) the range, accuracy, and explosive power of the drone’s weaponry to be directed at the enemy, (2) the possible presence of civilians in the area, and (3) the levels of training, stamina, discipline, and supervision of the drone’s human pilots, including their ability to comply with International Humanitarian Law (IHL). 2 Thus, the General who deploys human drone crews must understand and have confidence in the ability of those soldiers to think about the changing circumstances of their battlespace and to make appropriate judgments consistent with their mission.
Let us assume, however, that in 2065 the General deploys a brigade of autonomous drones. These robotic airborne vehicles, possessing great firepower, would have sufficient artificial intelligence to locate and identify targets that have been programmed into their computer software and destroy those targets, while respecting the rules of IHL and any pertinent rules of engagement. The option to deploy unmanned aircraft without human pilots and gunners on board could save many lives.
Nevertheless, this option would also increase the demands on the General’s already taxed mental capacity. Prior to such deployment, the General must consider:
  1. the first two points previously mentioned vis-à-vis human-operated drones;
  2. the autonomous drone’s ability to comply with IHL in the particular (fluid or not) battlespace;
  3. whether the mission or the expected circumstances of the battlespace may require increased levels of human supervision and control over the robotic airborne vehicles (i.e. restrictions on autonomy);
  4. whether the General and/or her staff will have the capacity to immediately deactivate the autonomous drone should conditions require it;
  5. the robustness of the software that operates the autonomous drone’s artificial intelligence, in particular whether enemy forces may be able to tamper with and/or take control of the autonomous drone(s);
  6. the level of training – technical, operational and with respect to the laws of war – of the human ‘operators’ or monitors of the autonomous weapon systems (if any); and
  7. whether the General and/or her staff will be able to monitor the rapid communications that will occur between autonomous drones and intervene when necessary.
This last challenge will be particularly complex if the autonomous drones are deployed as a form of ‘swarm’ technology, i.e. large numbers of autonomous mobile weapon platforms that communicate amongst themselves and adjust their behaviour – individually and as a ‘swarm’ – when necessary to achieve the mission. 3
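The General's pre-deployment assessment can be thought of as a conjunctive checklist: deployment is defensible only if every consideration is satisfied, and a known deficiency on any single point is enough to block it. The following is a purely illustrative sketch of that reading; the chapter proposes no such formalism, and all class and field names here are invented for this example.

```python
from dataclasses import dataclass, fields

@dataclass
class PreDeploymentChecklist:
    """Hypothetical checklist mirroring the seven pre-deployment
    considerations discussed in the text; names are illustrative only."""
    conventional_factors_assessed: bool       # weaponry, civilians, crews
    ihl_compliance_in_battlespace: bool       # IHL compliance in this battlespace
    supervision_level_appropriate: bool       # restrictions on autonomy, if needed
    immediate_deactivation_possible: bool     # capacity to deactivate at once
    software_robust_against_tampering: bool   # resistance to enemy interference
    operators_adequately_trained: bool        # technical/operational/IHL training
    swarm_comms_monitorable: bool             # ability to monitor and intervene

    def clear_to_deploy(self) -> bool:
        # Conjunctive test: a single unmet consideration blocks deployment.
        return all(getattr(self, f.name) for f in fields(self))
```

On this reading the assessment is conjunctive rather than a weighing exercise, which fits the argument that deploying with knowledge of, or deliberate indifference to, even one deficiency can ground accountability for an indiscriminate attack.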
Should one or more of the autonomous drones attack and kill civilians, on what basis might individual criminal responsibility accrue to the General who ordered their deployment? If the General intentionally deployed the autonomous drones with the knowledge that they were deficient vis-à-vis one or more of the targeting rules mentioned in Articles 48–59 of Additional Protocol I (Geneva Conventions Additional Protocol I 1977) or with deliberate indifference to the existence of such deficiencies, arguably the General conducted an indiscriminate attack for which she should be held accountable [Geneva Conventions Additional Protocol I 1977: Art. 51(4), Art. 85(3)(b)].
From a legal perspective, more complex tasks and/or more limited autonomous technology will signal a demand for greater levels of human judgement (as well as communication and accountability) during the mission. Therefore, a sound understanding of the function, capabilities, and limitations of the semi-autonomous and autonomous weapon technologies available to armed forces will become a prerequisite for the command of modern military units. Increased autonomy will enable higher speeds of military engagement, resulting in greater military advantages for the armies that field such systems. In parallel, considerations of military advantage mean that the increasing speed with which future weapon systems will absorb, process and transmit information and react to events may influence decisions about acceptable levels of human judgement and permissible levels of autonomy (Jackson 2014).

1.2 Implications of the Inevitable Employment of Increasingly Autonomous Drones

The potential military advantages for states and organized armed groups in possession of autonomous weapon technologies and capabilities will lead, inevitably, to the increasing use of autonomous drones (Corn 2014). However, to ensure compliance with IHL, the intervention of human judgement in the activity of autonomous drones will be required at four distinct stages of military operations:
  1. the procurement/acquisition stage;
  2. the planning stage of the mission or attack, when a human must choose which weapon system to employ (systems will vary across a range of autonomy);
  3. following the choice of an autonomous drone, a decision as to the level of human attention – if any – to assign to the system during the mission but prior to an attack; and
  4. specific inputs of human judgement – if necessary – to comply with international legal obligations and/or political interests, immediately before, during, and after the attack.
During stages 2–4, human control over the autonomous drone should remain until the human supervisor ensures, at each of the final three stages, that the weapon system complies and will continue to comply with international law and applicable rules of engagement (Saxon 2014).
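One way to picture this four-stage requirement is as a gate at each of the final three stages: human control persists unless compliance has been affirmatively confirmed at every one of stages 2–4. A minimal hypothetical sketch follows; the stage names and the function are invented here for illustration, not drawn from the source.

```python
from enum import Enum, auto

class Stage(Enum):
    """The four stages of human intervention discussed above
    (names invented for this sketch)."""
    PROCUREMENT = auto()       # stage 1: procurement/acquisition
    WEAPON_SELECTION = auto()  # stage 2: planning, choice of weapon system
    ATTENTION_LEVEL = auto()   # stage 3: human attention assigned pre-attack
    ATTACK_JUDGEMENT = auto()  # stage 4: judgement before/during/after attack

def may_relax_human_control(confirmed: set[Stage]) -> bool:
    """Return True only if compliance with international law and the
    applicable rules of engagement has been confirmed at each of the
    final three stages (2-4); otherwise human control should remain."""
    final_stages = {Stage.WEAPON_SELECTION,
                    Stage.ATTENTION_LEVEL,
                    Stage.ATTACK_JUDGEMENT}
    return final_stages <= confirmed  # subset test: all three confirmed
```

The design choice worth noting is that confirmation at any single stage is insufficient: the gate is the conjunction of all three post-procurement stages, matching the claim that control should remain "until the human supervisor ensures, at each of the final three stages," continued compliance.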
When accidents or crimes occur during and/or due to the use of autonomous drones, the actual degree of autonomy and the levels of human supervisory control over the machine must form part of any accountability analysis. Once the speed of autonomous technology reaches levels that preclude effective human supervision and control, however, proof of the existence of the mens rea 4 may be illusory and/or impossible to establish. Arguably, this could result in an accountability gap as the underlying rationale for the mens rea requirement in criminal law is that a sense of personal blame is absent if the accused did not in some way intend her action or omission (Supreme Court of Canada R v. Finta 1994: 760).

2 Applicable Law

2.1 Targeting Rules and Unlawful Attacks under IHL

The rules governing the selection and attack of targets in Articles 48–59 of Additional Protocol I (API) will apply to many of the crimes in which autonomous drones are employed. Articles 48 and 52 enshrine the duty of parties to an armed conflict to distinguish between the civilian population and combatants and between civilian objects and military objectives, and thus to direct operations only against combatants and/or military objectives. 5 Article 51(4) prohibits indiscriminate attacks, which include:
  1. those that are not directed at a specific military objective; 6
  2. those that employ a method or means of combat that cannot be directed at a specific military objective; or
  3. those that employ a method or means of combat the effects of which cannot be limited as required by API. 7
Article 54, which reflects customary international law [Schmitt 2013: Rule 81 (1)], prohibits attacks against objects that are indispensable to the survival of the civilian population ‘for the specific purpose of denying them for their sustenance value to the civilian population or to the adverse Party, whatever the motive’. Such indispensable objects would include food supplies, crops ripe for harvest, drinking water reservoirs and water distribution systems. 8 Article 56 bans attacks against works or installations containing dangerous forces, i.e. dams, dykes, and nuclear power plants.
Article 57 addresses the precautions that ‘those who plan or decide upon’ an attack must take, requiring due diligence to avoid or minimize civilian casualties. Planners and executors of attacks must do everything feasible to verify that the target of the attack is a military objective and that the provisions of Additional Protocol I (API) do not forbid the operation [Geneva Convention API: Art. 57(2)(a)(i)]. 9 Preliminarily, as Boothby observes, Article 57 contains no requirement that persons execute attacks (Boothby 2012: 20). However, in 1977, when the drafters of API completed their work, autonomous weapon systems were not foreseeable. Indeed, during the drafting of Article 57, the ‘relevant discussion’ as to the meaning of ‘those who plan or decide upon an attack’ concerned whether Article 57 obligations could be fulfilled only at a certain level of command, given that planners and decision-makers need appropriate information on which to base considerations about precautionary measures. 10 Thus, it would be inaccurate to suggest that the drafters of Article 57 intended the scope of the word ‘those’ to encompass autonomous machines. 11 Nevertheless, IHL must apply to the use of autonomous weapon systems, like any other weapon systems. The onus for taking the precautionary measures described in Article 57 must therefore remain with the human commanders and operators who have the capacity to exercise judgement over the employment of autonomous weapon systems.
In addition, belligerent forces must ‘take...
