Of War Dogs, Bat Bombs, Mercenaries and Killer Robots
'Shall I compare thee to a summer's day?' muses William Shakespeare as he reflects on the qualities of his beloved in the opening line of Sonnet 18. Plato, seeking to understand the nature of reality itself and our place in it, creates the famous allegory of a group of people chained in a fire-lit cave and watching shadows on the wall. These examples illustrate that, when we are wrestling with something that is challenging to express or comprehend, it is natural for us to reach for analogies. We look to other things that are relevantly similar, in the hope that they will help us to grasp the thing we are trying to understand. That can be very useful and we can learn a lot that way, but only if the analogy is well chosen (Shakespeare wouldn't have done particularly well if he'd decided to compare his beloved to a bit of pocket lint!).
While they have important antecedents, the lethal autonomous weapons systems (LAWS) that are on the near horizon (or, in some cases, already in production) are a new and challenging phenomenon, and so we naturally find ourselves looking for analogies that may help us to understand how to respond to them. Unfortunately, we often reach for unhelpful and unedifying analogies. Most common are analogies drawn from science fiction, whether it be the Schwarzenegger-shaped Terminator or the creepily measured tones of HAL 9000 from the classic movie 2001: A Space Odyssey. The year 2001 was a long time ago, but we are still nowhere near the level of general AI that could enable 'the machines' to turn against their human masters. The issue at hand is weapons that are autonomous – many of which are likely to be driven by relatively pedestrian algorithms – rather than weapons that have achieved a level of intelligence that matches or surpasses that of human beings (what is usually referred to as 'general AI'). There are certainly good reasons to be very cautious about developing weapons that incorporate general AI, but it's important to see that the questions to be asked are different from those we need to answer about autonomy in general. For one thing, if a system is that intelligent, then we will have reached the point at which we need to decide whether the AI is capable of being held ethically and legally responsible for its actions. By contrast, as we will see, one of the main concerns about merely autonomous weapons is precisely that it is widely agreed that they cannot be held accountable, which opponents argue may leave an accountability gap.
So, if we are not to look to sci-fi for analogies that may help us to come to grips with the ethical issues associated with LAWS, what other options are available? A mostly overlooked but far more useful comparison is with animals.
The Dogs (and Horses, and Dolphins, and Pigeons, and Bats) of War
Animals have long been 'weaponized' (to use a term currently in vogue). The horses ridden by armoured knights in the Middle Ages were not mere transport. They were instead an integral part of the weapons system – they were taught to bite and kick, and the enemy was as likely to be trampled by the knight's horse as to taste the steel of his sword. There have been claims that US Navy dolphins 'have been trained in attack-and-kill missions since the Cold War' (Townsend 2005), though this has been strongly denied by official sources. Even more bizarrely, during the Second World War the noted behaviourist B. F. Skinner led an effort to develop a pigeon-controlled guided bomb, a precursor to today's guided anti-ship missiles. Using operant conditioning techniques, a pigeon housed within the weapon (which was essentially a steerable glide bomb) was trained to recognize an image of an enemy ship projected onto a small screen by lenses in the warhead. Should the image shift from the centre of the screen, the pigeon was trained to peck at the controls, which would adjust the bomb's steering mechanism, thereby putting it back on target. Despite what Skinner reports to have been a project of considerable promise, Project Pigeon (or Project ORCON – from 'organic control', as it became known after the war) was cancelled, largely as a result of improvements in electronic means of missile control (Skinner 1960).
The strangeness of Project Pigeon is matched or even exceeded by another Second World War initiative: Project X-Ray. Conceived by Lytle S. Adams, a dental surgeon and an acquaintance of First Lady Eleanor Roosevelt, this was an effort to weaponize bats. The plan was to attach small incendiary devices to Mexican free-tailed bats and airdrop them over Japanese cities. It was intended that, on release from their bomb-shaped delivery system, the bats would disperse and roost in eaves and attics, among the traditional wood-and-paper Japanese buildings. Once ignited by a small timer, the napalm-based incendiary would then start a fire that was expected to spread rapidly. The project was cancelled as efforts to develop the atomic bomb gained priority, but not before one accidental release of some 'armed' bats resulted in a fire at a US base that burned both a hangar and a general's car (Madrigal 2011).
The animals most commonly used as weapons, though, are probably dogs. An early example comes from the mid-seventh century BC, when the basic tactical unit of mounted forces from the Greek polis of Magnesia on the Maeander (present-day Ortaklar in Turkey) consisted of a horseman, a spear bearer and a war dog. It was recorded that the Magnesians' approach during their war against the Ephesians was to first release the dogs, who would break up the enemy ranks, then follow that up with a rain of spears, and finally complete the attack with a cavalry charge (Foster 1941, 115). Today, of course, dogs continue to play important military roles. They are trained and used as sentries and trackers, to detect mines and IEDs, and for crowd control. For the purposes of this book, though, it is the 'combat assault dogs' that accompany and support special operations forces that are of the greatest relevance.
These dogs are usually equipped with body-mounted video cameras and are trained to enter buildings and seek out the enemy. This enables the dog handler to reconnoitre enemy-held positions without in the process putting soldiers' lives at risk. New technologies are also being developed to enhance the human–dog team. For example, in October 2020 the US military announced that it was working with Command Sight Inc. to develop augmented reality glasses for military dogs. According to the press release, '[t]he augmented reality goggles are specially designed to fit each dog with a visual indicator that allows the dog to be directed to a specific spot and react to the visual cue in the goggles. The handler can see everything the dog sees to provide it commands through the glasses' (US Army 2020).
In addition to serving as a means of reconnaissance, combat assault dogs are also trained to attack anyone they discover who is armed (Norton-Taylor 2010). The dog itself is not usually responsible for killing the enemy combatant; instead, it works to enable the soldiers it accompanies to employ lethal force: we might think of the dog as part of a lethal combat system. But at least one unconfirmed recent report indicates that the enemy may sometimes be killed directly by the dog itself. According to a newspaper report, in 2018 a British combat assault dog was part of a UK SAS patrol in northern Syria when the patrol was ambushed. A source quoted in the report gave the following account:
The handler removed the dog's muzzle and directed him into a building from where they were coming under fire. They could hear screaming and shouting before the firing from the house stopped. When the team entered the building they saw the dog standing over a dead gunman…. His throat had been torn out and he had bled to death … There was also a lump of human flesh in one corner and a series of blood trails leading out of the back of the building. The dog was virtually uninjured. The SAS was able to consolidate their defensive position and eventually break away from the battle without taking any casualties. (Martin 2018)
Are there any ethical issues of concern relating to the employment of dogs as weapons of war? I know of no published objections in this regard, beyond concerns for the safety and well-being of the dogs themselves,1 which – given that the well-being of autonomous weapons is not an issue of interest here2 – is not the sort of objection of relevance to this book. That, of course, is not to say that there are no ethical issues that might be raised here. I shall return to this question in what follows, in drawing a comparison between dogs, contracted combatants and autonomous weapons. First, though, I turn to consider what seems to me to be another useful analogy for LAWS, namely mercenaries or contracted combatants.
The 'Dogs of War'
In Just Warriors, Inc: The Ethics of Privatized Force (Baker 2011), I set out to explore the ethical objections to the employment of private military and security contractors in contemporary conflict zones. Are they mercenaries, and, if so, what is it about mercenarism that is ethically objectionable? Certainly, the term 'mercenary' has predominantly pejorative connotations, which is why I chose to employ the neutral phrase 'contracted combatants' in my exploration, so as not to prejudge its outcome. Other common pejoratives for contracted combatants are 'whores of war' and 'dogs of war'. While 'whores of war' provides a fairly obvious clue to one of the normative objections to contracted combatants (to be discussed later), I did not address the pejorative 'dogs of war' in the book simply because I was unable at the time to identify any meaningful ethical problem associated with it. Perhaps, however, the analogy is a better fit than I then realized, as will become clear in the next pages. In what follows I outline the main arguments that emerged from my exploration in Just Warriors, Inc.
Perhaps the earliest thinker to explicitly address the issue of what makes contracted combatants ethically problematic is Niccolò Machiavelli, in The Prince. Two papers addressing the ethics of contracted combatants, one written by Anthony Coady (1992) and another by Tony Lynch and Adrian Walsh (2000), both take Machiavelli's comments as their starting point. According to these authors, Machiavelli's objections to mercenaries were effectively threefold:
- Mercenaries are not sufficiently bloodthirsty.
- Mercenaries cannot be trusted because of the temptations of political power.
- There exists some motive or motives appropriate for engaging in war that mercenaries necessarily lack, or else mercenaries are motivated by some factor that is inappropriate for engaging in war.
The first of these points need not detain us long, for it is quite clear that, even if the empirically questionable claim that mercenaries lack the killing instinct necessary for war were true, this can hardly be considered a moral failing. But perhaps the point is instead one about effectiveness, the claim being that the soldier for hire cannot be relied upon to do what is necessary in battle when the crunch comes. But, even if this claim is true, it is evident that it cannot be the moral failing we are looking for either. For, while we might cast moral aspersions on such a mercenary, those aspersions would be in the family of such terms as 'feeble', 'pathetic' or 'hopeless'. But these are clearly not the moral failings we are looking for in trying to discover just what is wrong with being a mercenary. Indeed, the flip side of this objection seems to have more bite – namely the concern that mercenaries may be overly driven by 'killer instinct', that they might take undue pleasure in the business of causing death. This foreshadows the motivation objection discussed later.
Machiavelli's second point is even more easily dealt with. For it is quite clear that the temptation to grab power over a nation by force is at least as strong for national military forces as it is for mercenaries. It could even be argued that mercenaries are more reliable in this respect. For e...