Defence has always been at the forefront of innovation, with considerable funds being allocated to Science & Technology (S&T) progression among the world’s most advanced military nations. Given the threat to life involved in warfare, it is no surprise that robotics receives considerable attention amongst these communities.
In addition, the increasing prevalence of Cyber & Electromagnetic Activities (CEMA) in the battlespace clearly lends itself to the use of robotic and automated systems, owing to their ability to receive, process and analyse digital information at pace. So where within defence is robotics likely to have the greatest impact?
First and foremost, let’s tackle the ‘Terminator’-related images that spring to mind for most movie-goers when imagining robotics in war. The International Law of Armed Conflict (LOAC) applies four principles to any act of war: Proportionality, Distinction, Military Necessity and Humanity.
The principle of greatest concern when considering any kind of autonomous targeting is likely Distinction – that is, can the system provide enough assurance that the intended target is correct, and that the likelihood of error or collateral damage is small enough to make the resulting action justifiable?
Human-in-the-loop
Any autonomous system could be considered a black box by the military user, as it’s unlikely military personnel would be able to read and understand the software through which the system makes decisions.
Errors in judgement are therefore both unpredictable and unexplainable, so a user will find it very difficult to be assured that the system will always comply with the law. More importantly still, the ethical considerations behind such a decision are extremely complex and require a far broader understanding of the operational environment (well beyond what today’s systems can comprehend) to weigh proportionality, necessity or humanity.
As a result, the UK MOD’s policy is always to have a human in the loop where offensive weapons are concerned. While a system may support targeting by processing information or assessing the impact of an action, a human will always pull the trigger and always bear responsibility for doing so.
In addition to the above, walking robots are somewhat impractical: despite some genuinely having better mobility than my own, they are heavy, power-hungry, conspicuous and difficult to maintain in the field. While I can imagine ‘follower’ robots carrying supplies or evacuating casualties in future, we’re still proving that concept with simpler land and airborne vehicles today.
Autonomous resupply
Project THESEUS, named after the mythical slayer of the Minotaur, is the UK MOD’s effort to conduct autonomous resupply. It focuses on ‘last-mile resupply’ at the tactical edge, which is typically where the threat to life is the greatest. It makes perfect sense, then, to avoid risking our soldiers for activities that could quite conceivably be managed by a robotic platform.
Imagine a soldier on the frontline is running low on ammunition, and so the storeman at the rear loads up a drone or self-driving vehicle with some boxes or crates and either sends it off to the soldier’s location, or perhaps offers it a route via plotting some waypoints.
The same platform could be used to deliver medical supplies safely around the battlefield or be loaded with stretchers to move people more quickly back through the medical support chain without taking resources away from the front. Of course, Amazon is making strides in drone-based shipping under similar principles.
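The waypoint-based tasking described above can be sketched in a few lines. This is purely illustrative: the class names (`Waypoint`, `ResupplyMission`), the payload strings and the flat-grid distance calculation are my own assumptions, not any real MOD or THESEUS interface.

```python
from dataclasses import dataclass


@dataclass(frozen=True)
class Waypoint:
    """One grid point on the planned route (hypothetical, flat grid in metres)."""
    easting: float
    northing: float


@dataclass
class ResupplyMission:
    """Hypothetical last-mile tasking: a payload and an ordered route."""
    payload: list   # e.g. ammunition boxes, field dressings
    route: list     # Waypoints from the rear store to the soldier's location

    def leg_lengths(self):
        """Straight-line length of each route leg, for a rough transit estimate."""
        legs = []
        for a, b in zip(self.route, self.route[1:]):
            legs.append(((b.easting - a.easting) ** 2 +
                         (b.northing - a.northing) ** 2) ** 0.5)
        return legs


# The storeman loads the platform and plots a simple two-leg route.
mission = ResupplyMission(
    payload=["5.56mm ammunition"],
    route=[Waypoint(0, 0), Waypoint(300, 400), Waypoint(300, 900)],
)
print(mission.leg_lengths())  # [500.0, 500.0]
```

A real system would of course plan routes over terrain data rather than straight lines, which is exactly where the difficulties described below arise.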
Land resupply tends to be more difficult, particularly in warzones where infrastructure may be destroyed or the recipients are located in hard-to-reach hideouts. Anecdotally, I was told of one trial in which a vehicle identified a farmer’s track as a road and promptly tipped itself over.
Due to the black box nature described above, it’s challenging for users to understand why that decision was made, and until sufficient time and testing have passed there will always be a fear that an unexpected decision could disrupt an operation at a moment’s notice.
For that reason, these testing environments today typically contain more engineers from industry than soldiers trialling the equipment. There are further challenges in employing such systems: the load carried will be valuable to potential interceptors and so will need an element of protection, and the routes selected will need to suit the tactical environment; that is, not give away a unit’s position by loudly parking up next to it.
Swarming drones
Robotics may also be employed in a pack, rather than individually, to achieve greater effect. The concept of swarming drones has been present for some time and is now being tested in earnest.
A swarm of drones offers several benefits. It can quickly cover wide areas, adopting a wide variety of formations. It is harder to defeat through sheer numbers: any capability shared across the swarm degrades gracefully rather than being lost outright should individual drones fail. Finally, different drones can carry different payloads or tasks, creating a flexible unit with a broad range of potential effects, much like a group of riflemen carrying different equipment.
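The graceful-degradation property can be illustrated with a minimal sketch. The round-robin scheme, drone identifiers and task names below are all hypothetical; the point is simply that losing platforms thins coverage rather than deleting a capability.

```python
def assign_tasks(tasks, drones):
    """Spread tasks round-robin over whichever drones remain in the swarm.

    When drones drop out, the same tasks are redistributed over fewer
    platforms: each capability is still covered, just more thinly, so the
    swarm degrades gracefully instead of losing a function outright.
    """
    if not drones:
        return {}
    assignment = {d: [] for d in drones}
    for i, task in enumerate(tasks):
        assignment[drones[i % len(drones)]].append(task)
    return assignment


tasks = ["observe NW", "observe NE", "jam freq A", "relay comms"]

# Full swarm: one task per drone.
full = assign_tasks(tasks, ["d1", "d2", "d3", "d4"])

# Two drones lost: survivors each pick up a second task.
degraded = assign_tasks(tasks, ["d1", "d3"])
```

Real swarm tasking would weigh payload fit, fuel and position rather than a simple rotation, but the degradation behaviour is the same in principle.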
Whether via edge computing or by passing data through a network for processing elsewhere, a swarm of drones has a fantastic ability to soak up and react to information from across the battlefield. Likely applications include Intelligence, Surveillance, Target Acquisition and Reconnaissance (ISTAR), in short observing and analysing the battlefield; Electronic Warfare (EW), receiving and reacting to wireless signals of interest from communications and sensing systems; and communications, creating ad-hoc networks to move information around the battlefield.
These data-centric tasks are one area where robotics and automation can excel in defence today: very quickly processing vast amounts of varied information and sharing the insights across a network at speeds that humans simply can’t keep up with. Employing elements of autonomy allows the swarm to be commanded by a single individual, meaning one soldier can do the jobs of many, which in defence parlance is referred to as a ‘force-multiplier’.
Spot the dog
Finally, a note on robot dogs. You will be hard-pressed to attend a defence exhibition without tripping over a Spot, peering up at you with a camera-face or offering some form of marketing collateral. Although leading military units across the world have purchased and paraded them, some scepticism remains about their utility amongst the wider military community. Perhaps they are better considered a novel platform than a stand-alone solution.
Employment examples include offering perimeter surveillance patrols (fleets can conduct complex routes across uneven ground before resting to re-charge) and counter-IED tasks (where today’s C-IED robots can be defeated by stairs, much like Daleks).
But can they be weaponised? Boston Dynamics’ policy is no, whereas Ghost Robotics takes a slightly more relaxed position, and it will not be alone. As rival companies, nations and non-state actors progressively weaponise their own robotics, we must find ways to ensure the Law of Armed Conflict continues to be upheld.