A seven-minute film titled “Slaughterbots” caused consternation at a recent meeting of the United Nations Convention on Certain Conventional Weapons. It was made by the Future of Life Institute, an advocacy NGO, in support of the Campaign to Stop Killer Robots (CSKR), a movement backed by Human Rights Watch and funded by Elon Musk, among others. The film shows how small drones, powered by artificial intelligence (AI), could hunt down people by recognising their faces or tracing their mobile phones or vehicles. These drones could then kill their targets at close range with tiny explosive charges, leaving bystanders unharmed. According to AI researchers, it is already possible to build devices that work along these lines. Once its target parameters were set, such a device could do its killing without human intervention or oversight.
The CSKR is one of several coalitions lobbying for laws that ban the development and use of “Lethal Autonomous Weapons” (LAWs). Much military equipment is autonomous, but most current weaponry keeps humans in the loop for critical decisions: an armed Predator drone, for instance, fires only on human orders. LAW devices take humans out of the loop at an earlier stage. An open letter released at a recent International Joint Conference on Artificial Intelligence puts it thus: “Autonomous weapons select and engage targets without human intervention. They might include, for example, armed quadcopters that can search for and eliminate people meeting certain pre-defined criteria… the deployment of such systems is — practically, if not legally — feasible within years… autonomous weapons have been described as the third revolution in warfare, after gunpowder and nuclear arms.”
LAW devices are being developed for all sorts of tasks and are already used in anti-aircraft and anti-missile systems. Israel’s Iron Dome autonomously targets and destroys incoming missiles. Germany’s Mantis, a battery of radar-guided cannons, protects army bases against rocket attacks. Neither Iron Dome nor Mantis targets people, but future systems will. The Samsung Techwin SGR-A1 sentry gun is deployed in the demilitarised zone (DMZ) between the two Koreas. This 115-kg device uses infrared sensors and laser rangefinders to detect and track movement in the DMZ, then uses voice recognition to identify friend or foe. If the right password is not spoken, the gun can sound an alarm, launch grenades, or open fire. A much larger LAW is the Sea Hunter, an unmanned anti-submarine trimaran that weighs 135 tonnes and has a cruising range of 19,000 km. The Sea Hunter completed sea trials in 2016 and will be deployed for anti-submarine warfare and mine-sweeping.
There are many arguments in favour of LAWs, including toned-down versions armed with tranquillisers or tear gas for law enforcement and riot control. In war, LAWs could save soldiers’ lives and act as a formidable force multiplier. If selective targeting works as promised, they could also cause less collateral damage. And since LAWs need no exotic materials, they are cost-effective. But such devices could equally slaughter huge numbers of civilians or malfunction in disastrous ways. Unlike landmines or dum-dum bullets, LAW devices are not prohibited by any international law: they were inconceivable when the Geneva Conventions were last revised in the 1970s. There are fierce ongoing debates about where to draw the line between a LAW and a device that keeps humans in the loop.
India is among the many nations that remain non-committal. Deploying sentry guns along its borders is obviously a tempting prospect. Going by history, if a technology can be developed, it will be, and bans, if they come at all, will lag behind the technological curve. The third revolution in warfare could, therefore, alter the nature of conflict in the 21st century.