While the question of killer robots regularly returns to the fore, that of autonomous nuclear weapons is even more worrying. Some of these weapons may already exist, and others are in development. An NGO is therefore urging the international community to push for their abandonment.
A type of armament deemed absurd
Currently, several existing nuclear weapons already have a certain degree of autonomy. Nevertheless, their use obviously remains subject to a very strict human decision-making process. Take, for example, the new American B61-12 tactical bomb, whose release depends, among other things, on the verification of a whole series of conditions by an automatic system: reaching a certain altitude, confirming the connection to the correct carrier aircraft, and locking onto a trajectory toward the ground.
Is this the last step before truly autonomous nuclear weapons? According to the Bulletin of the Atomic Scientists, such weapons are indeed in the pipeline, and some may even already exist. For example, China is reportedly testing a hypersonic missile capable of remaining in orbit indefinitely before de-orbiting in an attempt to strike its target on the Earth's surface.
As Popular Mechanics explains, the Bulletin of the Atomic Scientists recently issued an important alert. The organization, best known for its Doomsday Clock counting down to the apocalypse, is calling on the international community to ban autonomous nuclear weapons programs, a type of armament it considers absurd.
A risk of global and nuclear catastrophe
Talking about autonomous nuclear weapons conjures up the worst possible scenarios. For example, the Russian army's Burevestnik "Skyfall" missile is equipped with nuclear propulsion, a nuclear warhead, and an artificial intelligence system. What if this type of weapon could one day make the final decision on its own? The response to an attack would certainly be fast, but in the worst case, the world could witness an unprecedented escalation with potentially catastrophic consequences.
Still in Russia, there is the Poseidon 2M39 (or Status-6 Poseidon), which the CIA nicknamed Kanyon. This weapon, presented as a "tsunami-generating nuclear torpedo", is stealthy and, again, nuclear-powered. The torpedo in question would be able to navigate undetected near the coastline of an enemy country and bear down on a major city before activating its nuclear charge. Here again, launch remains subject to a very strict human decision-making process. But what would happen if an AI were entrusted with making predictions, computing the decision or, worse still, making that decision without consulting a human?
The international community and many scientists are currently debating killer robots. Should a machine equipped with AI be allowed to kill? If so, should it be able to make that decision alone? The Bulletin of the Atomic Scientists believes these questions should also apply to autonomous nuclear weapons. The difference is significant, however, since such weapons could trigger a genuine global catastrophe, a risk far less present with killer robots.