Are we still masters of our own fate?
“Common sense prevailed.” With these words, San Francisco supervisor Hillary Ronen praised the vote on police-operated “killer robots”. Citizens had protested after it was announced that the San Francisco Police Department would be allowed to deploy robots capable of using “lethal force in extreme situations”. Only a few days later, the city’s Board of Supervisors reversed the decision.
What happened in San Francisco is a key example of something that has occupied the minds of philosophers, politicians and military decision-makers for some time now: the relationship between humans and autonomous or remote-controlled machines, such as drones and robots, in the context of security threats and war.
Whether it is an American MQ-9 Reaper drone hitting civilians in Afghanistan or a Ukrainian drone boat striking Russian warships in the Black Sea, these technological advancements bring huge advantages to security forces. They no longer have to deploy personnel in extremely dangerous environments, such as sites of mass shootings or minefields, and can thus operate without risking human lives. While these are undoubtedly valid reasons for their use, there are, besides the inherent moral problems, significant security risks that come along with the large-scale use of these technologies – especially in a military context.
One of the characteristics of this technology is that it makes killing easier. The philosopher Grégoire Chamayou explores this problem and its implications in his work “Theory of the Drone”:
“The price most killers have to pay for a close-range kill—the memory of the ‘face, terrible, twisted in pain and hate, yes, such hate’—this price need never be paid if we can simply avoid looking at our victim’s face.”
Drones put a considerable distance between the killer, i.e. the drone operator, and the victim, reducing the latter to a bunch of pixels on a screen. In contrast to a fistfight or close combat with firearms, the killer’s resistance to killing another human being is lowered, since there is no real perception of doing exactly that. Killing becomes less demanding on the human side. From a military point of view, this technology also reduces the logistical effort of stationing forces abroad: it is enough to provide mechanics and engineers on the ground, while pilots and data analysts can remain at a base in the homeland. The worrying consequence is that these lower political, monetary and human-capital costs minimize decision-makers’ hesitance to order and approve the use of this technology – and thus, oftentimes, the ending of someone’s life. As realist theory would put it: war becomes more likely.
These technologies have been a cruel reality, especially in the Middle East, for around 20 years now. But the future paints an even darker picture: autonomous drones, fully-fledged killer robots that make decisions about life and death by themselves, without human control or responsibility. The problem is that these technologies eliminate individual responsibility and guilt. The bomber pilots who pulled the trigger above Dresden or Hiroshima knew what they were doing. They burdened themselves with the eternal guilt of having killed other human beings – even if they did it for the right reasons. This has always been a part of war and should always remain one. If a human soul is taken, the one who took it needs to stay connected to that act forever.
The fact that autonomous weapons make that decision by themselves, while never being able to feel that guilt – effectively taking human fate out of human hands – creates the absolute necessity to ban them internationally, as cluster bombs, biological and chemical weapons were banned before.
The debate over autonomous weapons is already visible in present-day conflicts. As the war in Ukraine has recently shown, combined-arms warfare is the single most important capability in a conventional land war of the 21st century. Ukraine has been able to resist and even push back the numerically superior Russian army by combining efficient communication, limited but precise strikes on strategically important targets, and state-of-the-art technology with heroic human effort. This is reflected in the market for autonomous military weapons, which grew to an astonishing 13.3 billion dollars in 2022. With many countries – most recently Japan – on a path to armament unseen in a long time, this trend is probably here to stay.
Whether common sense prevails globally – as it did in San Francisco – remains to be seen.
Further recommended reading:
“Theory of the Drone” – Grégoire Chamayou (2013)