Analysis
April 2, 2020 (updated 28 Jun 2022, 2:55pm)

Fully-autonomous armed aerial drones should not be banned: Survey

An Air Force Technology poll has found that 60% of readers think fully-autonomous armed aerial drones should not be banned.

By Harry Lye

The survey found that of 13,410 respondents, 60% think that armed autonomous drones should not be banned, while 40% think they should be banned from use.

Armed autonomous drones have come under increasing scrutiny in recent years, as artificial intelligence and unmanned aerial vehicle technology continue to develop to a point where systems could soon be able to execute lethal missions without human intervention.

Our poll asked ‘Should fully-autonomous armed aerial drones be banned?’ to which 8,000 people responded ‘No’ and 5,410 voted ‘Yes’.
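The headline figures are rounded; the raw vote counts reported above give the split directly, as this quick check shows:

```python
# Check the poll split using the vote counts reported in the article.
no_votes = 8000
yes_votes = 5410
total = no_votes + yes_votes  # 13,410 respondents

no_share = no_votes / total * 100
yes_share = yes_votes / total * 100

# Roughly 59.7% "No" and 40.3% "Yes", which the article rounds to 60/40.
print(f"No: {no_share:.1f}%, Yes: {yes_share:.1f}%")
```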

Talks to regulate the use of lethal autonomous weapons systems have repeatedly stagnated, most recently in November last year, when UN talks on regulation faltered. In March last year, the governments of the UK, US, Russia, Israel and Australia opposed instituting a ban on the use of the weapons.

One important debate around the use of the systems has been whether a human should be kept in the loop to oversee the decisions made by the autonomous system and approve any use of lethal force.

Last November, US Secretary of Defense Mark Esper used a speech at the National Security Commission on Artificial Intelligence public conference to warn that the US had observed China offering fully-autonomous lethal drones for sale in the Middle East and Africa. The UAV in question, the Ziyan Blowfish A2, is capable of executing fully-autonomous missions and can be armed with a weapons payload, according to its manufacturer.

According to the UK Ministry of Defence (MOD), a weapon must meet two requirements to be classified as an autonomous weapons system. The first is that the system must be “self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human.”

The other MOD requirement is: “Capable of understanding higher-level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives.”

In a document prepared for the House of Lords Defence Committee, Noel Sharkey, professor of AI and robotics and chair of the International Committee for Robot Arms Control, wrote: “These machines are unlikely to exist in the near future if ever. As the MOD correctly points out, ‘machines with the ability to understand higher-level intent, being capable of deciding a course of action without depending on human oversight and control currently do not exist and are unlikely in the near future’.

“Such ‘science fiction’ requirements can misdirect the UK into inferences such as, ‘since they are unlikely to exist in the near future, we do not need to consider their impact on the nature of armed conflict or consider prohibiting or regulating them.’ Others define AWS in a realistic way that is consistent with new developments in weaponry in the hi-tech nations including the UK.”
