The survey found that of 13,410 respondents, 60% think armed autonomous drones should not be banned, while 40% think they should be banned from use.

Armed autonomous drones have come under increasing scrutiny in recent years, as artificial intelligence and unmanned aerial vehicle technology continue to develop to a point where systems could soon be able to execute lethal missions without human intervention.

Our poll asked ‘Should fully-autonomous armed aerial drones be banned?’, to which 8,000 people voted ‘No’ and 5,410 voted ‘Yes’.

Talks to regulate the use of lethal autonomous weapons systems have often stagnated, most recently in November last year, when UN talks on regulations faltered. In March last year, the governments of the UK, US, Russia, Israel and Australia opposed instituting a ban on the use of autonomous weapons.

One important debate around the use of the systems has been the question of whether a human should be kept in the loop to oversee the decisions made by the autonomous system and approve any use of lethal force.

Last November, US Secretary of Defence Mark Esper used a speech at the National Security Commission on Artificial Intelligence public conference to warn that the US had observed China offering fully-autonomous lethal drones for sale in the Middle East and Africa. The UAV, the Ziyan Blowfish A2, is capable of executing fully-autonomous missions and can be armed with a weapons payload, according to its manufacturer.


According to the UK Ministry of Defence (MOD), the first of two requirements for a weapon to be classified as an autonomous weapons system is that the system must be “self-aware and their response to inputs indistinguishable from, or even superior to, that of a manned aircraft. As such, they must be capable of achieving the same level of situational understanding as a human.”

The other MOD requirement is: “Capable of understanding higher-level intent and direction. From this understanding and its perception of its environment, such a system is able to take appropriate action to bring about a desired state. It is capable of deciding a course of action, from a number of alternatives.”

In a document prepared for the House of Lords Defence Committee, Noel Sharkey, professor of AI and robotics and chair of the International Committee for Robot Arms Control, wrote: “These machines are unlikely to exist in the near future if ever. As the MOD correctly points out, ‘machines with the ability to understand higher-level intent, being capable of deciding a course of action without depending on human oversight and control currently do not exist and are unlikely in the near future’.

“Such ‘science fiction’ requirements can misdirect the UK into inferences such as, ‘since they are unlikely to exist in the near future, we do not need to consider their impact on the nature of armed conflict or consider prohibiting or regulating them.’ Others define AWS in a realistic way that is consistent with new developments in weaponry in the hi-tech nations including the UK.”