The UK is developing the technology to build armed autonomous drones

8th January 2019 / United Kingdom
A new report published by Drone Wars UK reveals that despite a UK government statement that it “does not possess fully autonomous weapons and has no intention of developing them”, the Ministry of Defence (MoD) is actively funding research into technology supporting the development of armed autonomous drones.

  • MoD is developing the ability of a machine to operate with limited, or even no, human control
  • Systems powered by advances in artificial intelligence, machine learning, and advanced computing
  • Development of drones that are able to fly themselves, select, identify, and destroy targets without human intervention
  • Letting machines ‘off the leash’ and giving them the ability to take life crosses a key ethical and legal Rubicon

The study, Off the Leash: The Development of Autonomous Military Drones in the UK, identifies the key technologies influencing the development of future armed drones and looks at current initiatives underway in the UK to marry developments in autonomy – the ability of a machine to operate with limited, or even no, human control – with military drone technology. The report maps out the agencies, laboratories, and contractors undertaking research into drones and autonomous weapon technology in support of the Ministry of Defence, examines the risks arising from the weaponisation of such technologies, and assesses government policy in this area.

Military planners have a long history of taking advantage of technological developments to aid war fighting. Aircraft and computers were rapidly identified for their potential to transform warfare, and so it should come as no surprise that new combinations of these technologies are of great interest to the world’s militaries. While drones have become familiar over recent years, the next technological leap – powered by advances in artificial intelligence (AI), machine learning, and advanced computing – is likely to see the development not only of drones that are able to fly themselves and stay aloft for extended periods, but those which may also be able to select, identify, and destroy targets without human intervention.

‘Off the Leash’ concludes that drones are likely to be the first military technology to develop into a truly autonomous weapon. The incremental way in which drone technology is developing, and the ability to ‘bolt on’ new features, mean that drones are ideally suited to morph into a future generation of autonomous weapon systems.

Human Control

The extent to which autonomy within a drone is a concern depends upon the level of human control over the targeting and launch of weapons and the use of force. Although existing armed drones have a degree of autonomy in some of their functions – for instance in relation to flight control – at present human control is maintained over the use of force, and so today’s armed drones do not qualify as fully autonomous weapons.

Many question whether weapons with the capability to make autonomous targeting decisions would ever be able to comply with the laws of war, and make the complex and subjective judgements needed to ensure that the use of force was necessary, proportional, and undertaken so as to avoid civilian casualties.

Figure – Elements of concern: autonomy and the critical functions of an armed drone

Despite this, the Ministry of Defence (MoD) sees autonomous technology and data science as “key enablers” for the future. MoD’s Defence Science and Technology Laboratory (DSTL) and its Defence and Security Accelerator programme have extensive research programmes in these fields. The Engineering and Physical Sciences Research Council is also a significant funder of research in these areas and a number of universities are working on autonomous technology programmes with military applications, often in collaboration with private sector military contractors.

However, investment and innovation in artificial intelligence is being led by the civil sector and not by the world’s militaries. Autonomous technologies originating in the civil sector but adapted for military applications are likely to become key components of the autonomous drones and weapons of the future. Military planners are aware of the civil sector’s lead in developing artificial intelligence and autonomous systems and are keen to have a slice of the cake.

The military technology research sector is smaller than its civil counterpart and has fewer resources, but it is in a position to adapt existing military systems and anticipate military needs. BAE Systems, for example, has built a family of experimental autonomous drones, including ‘Taranis’, an advanced prototype autonomous stealth drone, and has an active AI research and development programme. QinetiQ, Thales Group, and a number of other military aerospace contractors have also participated in autonomous technology development projects for the MoD.

Sleight of hand

Current MoD policy states that the UK opposes the development of autonomous weapon systems and has “no intention of developing them”. However, the Ministry of Defence has been accused of a sleight of hand by defining autonomous weapons systems differently from other governments and institutions. Its futuristic definition of autonomous weapons places no limits on the development of autonomous technology to meet short- and medium-term military needs. The claim that “the UK opposes the development of armed autonomous systems” also appears to be at odds with the government’s position in international arms control negotiations. Since 2015 the UK has declined to support moves at the United Nations Convention on Certain Conventional Weapons aimed at banning autonomous weapon systems.

Drone Wars UK is clear that the development and deployment of lethal autonomous drones would give rise to a number of grave risks, primarily the loss of humanity and compassion on the battlefield. Letting machines ‘off the leash’ and giving them the ability to take life crosses a key ethical and legal Rubicon.

Autonomous lethal drones would simply lack human judgement and other qualities that are necessary to make complex ethical choices on a dynamic battlefield, to distinguish adequately between soldiers and civilians, and to evaluate the proportionality of an attack. Other risks from the deployment of autonomous weapons include unpredictable behaviour, loss of control, ‘normal’ accidents, and misuse.

As a nation which considers itself a responsible and leading member of the international community, the United Kingdom has a duty to use its influence and powers to ensure that the weapons of the future are never used outside the boundaries set by the ‘laws of humanity and the requirements of the public conscience’, as stipulated in the ‘Martens Clause’, which sets out guiding humanitarian principles for the conduct of war.

Recommendations

We consider that the government should support the introduction of an arms control regime to prevent the development, acquisition, deployment, and use of fully autonomous weapons. To support this, the UK should make an unequivocal statement that it is unacceptable for machines to control, determine, or decide upon the application of force in armed conflict and give a binding political commitment that the UK would never use fully autonomous weapon systems.

In order to increase the transparency of research which might lead to the development of autonomous weapons, the government should publish an annual report identifying research it has funded in the area of military autonomous technology and artificial intelligence.

Despite its military applications, AI also has enormous potential for reducing armed conflict, and so the government should fund a wide-ranging study into the use of artificial intelligence to support conflict resolution and promote sustainable security.

Much more deliberation is needed on the ethics and risks associated with disruptive new technologies such as AI, autonomous technology, biotechnology, and nanotechnology. MPs and Peers should investigate the impact of emerging military technologies, including autonomy and artificial intelligence, and press the government to adopt an ethical framework governing future work in these areas. A broader public debate on the ethics and future use of artificial intelligence and autonomous technologies, particularly their military applications, is long overdue. It’s time for the government and civil society to start this debate now – before pressing further ahead with research on robots that can kill humans.
