Drone strike
Comment

Why engineers have to be more proactive about engaging with the laws of war


As autonomous weapons play an increasing role in conflicts around the world, manufacturers can’t afford to sit back and merely adapt their designs in response to decisions made by the courts.

The war in Ukraine has put allegations of war crimes and the possibility of prosecutions at the International Criminal Court in the headlines. Beyond the horror it evokes, this prospect may seem irrelevant to most engineers and computer scientists. It isn’t: the technology sector has lessons to learn, especially now that drones are playing a prominent, war-changing role.

Warfare is governed by International Humanitarian Law (IHL), also known as the Law of Armed Conflict (LOAC), through treaties and international protocols negotiated at the United Nations. IHL starts from the premise that wars will happen, but that lethal action may be taken only by identifiable individuals with the authority to act within the law. Consequently, responsibility, and hence liability, for harm to people or property must always lie with an identifiable individual using a weapon system whose behaviour is predictable.

Armed drones are one example of lethal autonomous weapons (LAWs). Who is legally responsible for the lethal effects of an autonomous weapon?

The UN’s expert group on LAWs, the Group of Governmental Experts (GGE), says that they are covered by current IHL. The problem is identifying the human responsible for the lethal decision. The group’s view is that “This should be considered across the entire life cycle of the weapon system”, clearly placing responsibility on technologists in the supply chain, not just military commanders. How can engineers meet their responsibilities?

A significant difference between IHL and national laws for autonomous systems lies in Additional Protocol I to the Geneva Conventions. Article 35 states that the right to choose “means or methods of warfare” is not unlimited; Article 36 requires a state to determine whether any new weapon, means or method of warfare complies with IHL, imposing a review regime. Operational use is covered by Rules of Engagement, set after extensive collaborative legal, military and technical work. When the GGE completes its work, all LAWs, including armed drones, must have completed Article 36 reviews covering the whole life cycle.

Engineers can meet their IHL responsibilities by using systems engineering approaches that identify the chains of authority to act and the limitations on a weapon’s use. National laws do not seem to have an equivalent overarching requirement, before a product is used, for identifiable responsibility for the consequences of its actions, except perhaps that it must be fit for purpose.
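
As a toy illustration of what a ‘chain of authority’ can mean in software, the sketch below (in Python, with entirely hypothetical names, roles and identifiers, not drawn from any real weapon system) makes weapon release unreachable without a recorded decision by an identifiable, authorised human; refusal is the default behaviour:

    from dataclasses import dataclass
    from datetime import datetime, timezone

    @dataclass(frozen=True)
    class HumanAuthorisation:
        operator_id: str   # identifiable individual, as IHL requires
        role: str          # e.g. "mission commander"
        target_id: str
        issued_at: datetime

    class EngagementController:
        # Roles permitted to authorise release; fixed during the Article 36 review.
        AUTHORISED_ROLES = {"mission commander"}

        def __init__(self) -> None:
            self.audit_log: list[tuple] = []   # permanent record for accountability

        def release_weapon(self, target_id: str, auth: HumanAuthorisation) -> bool:
            # Default is refusal: no autonomous code path can bypass this check.
            if auth.role not in self.AUTHORISED_ROLES or auth.target_id != target_id:
                self.audit_log.append(("REFUSED", target_id, auth))
                return False
            self.audit_log.append(("RELEASED", target_id, auth))
            return True   # the release mechanism would be commanded only here

    # Usage: a release attempt without a matching, authorised decision fails.
    ctrl = EngagementController()
    auth = HumanAuthorisation("op-117", "mission commander", "target-9",
                              datetime.now(timezone.utc))
    assert ctrl.release_weapon("target-9", auth)
    assert not ctrl.release_weapon("target-10", auth)   # mismatched target refused

The point is architectural rather than algorithmic: responsibility is carried as first-class data, and every decision, granted or refused, leaves an auditable record tied to a named individual.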

Autonomous systems such as driverless vehicles are one area of engineering where national legal regimes are already struggling to accommodate rapidly evolving technology. New technologies are being introduced into existing legal frameworks, which must then catch up and identify who is responsible for any accident. For example, the Law Commissions of England and Wales and of Scotland have concluded that autonomous cars are incompatible with current UK laws.

Autonomy and machine learning, with their non-deterministic behaviours, make change inevitable. Systems engineering approaches being used for IHL include deriving technical specifications for each system from generic IHL principles, architectural analysis of human responsibility, and limiting subsystem behaviour by design. LAWs require more Article 36 reviews during the concept and design phases, with the aim of reaching agreement between lawyers, engineers and computer scientists about the application of specific new technologies in a LAW. The result is identification of responsibility for each aspect of a weapon’s behaviour, reducing the technical, legal and financial risk for all parties.
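
To make the first of these approaches concrete, here is a minimal, hypothetical sketch of tracing a technical specification back to the legal rule that motivates it; the identifiers, requirement text and role below are invented for illustration, not taken from any real review:

    from dataclasses import dataclass

    @dataclass(frozen=True)
    class TracedRequirement:
        req_id: str            # project-local requirement identifier
        legal_source: str      # the IHL provision or Rule of Engagement behind it
        statement: str         # the testable engineering requirement
        responsible_role: str  # the identifiable role accountable for compliance

    REQUIREMENTS = [
        TracedRequirement(
            req_id="REQ-0042",
            legal_source="Additional Protocol I, Article 57 (precautions in attack)",
            statement="Weapon release shall require a positive target confirmation "
                      "recorded from an authorised operator.",
            responsible_role="Mission commander",
        ),
    ]

    # Traceability makes the chain of responsibility auditable at design time.
    for r in REQUIREMENTS:
        print(f"{r.req_id} <- {r.legal_source} (owner: {r.responsible_role})")

A trace of this kind is what lets an Article 36 review ask, for every behaviour of the system, which rule it implements and who is accountable for it.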

The lesson for producers of any autonomous system is that system specifications should be based on the legal limitations on a product’s use. These then underpin all design, test, upgrade and disposal phases. It can be done, and it will reduce liability risks to manageable levels. However, it requires lawyers, regulators, marketeers, engineers and computer scientists to collaborate closely throughout the whole product life cycle, from concept to disposal.

Should we sit back, do nothing new about our potential liabilities, let the courts decide them, and then retrospectively change our designs and design processes to meet their rulings? No! As professional engineers, we should be proactive and initiate change based on solid engineering principles, turning liability into a manageable risk like any other.

Tony Gillespie is a visiting professor in the Electronic & Electrical Engineering Department at UCL and author of the IET book Systems Engineering for Ethical Autonomous Systems. He was joint author of the first paper turning the Geneva Conventions into engineering requirements, and has written several papers on AI, autonomy and responsibilities.
