Draft conclusions
for inclusion in the 2023 report of the GGE
Status date: 25 April 2023
Conclusions
1. The Group recalled the decision of the Sixth Review Conference in December 2021
and by the High Contracting Parties of the Convention in November 2022 that the work of
the open-ended Group of Governmental Experts related to emerging technologies in the area
of lethal autonomous weapon systems established by Decision 1 of the Fifth Review
Conference as contained in document CCW/CONF.V/10, adhering to the agreed
recommendations contained in document CCW/CONF.V/2, is to continue in order to strengthen
the Convention.
2. In accordance with its mandate, and as described in the following paragraphs, in the
context of the objectives and purpose of the Convention, the Group intensified the
consideration of proposals and elaborated, by consensus, possible measures, including taking
into account the example of existing protocols within the Convention, and other options
related to the normative and operational framework on emerging technologies in the area of
lethal autonomous weapons systems, building upon the recommendations and conclusions of
the Group of Governmental Experts related to emerging technologies in the area of lethal
autonomous weapons systems, and bringing in expertise on legal, military, and technological
aspects.
3. In this regard delegations presented and discussed a number of proposals, in addition
to proposals presented in previous years. In accordance with its mandate, on the basis of the
intensified consideration of proposals, the Group elaborated the following elements for
possible measures and other options, without prejudice to additional measures in the future,
related to emerging technologies in the area of lethal autonomous weapon systems:
Characterization of LAWS
Recommendations and conclusions:
The role and impacts of autonomous functions in the identification, selection or engagement
of a target are among the essential characteristics of weapon systems based on emerging
technologies in the area of lethal autonomous weapon systems. Identifying and reaching a
common understanding on the concepts and characteristics of lethal autonomous weapons
systems could aid further consideration of the aspects related to the emerging technologies
in the area of LAWS. (2019 Report, 19a and 19b)
4. The rules and principles of IHL apply to all weapons and are, therefore, independent from
the military technology used. Thus, a technology-neutral approach should be applied when
considering the implications of emerging technologies in the area of LAWS.
5. Possible measures and other options should address weapon systems that, once activated,
are able to identify, track, engage and apply force to targets, without human intervention.
Application of International Humanitarian Law (IHL):
Recommendations and conclusions:
International humanitarian law continues to apply fully to the potential development of lethal
autonomous weapons systems. (2019 Report, Annex III, Guiding Principle “a”);
A weapons system based on emerging technologies in the area of lethal autonomous weapons
systems must not be used if it is of a nature to cause superfluous injury or unnecessary
suffering, if it is inherently indiscriminate, or if it is otherwise incapable of being used in
accordance with the requirements and principles of IHL. (2019 Report, 17h)
The potential use of weapons systems based on emerging technologies in the area of lethal
autonomous weapons systems must be conducted in accordance with applicable
international law, in particular international humanitarian law and its requirements and
principles, including inter alia distinction, proportionality and precautions in attack. (2019
Report 17a).
The right of parties to an armed conflict to choose methods or means of warfare is not
unlimited. Furthermore, international law, in particular the United Nations Charter and
International Humanitarian Law (IHL) as well as relevant ethical perspectives, should
continue to guide the work of the Group. (2022 Report, 18)
6. The requirements and principles of IHL, including inter alia distinction, proportionality
and precautions in attack, must be respected in the development, deployment and use of
weapon systems based on emerging technologies in the area of lethal autonomous weapon
systems. Such weapons systems must not be developed, deployed or used if their effects in
attacks cannot be anticipated and controlled, in accordance with international humanitarian
law, as required in the circumstances of their use.
7. To ensure compliance with international humanitarian law, including that the effects in
attacks can be anticipated and controlled as required in the circumstances of their use, the use
of a weapons system based on emerging technologies in the area of lethal autonomous
weapons systems must include measures, where appropriate, to:
(a) control, limit, or otherwise affect the types of targets that the system can engage;
(b) control, limit, or otherwise affect the duration, geographical scope, and scale of the
operation of the weapon system, including through the incorporation of self-destruct, self-
deactivation, self-neutralisation, or equivalent mechanisms;
(c) establish clear procedures for trained human operators to activate or deactivate functions
in weapons systems so as to enhance control or improve decision-making over the use of
force, where necessary to comply with international humanitarian law in the circumstances;
and
(d) ensure that the system is sufficiently predictable, reliable, understandable, explainable,
and traceable.
Human-machine interaction:
Recommendations and conclusions:
Human-machine interaction, which may take various forms and be implemented at various
stages of the life cycle of a weapon, should ensure that the potential use of weapons systems based
on emerging technologies in the area of lethal autonomous weapons systems is in compliance
with applicable international law, in particular IHL. In determining the quality and extent of
human-machine interaction, a range of factors should be considered including the
operational context, and the characteristics and capabilities of the weapons system as a
whole. (2019 Report, Annex III, Guiding Principle “c”)
8. Human-machine interaction in the use of lethal autonomous weapon systems must be
consistent with the implementation of the requirements and principles of distinction,
proportionality, and precautions in attack.
9. Those responsible for the use of a weapons system based on emerging technologies in the
area of LAWS must be in a position, where feasible, to control, interrupt or disable the system
or system functions, as necessary to comply with international humanitarian law.
Responsibility and accountability:
Recommendations and conclusions:
Accountability for the use of force in armed conflict must be ensured in accordance with
applicable international law, including through the operation of any emerging weapons
systems within a responsible chain of command and control. (2018 Report 23e)
States, parties to armed conflict and individuals remain at all times responsible for adhering
to their obligations under applicable international law, including IHL. States must also
ensure individual responsibility for the employment of means or methods of warfare
involving the potential use of weapons systems based on emerging technologies in the area
of lethal autonomous weapons systems in accordance with their obligations under IHL;
(2019 Report, 17c);
Human responsibility for decisions on the use of weapons systems must be retained since
accountability cannot be transferred to machines. This should be considered across the entire
life cycle of the weapons system. (2019 Report, Annex III, Guiding Principle “b”)
Accountability for developing, deploying and using any emerging weapons system in the
framework of the CCW must be ensured in accordance with applicable international law,
including through the operation of such systems within a responsible chain of human
command and control. (2019 Report, Annex III, Guiding Principle “d”).
Necessary investments in human resources and training should be made in order to comply
with IHL and retain human accountability and responsibility throughout the development
and deployment cycle of emerging technologies (2018 Report 23g).
Every internationally wrongful act of a State, including those potentially involving weapons
systems based on emerging technologies in the area of LAWS, entails the international
responsibility of that State, in accordance with international law. In addition, States must
comply with international humanitarian law. Humans responsible for the planning and
conducting of attacks must comply with international humanitarian law. (2022 Report, 18)
10. States should ensure accountability over the use of lethal autonomous weapons systems
through, inter alia:
(a) operating those systems within a responsible command and control chain;
(b) designing weapons systems in a way that enables identification of the chain of command,
so as to allow for the attribution of responsibility for the consequences of their use to
individuals and States under international law;
(c) providing adequate training to users on the system's functioning, including the limitations
of its sensors and the circumstances that trigger the application of force;
(d) ensuring individual responsibility through the implementation of relevant laws and
procedures; and
(e) establishing mechanisms for reporting and investigating incidents specific to lethal
autonomous weapons systems that may involve violations of IHL.
Legal reviews, risk mitigation and confidence-building measures
Recommendations and conclusions:
In accordance with States’ obligations under international law, in the study, development,
acquisition, or adoption of a new weapon, means or method of warfare, determination must
be made whether its employment would, in some or all circumstances, be prohibited by
international law. (2019 Report, Annex III, Guiding Principle (e));
Possible good practices in the conduct of legal reviews, at the national level, of a potential
weapons system based on emerging technologies in the area of lethal autonomous weapons
systems to determine if its employment, in light of its intended or expected use, would be
prohibited by the requirements and principles of IHL in all or some circumstances. (2019
Report 18(c));
Legal reviews, at the national level, in the study, development, acquisition or adoption of a
new weapon, means or method of warfare are a useful tool to assess nationally whether
potential weapons systems based on emerging technologies in the area of lethal autonomous
weapons systems would be prohibited by any rule of international law applicable to that State
in all or some circumstances. States are free to independently determine the means to conduct
legal reviews although the voluntary exchange of best practices could be beneficial, bearing
in mind national security considerations or commercial restrictions on proprietary
information. (2019 Report 17(i));
Weapons systems under development, or modifications that significantly change the use of
existing weapons systems, must be reviewed, as applicable, to ensure compliance with IHL.
(2018 Report 23(c))
In the context of the CCW, delegations raised a diversity of views on potential risks and
challenges posed by emerging technologies in the area of lethal autonomous weapons
systems, including harm to civilians and combatants in armed conflict in contravention of
IHL obligations, the exacerbation of regional and international security dilemmas through
arms races, and the lowering of the threshold for the use of force.
Proliferation, acquisition and use by terrorists, vulnerability of such systems to hacking and
interference, and the possible undermining of confidence in the civilian uses of related
technologies were also raised. (2019 Report, 25a)
11. Legal reviews of weapon systems based on emerging technologies in the area of lethal
autonomous weapon systems must seek to assess whether the systems are capable of being
used in conformity with international humanitarian law and other applicable international
law. Legal reviews should consider the system's technical performance, intended use, and
possible tasks and types of targets. In this context, the exchange of relevant best practices
between States could enhance the adequacy of compliance assessments.
12. Risk mitigation measures should include mechanisms to ensure that the system is
sufficiently predictable, reliable, understandable, explainable, and traceable, so as to avoid
uncertainty regarding the anticipated functioning of the weapon system in its operational
environment.
13. Risk mitigation measures should also address the risks of unintended engagements,
including through controls, limits, or other measures affecting the types of targets the system
can engage. They should also aim to reduce automation bias, as well as unintended bias in
any artificial intelligence capabilities related to the use of the weapon system.
14. States should also consider using risk assessments and mitigation measures during the
design, development, testing, and deployment of weapon systems based on emerging
technologies in the area of lethal autonomous weapon systems. These assessments and
mitigation measures could be integrated into research and development through
interdisciplinary perspectives and include ethics reviews. Moreover, risk assessments and
mitigation measures should also involve consideration of the risks of civilian casualties and
precautions to help minimize the risk of incidental loss of life, injuries to civilians, and
damage to civilian objects.