May 12, 2026

From Principles to Protocol: Pakistan’s Case for a Legally Binding Instrument on Lethal Autonomous Weapons Systems

The ongoing debate on Lethal Autonomous Weapons Systems (LAWS) is at the intersection of arms control and international humanitarian law (IHL). These systems, capable of selecting and engaging targets independent of real-time human intervention once activated, test the adequacy of existing arms control and legal frameworks. The international community has been discussing the challenges posed by LAWS and ways to restrain them since 2013, but no legally binding instrument has yet been concluded.

The Group of Governmental Experts (GGE), a multilateral expert forum to study and deliberate on issues related to autonomous weapons, was established under the United Nations Convention on Certain Conventional Weapons (CCW) in 2016. An open-ended body with no fixed membership, it allows participation from all High Contracting Parties to the Convention (the states that have ratified and are legally bound by the CCW) as well as relevant observers. The CCW has 128 High Contracting Parties. Since 2017, the GGE has met annually for sessions of up to ten days, with around 80-90 states participating per session. Pakistan has been a consistent participant in all GGE sessions since 2017.

Agreed in 2023, the GGE's current mandate tasks it with developing a legally binding instrument on LAWS, structured as a dedicated protocol under the CCW. The deadline for concluding this framework is 2026, ahead of the Seventh CCW Review Conference in November of this year.

As a party to the CCW, Pakistan has been among the most proactive states in advancing the case for a binding instrument on LAWS. While non-binding measures such as political declarations or codes of conduct may signal intent, they lack enforceability, leaving critical gaps in accountability and compliance. The UN Secretary-General's 2024 report on LAWS highlights two essential functions of a binding framework: first, developing international law in response to technological change; and second, clarifying how IHL applies to autonomous systems. The CCW negotiations augment, rather than replace, existing IHL obligations.

Consistent with its long-standing commitment to arms control and disarmament, Pakistan was the first country to call for a ban on autonomous weapons, at a 2013 UN Human Rights Council meeting, three years before the GGE was formally established. As President of the CCW's Fifth Review Conference in 2016, represented by Ambassador Tehmina Janjua, Pakistan helped secure multilateral support for establishing the Group of Governmental Experts on LAWS, a critical step toward the normative framework it has long sought. In parallel, Pakistan hosted a side event on the margins of the 2023 UNGA First Committee specifically to address security risks arising from military applications of Artificial Intelligence (AI) and LAWS and to map the normative guardrails needed to address them, a convening role few developing states have taken on.

Advancing its commitment to concrete progress on LAWS, Pakistan submitted a working paper titled “Proposal for an International Legal Instrument on Lethal Autonomous Weapons Systems (LAWS)” in 2023 to the GGE, which laid out a structured legal framework spanning prohibitions, regulations, and accountability mechanisms. In the paper, Pakistan proposed an outright ban on any autonomous weapon system that “takes decisions on the use of force without human control, that cannot distinguish between civilians and combatants, or whose effects cannot be adequately predicted, understood and explained.”

For systems that do not meet the threshold for prohibition, the paper proposed concrete restrictions. Weapons must retain the ability to be interrupted by a human at all stages of operation; targeting parameters must not be altered without explicit human approval; and systems must be designed to operate only in environments where civilians are not present. On accountability, the paper stated that humans, rather than machines, must remain accountable for the consequences of an attack.

Pakistan's proposal was among a small number of submissions in 2023 to present a fully structured instrument draft, and one of even fewer to do so as a single state rather than as part of a coalition, reflecting a degree of foresight that has since been borne out by the trajectory of the negotiations. Building on the 2023 paper, Pakistan's 2024 submission to the GGE proposed detailed elements for a new instrument to be adopted as Protocol VI of the CCW. It advocated for a functional definition of LAWS, explicit prohibitions on systems operating outside meaningful human control, and binding state obligations for oversight, investigation, and accountability for violations.

Nearly a decade of discussions has now led to tangible convergence, reflected in the Chair's rolling text, an evolving draft incorporating States' shared positions. Since 2024, this text has guided deliberations and outlines emerging consensus on key elements, including a working definition of autonomous weapon systems and a two-tier regulatory approach distinguishing between the prohibition of fully autonomous systems and the regulation of semi-autonomous systems. The September 2025 joint statement by 42 States, representing roughly one-third of the CCW's membership, expressed readiness to move toward formal negotiations on the basis of this text.

In recent conflicts, civilian casualty ratios have increased in strikes associated with automated or accelerated targeting processes. For example, by early November 2023, approximately one month into the Gaza conflict, the Israel Defense Forces (IDF) indicated that more than 12,000 targets in Gaza had been identified using the Gospel, an AI-enabled platform that identifies buildings, sites, and other physical structures for targeting. At the peak of operations, the IDF reported conducting strikes against as many as 250 targets per day.

Lavender, an AI-assisted target-generation platform designed to identify and profile humans as potential targets, flagged individuals who merely shared a name or a device with a known suspect. A companion AI system, "Where's Daddy?", tracked listed individuals in real time and triggered strikes specifically when targets had entered family residences, meaning the system was architecturally designed to produce strikes on civilian structures.

As a result, over 71,000 Palestinians, predominantly civilians, including women and children, were killed by direct Israeli strikes alone. This figure, acknowledged by an Israeli military official and consistent with Gaza Health Ministry data, excludes those who died from starvation or remained buried under rubble. According to critics, this civilian casualty pattern was predictable. When targeting operates at machine speed, human deliberation becomes ceremonial. Automated targeting, in this view, launders lethal discretion through an algorithmic intermediary that absorbs moral responsibility without possessing it.

Similarly, AI-integrated drone warfare transforms operators from decision-makers into passive observers of autonomous lethality. The Bumblebee drone, a semi-autonomous quadcopter deployed by Ukrainian forces, is reported to achieve a direct-hit rate exceeding 70 per cent through autonomous terminal guidance, with software capable of identifying and highlighting targets, including infantry, vehicles, and fortified positions, often before human operators can visually confirm them. Critically, such systems engage targets at speeds that structurally preclude real-time human verification, rendering any nominal oversight procedurally hollow. This displacement of human judgment is further entrenched by “pixel-lock” technologies, which allow drones to continue tracking and engaging targets even after communication loss, effectively removing the operator from the targeting loop once an attack is initiated.

Against this backdrop, the absence of a legally binding instrument risks widening the regulation gap with existing international law. Despite a decade of sustained deliberations and growing convergence on key issues, the GGE has not yet transitioned to formal treaty negotiations. The CCW’s consensus rule, under which any single state can block progress, has been a persistent structural constraint. According to the Stop Killer Robots-supported Automated Decision Research database, at least twelve States, including the United States, Russia, the United Kingdom, Australia, Japan, and India, do not currently support the negotiation of a legally binding instrument on LAWS.

The concept of meaningful human control, while reflected in the Chair’s rolling text, remains contested in its practical application. Pakistan and Ireland have been among the states arguing most consistently that elaborating this concept is precisely the GGE’s role, given the unprecedented nature of autonomous technologies and the evolving challenges they pose for IHL.

Pakistan’s record in the LAWS negotiations is consistent with its broader posture in disarmament forums. As with its role under the Organization for the Prohibition of Chemical Weapons (OPCW) National Authority Mentorship Programme, where it was among the first countries selected as a Mentor State, Pakistan has engaged in the LAWS process not only as a participant but also as a contributor to its emerging normative framework. Its working papers have introduced concepts that have informed broader negotiating texts. The Seventh CCW Review Conference in November 2026 will be a key test of whether nearly a decade of deliberations can be translated into a binding instrument, a question that hinges on political will as much as technical groundwork.
