Pentagon Demands ‘Appropriate Levels of Human Judgment’ When Dealing With Killer Robots

By Elena Rodriguez, Marco Trujillo; Edited by News Gate Team

The Pentagon is urging “appropriate levels of human judgment” when dealing with autonomous and artificial intelligence-led weapons. Here’s what that means.
  • The Department of Defense wants a transparent policy regarding military use of autonomous and artificial intelligence-led weapons.
  • The new directive is the first update in a decade, issued amid ever-changing technology.
  • The Pentagon is now focused on “appropriate levels of human judgment.”

The US Department of Defense calls them “autonomous systems.” Others call them “killer robots.” Whatever label you prefer, the Pentagon has updated its guidance on how it intends to handle autonomous and AI-driven weapons. And it may not be what the rest of the world wants to see.

In a press release announcing an update to DoD Directive 3000.09, Autonomy in Weapon Systems, Kathleen Hicks, deputy secretary of defense, states, “DoD is committed to developing and employing all weapon systems, including those with autonomous features and functions, in a responsible and lawful manner.”

She continues:

“Given the dramatic advances in technology happening all around us, the update to our Autonomy in Weapon Systems directive will help ensure we remain the global leader of not only developing and deploying new systems, but also safety.”

The update establishes a goal of minimizing the “probability and consequences of failures in autonomous and semi-autonomous weapon systems that could lead to unintended engagements,” according to the Pentagon. The directive takes into account significant technological advancement over the previous ten years.

Despite calls from around the world for a total ban on autonomous weapons, the United Nations has debated the idea for years without resolution. The key issue is the degree of human involvement with autonomous weapons, along with the worry that they could go rogue. For the U.S., that degree of human involvement needs only to be “appropriate.”

The directive stipulates that “autonomous and semi-autonomous weapon systems will be designed to allow commanders and operators to exercise appropriate levels of human judgment over the use of force.” Those who authorize the use of such systems “shall do so with appropriate care and in accordance with the law of war, applicable treaties, weapon system safety rules, and applicable rules of engagement,” according to the DoD.

The DoD directive still permits the employment of systems with AI capabilities as long as they adhere to the DoD’s AI Ethical Principles and the Responsible AI Strategy and Implementation Pathway.

