The U.S. Army put out a request to companies for ideas on how to improve its planned autonomous, artificial-intelligence-controlled targeting system for tanks.
The Army requested help to enable the Advanced Targeting and Lethality Automated System (ATLAS) to “acquire, identify, and engage targets at least 3X faster than the current manual process.”
In a move spotted by the news website Defense One, the Army later added a disclaimer to the solicitation. Without modifying the original wording, it appended a note explaining Department of Defense policy on autonomous weapons: the Pentagon may still be building killer robots, but they have to follow the military's ethical standards. In other words, autonomous American killer robots still aren't allowed to go around murdering people yet.
The added language reads as follows:
“All development and use of autonomous and semi-autonomous functions in weapon systems, including manned and unmanned platforms, remain subject to the guidelines in the Department of Defense (DoD) Directive 3000.09, which was updated in 2017. Nothing in this notice should be understood to represent a change in DoD policy towards autonomy in weapon systems. All uses of machine learning and artificial intelligence in this program will be evaluated to ensure that they are consistent with DoD legal and ethical standards.”