Pentagon Study Urges ‘Immediate Action’ On Thinking Weapons; VCJCS Selva Cautious
WASHINGTON: Should the United States build physical and cyber Terminators, weapons that do not have a human in the loop? The unequivocal answer from the prestigious Defense Science Board is yes.
“This study concluded that DoD must accelerate its exploitation of autonomy—both to realize the potential military value and to remain ahead of adversaries who also will exploit its operational benefits,” the DSB study says. Machines and computers can process much more data much more quickly than humans can, “enabling the U.S. to act inside an adversary’s operations cycle.” And that is why it is “vital if the U.S. is to sustain military advantage.” Ruth David of the National Science Foundation, coauthor of three books on signal-processing algorithms, and retired Air Force Maj. Gen. Paul Nielsen co-authored the study.
Autonomy and human-machine assistance are, of course, core elements of the Pentagon’s Third Offset Strategy.
Vice Chairman of the Joint Chiefs of Staff Gen. Paul Selva repeated his cautious embrace of autonomous weapons today at the Center for Strategic and International Studies. Breaking D readers will remember his use of the wonderful term “Terminator Conundrum” to describe the ethical issues the military faces as it allows weapons to make battlefield decisions without a human in the loop. Today, I asked him again whether the US should pursue treaty-based or other international restrictions on such weapons, and he didn’t address it directly.
Selva appeared to agree with Frank Kendall, the head of Pentagon acquisition, who worries that enemies will not care as much about the ethical niceties of allowing a robot to kill human beings. He said “there will be violators” of any agreement, as there are with chemical weapons and other banned weapons. Syria and Daesh (known to some as ISIL) have both used chemical weapons this month.
The DSB study addresses this issue of what I think we can call “trust.” Its authors say that, “when something goes wrong, as it will sooner or later, autonomous systems must allow other machine or human teammates to intervene, correct, or terminate actions in a timely and appropriate manner, ensuring directability. Finally, the machine must be auditable—in other words, be able to preserve and communicate an immutable, comprehensible record of the reasoning behind its decisions and actions after the fact.”
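The DSB’s twin requirements here—directability (a human or machine teammate can intervene or terminate) and auditability (an immutable, comprehensible record of the reasoning behind decisions)—can be illustrated with a minimal sketch. This is a hypothetical illustration, not anything from the study: a hash-chained, append-only decision log in which each entry records who acted and why, a human override is itself just another logged action, and tampering with any earlier entry breaks the chain.

```python
import hashlib
import json

class AuditLog:
    """Append-only, hash-chained record of an autonomous system's decisions.

    Each entry embeds the hash of the previous entry, so any after-the-fact
    edit to the record is detectable -- a toy version of the DSB's
    'immutable, comprehensible record' requirement.
    """

    GENESIS = "0" * 64  # placeholder hash for the first entry

    def __init__(self):
        self.entries = []

    def record(self, actor, action, rationale):
        """Log a decision (or a human override) with its stated rationale."""
        prev_hash = self.entries[-1]["hash"] if self.entries else self.GENESIS
        body = {"actor": actor, "action": action,
                "rationale": rationale, "prev": prev_hash}
        digest = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        self.entries.append({**body, "hash": digest})

    def verify(self):
        """Re-walk the chain; returns False if any entry was altered."""
        prev = self.GENESIS
        for e in self.entries:
            body = {k: e[k] for k in ("actor", "action", "rationale", "prev")}
            digest = hashlib.sha256(
                json.dumps(body, sort_keys=True).encode()).hexdigest()
            if e["prev"] != prev or e["hash"] != digest:
                return False
            prev = e["hash"]
        return True

# The machine acts, then a human teammate intervenes -- both are auditable.
log = AuditLog()
log.record("uav-7", "track_target", "matched signature, 0.92 confidence")
log.record("operator", "terminate_action", "human override: ambiguous ID")
print(log.verify())  # True: the chain is intact
```

The design choice worth noting: the override is not a side channel but an ordinary entry, so the after-the-fact record shows not just what the machine did but when and why a teammate stepped in.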
But what are the possibilities behind autonomous weapons? The study offers a neat summary:
“Imagine if…
“We could covertly deploy networks of smart mines and UUVs to blockade and deny the sea surface, differentiating between fishing vessels and fighting ships…
“…and not put U.S. Service personnel or high-value assets at risk.
“We had an autonomous system to control rapid-fire exchange of cyber weapons and defenses, including the real-time discovery and exploitation of never-seen-before zero-day exploits…
“…enabling us to operate inside the ‘turning radius’ of our adversaries.
“We had large numbers of small autonomous systems that could covertly enter and persist in denied areas to collect information or disrupt enemy operations…
“…a ‘sleeper presence’ on call.
“We had large numbers of low-cost autonomous unmanned aircraft capable of adaptively jamming and disrupting enemy PNT capabilities…
“…destroying their ability to coordinate operations.
“We had autonomous high-performance computing engines capable of not only searching ‘big data’ for indicators of WMD proliferation, but of deciding what databases to search…
“…to provide early warning and enable action.
“And imagine if we are unprepared to counter such capabilities in the hands of our adversaries.”
Thanks to the current Defense Department budget situation, the DSB does not urge any new programs. Instead, it “recommends a set of experiments/prototypes that would demonstrate clear operational value across these operational challenges.”
And the military also needs to prepare “to counter autonomy employed by adversaries.” How close are we to autonomous weapons? Close. The Air Force Research Lab has already demonstrated the simulated ability of drones to defeat human pilots, as Breaking D readers know.
As the DSB study notes, commercial and governmental uses of the capability have blossomed. One of the most compelling:
“Footage from the estimated 52,000 government-operated closed circuit television (CCTV) cameras in the United Kingdom, along with the 1.85 million total cameras across the country, is used in as many as 75 percent of the 3.9 million criminal cases annually.”
Selva offered an example of how much the US military needs autonomy, with pretty striking similarities to the British example. If the US is to make use of the commercial satellite imagery it expects to get over the next 20 years, it would need eight million imagery analysts to make sense of the data, Selva said at CSIS. That ain’t gonna happen. Big data analytics will come to the rescue.