Developments in Military AI: What it Means and What it Doesn’t


Already, it’s been an eventful year for military AI applications.

The U.S. Army announced last month that it will be investing $72 million in basic research in artificial intelligence aimed at discovering new applications that would increase mission effectiveness across the Army, “augmenting Soldiers, optimizing operations, increasing readiness, and reducing casualties.” Coming just a month after the DoD unveiled its overall Artificial Intelligence Strategy, this series of developments is indicative of a broader, department-wide push to make AI a big part of what tomorrow’s military will look like.

In many ways, this is an exciting development for our armed forces. Implementing military AI solutions could make the organizational processes that backstop the military leaner and ultimately benefit the warfighter, while also answering peer competitor states’ deployment of military AI solutions in recent years.

Many observers, though, harbor concerns over the impact of placing some of the systems that our warfighters rely on – particularly weapons systems – even partially under the control of artificial intelligence. Does this mean that we’ve started down the road toward the Terminator movies? Or will it be business as usual?

With that debate in mind, here’s what you need to know about what is included in the DoD’s vision of military AI and what is not.

What Military AI Is

This is part of a larger push that the DoD is making to build AI into more and more of its systems. As outlined in the Artificial Intelligence Strategy, the focus is to automate some of the tedious or repetitive tasks that have historically been done by defense personnel, the hope being that automating this work will allow data to be processed more quickly. That, in turn, means equipment and their operators can react to situations faster, and processes across the Department’s infrastructure become more streamlined.

In fact, some of the uses that the Army has already piloted – and seen promising results from – include speeding up image identification and processing, enabling voice and language translation, detecting cyber threats, streamlining and automating supply chains, and aiding medical professionals in diagnosing cases.

So, while there is still a lot of progress to be made, both in developing the proper policies for deployment and in the fundamental science needed to fully realize these technologies, the work we’ve already seen from our military shows that AI could be a great assistive tool for defense personnel.

What Military AI Isn’t

One of the big reasons that military AI usage stirs up so much unease is that people are not comfortable with allowing machines to run weapons systems. That is, in fact, a reason why the military does not fully automate weapons systems, instead keeping a human involved in any decision to use deadly force.

Another “grave misperception” of military AI research, according to Air Force Lt. General Jack Shanahan, Director of the DoD’s Joint Artificial Intelligence Center, is that “in a back laboratory somewhere in the basement of a building, [the department] has got [an AI] with free will … that’s going to roam indiscriminately across the battlefield. We do not.” Instead, AI as we know it would work in much more specific applications, like helping a pilot and his or her ground crew predict their aircraft’s maintenance needs.

Before we go any further, no, DoD development of AI technologies won’t result in the activation of Skynet any time soon.

But the bleed-over effect of these concerns is that they may be needlessly hamstringing military AI’s potential. The problem, Assistant Secretary of the Army for Acquisition, Logistics, and Technology (ASA(ALT)) Bruce Jette remarked, is that by not trusting AI to control a lethal weapon, the military isn’t realizing the potential gains in weapon system reaction to stimuli, one of the key reasons why this is worth doing in the first place.

“Let’s say you fire a bunch of artillery at me, and I can shoot those rounds down, and you require a man in the loop for every one of the shots,” he said. “There are not enough men to put in the loop to get them done fast enough.”

It is entirely warranted to consider the implications of new technologies like AI with the proper moral, ethical, and legal reviews, particularly when they are applied to anything in the life-and-death trade that is the military. However, in many ways it is in the interest of the warfighter and the DoD at large to explore the fullest potential of what AI can do to make our military leaner, faster, and, consequently, more able to compete with other technologically advanced adversaries.