Defense Secretary Jim Mattis can get ticked off at times over stilted military terminology and its blizzard of unfathomable acronyms. Last week, he let loose at one of his pet peeves.
In this case, it was "Unmanned Aerial Vehicles," or UAVs, which is what the Defense Department calls drones.
Mattis didn't propose an alternative, or say he was going to order a name change. He just wanted it known that he doesn't like the term.
"One of the most misnamed weapons in our system is the unmanned aerial vehicle. It may not have a person in the cockpit, but there's someone flying it," Mattis told reporters traveling with him on the trip back from a series of security conferences in Europe.
In addition to the person on the joystick flying the thing, "there's someone over their shoulder. There's actually more people probably flying it than a manned airplane," he said.
Then "there's all these people taking the downloads from it. There's people deciding to load bombs on it or not, or ISR cameras, surveillance cameras, on it," Mattis said.
Bottom line: "It's not unmanned," he said.
Mattis went off on UAVs in the middle of a riff on artificial intelligence (AI), the thinking and learning robots that wreak mayhem in the movies, and what it could mean for the future of warfare if the technology actually works.
The SecDef wouldn't make a prediction, but said he doubts humans could ever be totally removed from the AI equation when it comes to war. He went back to basics:
"The fundamental nature of war is almost like H20, OK, and you know what it is. It's equipment, technology, courage, competence, integration of capabilities, fear, cowardice -- all these things mixed together into a very fundamentally unpredictable, fundamental nature of war," he said.
"The character of war changes all the time," Mattis added. "An old dead German [Carl von Clausewitz] called it a chameleon, OK? He said it changes to adapt to its time, to the technology, to terrain -- all these questions" that have to be considered and answered before AI can be introduced on the battlefield.
So what happens when it reaches the point "where machines learn and adapt quickly, where they're somewhat just released" against an enemy?
"I don't know right now, because at some point there's going to be a human who does something -- even if it's nothing more than open the garage door and let them out," Mattis said. Still, the concept of AI is unsettling, he said. "If we ever get to the point where it's completely on automatic pilot and we're all spectators, then it's no longer serving a political purpose. And conflict is a social problem that needs social solutions -- people, human solutions, OK?"
Mattis said he doesn't have any answers right now on AI and war, "but I'm certainly questioning my original premise that the fundamental nature of war will not change. I just don't have the answers yet."
-- Richard Sisk can be reached at Richard.Sisk@Military.com.