Wars of the Future Will Be Fought by Robots and Hackers
In March, leading military researchers hailing from institutions like the NSA, DARPA, and the U.S. Naval Academy convened at the University of Maryland to attend a two-day workshop funded by the U.S. Army to discuss what they think ground warfare will look like in 2050.
The participants were instructed to imagine themselves waking up in 2050 “in the middle of an on-going battle” and to describe what they envisioned, with the results compiled in a written report.
A large number of themes emerged in the ensuing discussion, some of them—cyborg soldiers, autonomous weapons—already familiar to the general public, and others, such as cognitive modeling of the enemy, more obscure. But one observation was repeated again and again: hackers will play a decisive role in the wars of the future.
Remote-controlled weapons are already widely used in the military—drones being the most notable example—and as artificial intelligence becomes more sophisticated, researchers expect that most combat operations, including complex tasks like coordination and communication, will be outsourced to machines by 2050.
“The tasks that these agents would perform include filtering information, fact checking, fusion, dynamic access control, and adaptive information dissemination,” states a report on the workshop. “Automated processes will task sensors and alter communications paths and priorities based upon their (machine) understanding of mission intent and context.”
But human soldiers are not expected to be phased out entirely. Researchers predict that a small number of “super-human” cyborgs will still participate in fights on the ground.
Road to the Terminator
However, unlike human soldiers, robots are vulnerable to hacking, and hijacking an enemy’s combat units is a far more efficient tactic than destroying them. The researchers speculate that much of the warfare of the future will consist of duels between hackers trying to take over each other’s machines, and that an arms race between offensive and defensive cybersecurity measures will inevitably follow.
Potential counters to this dramatically increased automation of key battlefield processes include “spoofing and denial of service attacks for information-dependent processes,” the report states. “Counter-countermeasures include developing an increased ability to filter out extraneous and unauthenticated messages and a better understanding of how these automated processes work under various stresses and attacks so that they can be made more agile.”
The 20th century saw a dramatic transition in military intelligence, as machine-gathered data—from radar, spy-plane photographs, and satellites—replaced the sensory data that humans could gather by themselves, and researchers expect the trend to continue deep into the 21st. Because machines often lack the common sense that humans have, misinformation tactics designed to mislead the enemy are expected to become increasingly common.
The endgame for robot warfare would be the creation of completely autonomous robots that are more or less independent of their human controllers—a robot that doesn’t need constant instruction from a human controller could turn off communication with the outside world, making it less vulnerable to hacking.
“Countermeasures would involve increasing the agility of individual robots by enabling dynamic repurposing and/or building in an override feature that could be exercised by human controllers,” the report states.
In sum, the report sketches a “natural” pathway by which autonomous killer robots could become widely used. They are simply the necessary consequence of an arms race between nation-states: automated weapons are superior to manual ones; automated weapons controlled remotely by humans are hackable; autonomous machines that receive minimal outside input and communication are the least vulnerable to hacks; release the Terminator.
If you think that allowing swarms of autonomous killer machines to roam the earth sounds like a bad idea, you’re not alone. This week, an open letter signed by tech luminaries like Elon Musk and Steve Wozniak as well as scientists like Stephen Hawking implored the United Nations to pass restrictions on autonomous weapons, likening them to a Pandora’s box that could wreak havoc on the human race if development continues unchecked.