Why military ‘drone swarms’ raise ethical concerns in future wars

Gleb Garanich/Reuters
Employees of Swarmer prepare an artificial intelligence-enabled drone for flight, amid Russia's attack on Ukraine, in the Kyiv region, Ukraine, June 27, 2024.

As researchers apply artificial intelligence and autonomy to lethal aerial machines, their systems pose new questions about how much humans will remain in control of modern combat.


The proliferation of cheap drones in conflicts in Ukraine and the Middle East has sparked a scramble to perfect uncrewed vehicles that can plan and work together on the battlefield. 

These next-generation, intelligent “swarms” would represent a breakthrough in warfare. Rather than soldiers piloting individual uncrewed vehicles, they could deploy air and seaborne swarms on missions “with limited need for human attention and control,” according to a recent U.S. government report. It’s the “holy grail” for the military, says Samuel Bendett, an adviser to the Center for Naval Analyses, a federally funded research and development center.

It’s also an ethical minefield. As researchers apply artificial intelligence and autonomy to lethal machines, their systems raise the specter of drone armies and pose new questions about the role human control should play in modern combat. And while Pentagon officials have long promised that humans will always be “in the loop” when it comes to decisions to kill, the Defense Department last year updated its guidance to address AI autonomy in weapons. 

Why We Wrote This

Artificial intelligence-powered drone technology could eventually change warfare. But the autonomy of lethal machines raises serious ethical dilemmas around how, and whether, to regulate development, deployment, and use of AI.

“It’s a very high level of approval to even proceed with testing of a fully autonomous weapons system,” says Duane T. Davis, a senior lecturer in the computer science department at the Naval Postgraduate School in Monterey, California. But it does “provide for the possibility of completely autonomous weapons systems.”  

That’s largely because much U.S. military research is driven by fears of how adversaries may exploit their own swarm technology in a future conflict with the United States or its allies. The question going forward is whether the Pentagon can overcome the myriad technological challenges of drone warfare while also maintaining the ethics of a democratic state.

The concern is that China “is not going to wrestle with these same ethical decisions in the way that we will,” says Dr. Davis. 

What makes a swarm

Current group attacks by uncrewed military vehicles over battlefields – as well as the drone light shows now popping up as entertainment in night skies over the U.S. – are not intelligent swarms. The former are essentially salvos of slow-moving aerial “missiles,” each one operated by a human, with no machine-to-machine coordination or communication. The latter – a high-tech alternative to fireworks – are preprogrammed displays in near-ideal conditions, which aren’t particularly useful in a military setting, since an adversary can figure out how to counter them.

“For an enemy, that just means I’ve got a pattern of things I can shoot at, or they’re operating similarly, so it’s easier to predict what they’re going to do,” notes Bryan Clark, senior fellow at Hudson Institute. 

Swarms instead use an array of sensors to communicate drone to drone – and then switch to AI to plan and collaborate for attacks on the fly. They’re programmed to create a siege of overwhelming force from “a bunch of different angles – the way ants crawl all over a beetle, or whatever, to eat it,” says Zachary Kallenborn, a fellow at George Mason University’s Schar School of Policy and Government.
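To make that idea concrete, decentralized coordination of this kind can be illustrated with a toy model – an entirely hypothetical sketch, not any real military system. Each simulated agent knows only a shared target and the positions of its neighbors; attraction to the target plus repulsion from nearby agents produces a group that converges on the target from many angles at once, with no central controller.

```python
import math
import random

# Toy illustration of decentralized swarm behavior (hypothetical, not any
# real system): agents steer toward a shared target while pushing away from
# close neighbors, so the group approaches from many directions at once.

def step(positions, target, repel=1.5, speed=0.1):
    """Move every agent one step toward the target, repelling neighbors."""
    new_positions = []
    for i, (x, y) in enumerate(positions):
        # Attraction: unit vector pointing at the shared target.
        dx, dy = target[0] - x, target[1] - y
        dist = math.hypot(dx, dy) or 1e-9
        vx, vy = dx / dist, dy / dist
        # Repulsion: nearby agents push each other apart, which spreads
        # the approach angles out instead of forming a single file.
        for j, (ox, oy) in enumerate(positions):
            if i == j:
                continue
            rx, ry = x - ox, y - oy
            r = math.hypot(rx, ry) or 1e-9
            if r < repel:
                vx += rx / (r * r)
                vy += ry / (r * r)
        norm = math.hypot(vx, vy) or 1e-9
        new_positions.append((x + speed * vx / norm, y + speed * vy / norm))
    return new_positions

random.seed(0)
agents = [(random.uniform(-5, 5), random.uniform(-5, 5)) for _ in range(8)]
for _ in range(200):
    agents = step(agents, target=(0.0, 0.0))
# All agents end up clustered around the target, from different directions.
print(max(math.hypot(x, y) for x, y in agents))
```

Real swarm research layers far harder problems on top of this sketch – contested communication, sensing in clutter, and on-the-fly replanning – but the core principle is the same: simple local rules, no central pilot.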

Gleb Garanich/Reuters
The artificial intelligence-enabled drone from Swarmer flies in the region near Kyiv, Ukraine, June 27, 2024. U.S. defense planners say the use of drones in "swarms" that rely on AI to complete their missions would be a breakthrough that raises ethical questions about reduced human control of combat.

A big challenge for current drone operators on Ukraine’s battlefields is Russian jamming technology, which can sever communication between operators and drones – and between the drones themselves. To address this challenge, some researchers are working on ways for drones to observe and infer what other drones are doing.

The fog of war complicates visual observation. That’s why Theodore Pavlic of Arizona State University recently began studying weaver ants in Australia at the behest of U.S. Special Operations Command. As the ants swarm and transport their prey up trees, they sense each other’s presence without constantly looking around. 

They also cooperate and make decisions as a team. “If we can replicate that [with drones], you can basically hit go, and they will plan their own way,” says Dr. Pavlic, who also studies stingless bees and other types of ants. “If new challenges occur, then they can [set] temporary short-term goals to get around those challenges.” 

Bang for the buck

Building smart drones, with more onboard intelligence and computing power, means bigger and more expensive machines, and that has a downside. “Computers can only be so small, and you can only put so much power and payload onto a drone,” says Nisar Ahmed, director of the Research and Engineering Center for Unmanned Vehicles at the University of Colorado Boulder. 

Just for a drone to take off, for starters, requires roughly 10 times the energy that a world-class sprinter expends to run a 100-meter race, says Vijay Kumar, dean of the University of Pennsylvania’s engineering school. The result: Missions with aerial drones are currently limited in terms of distance and time. Since longer-range drones are expensive, cheaper drones that can stay aloft for an hour – or even 30 minutes – offer more bang for the buck.

Despite the challenges, researchers are making progress. Red Cat Holdings, a drone technology company in Puerto Rico, announced last year a system in which one person could operate four of its Teal drones, as opposed to today’s one-to-one ratio. The company aims to increase that ratio by pushing even more autonomy onto the machines themselves.

Embedding such autonomy in lethal machines, however, also poses ethical challenges about maintaining human oversight – particularly as the speed and complexity of drone decision-making increases. Humans, after all, don’t process information as quickly as machines, which may increase pressure to take humans out of the loop if, say, China or another adversary deploys AI-equipped drones capable of full autonomy.

The Pentagon hired an ethics officer in 2020 to grapple with precisely such challenges. Still, “We need more people thinking about them in the context of the military, in the context of international law, in the context of ethics,” says Margaret E. Kosal, a professor at the Sam Nunn School of International Affairs at the Georgia Institute of Technology and former science and technology adviser at the Defense Department. 

A machine gun analogy

What is clear is that the technology will continue to develop at breakneck speed, even as researchers wrestle with challenges specific to the battlefield of the day. Drones will change war the way the machine gun did more than a century ago, says George Matus, chief technology officer of Red Cat and founder of its Teal subsidiary. 

“Back then, a handful of gunners could defeat large numbers of even the mightiest cavalry. [Sometimes, even] today, a handful of drones can defeat a battalion of the mightiest armored vehicles before they even reach the front line.” In the future, intelligent swarms will prove even more effective, he adds.

While many researchers worry the technology is one more step toward all-out swarm warfare, Mr. Matus embraces the vision.

“The front line is going to become majority automated, if not fully automated,” he says. “There’s no doubt in my mind at least for the next couple of decades, this is going to be a very large part of the future of war.”

Others see it as an evolutionary step with more limited battlefield applications. “It is not fundamentally going to be a revolution in military affairs,” says Dr. Kosal. “That doesn’t mean we shouldn’t be worried.”
