Robots in war: Ethical concern, or a remedy for social ills?

Two new reports – one from the US Army's Research Laboratory and the other from the New America Foundation – look ahead to the role of robots and drones and the ramifications of their increasing use.

An MQ-1B Predator from the 46th Expeditionary Reconnaissance Squadron takes off from Balad Air Base in Iraq, June 12, 2008.

U.S. Air Force photo by Senior Airman Julianne Showalter/Reuters

July 28, 2015

In the year 2050, robots will be “ubiquitous” on the American military battlefield. So, too, will force fields and “super human” soldiers, predicts a new report out from the US Army’s esteemed Research Laboratory.

There will be drones, too, that will make use of “significantly greater capabilities of machine reasoning and intelligent autonomy than those existing today,” the report adds with the sort of upbeat note that the average reader could find a bit unsettling.

But while concerns about privacy – or, say, visions of a dystopian abyss – tend to go hand-in-hand with any discussion of drone proliferation, another new report by a respected Washington think tank finds that the vast quantities of information that these unmanned vehicles are able to gather could potentially be used “to improve the quality and character” of basic human rights.

These include property rights, as well as the rights of people in far-flung areas of the globe.

It seems clear, at least to Army warfare futurists, that drones and robots created for military use will be everywhere in 35 years, and will come in a vast variety of forms, from “insect-size entities” to large vehicles capable of transporting a platoon of soldiers.

Part of what the US military most wants to see from these robots is an ability to act and cooperate in teams, the goal of much research currently being conducted in Pentagon-sponsored labs throughout the country. For the insect-size robots, commanders would very much like to have the ability to send “swarms” of small drones out against enemy forces. Indeed, the report predicts that some robots will “operate in teams, like wolf packs.”

Humans will be part of the mix, too, notes the Army Research Laboratory report, “Visualizing the Tactical Ground Battlefield in the Year 2050,” whose authors believe that the “principal Army unit operating in 2050 will be mixed human-robot teams.” 

To enable humans to partner with robots, the human team members may be “enhanced” in a variety of ways, including through genetic engineering; they may also “feature exoskeletons” and “possess a variety of implants.” These implants could in turn allow “seamless access to sensing and cognitive enhancements.” 

While this sort of thing may be a dream come true for some military planners, it may seem unsettling, to put it mildly, to average citizens who wonder about the safety and ethics of “upgrading” soldiers’ bodies, as well as the sort of impact these innovations could have on society at large.

One major concern is that the reams of video collected by unmanned aircraft systems could be used against private citizens. In February, the White House issued a memorandum requiring agencies to ensure by Sept. 30 that policies are in place that would, among other things, “prohibit the collection, use, retention, or dissemination of data in any manner that would violate the First Amendment or in any manner that would discriminate against persons based upon their ethnicity, race, gender, national origin, religion, sexual orientation, or gender identity, in violation of law.”

A report from the New America Foundation offers an antidote to that alarm, arguing that the proliferation of drones could be a net plus for society if they are used in a civic spirit of enlightenment. 

Unmanned aerial vehicles, for example, “are able to gather large amounts of information cheaply and efficiently,” and these images and maps, say, can be used by communities “to improve the quality and character” of natural resources and human rights, the report argues, suggesting that drone surveillance could potentially “help conservationists protect endangered wildlife” or be “used by advocates and analysts to document and deter human rights violations.” The report notes that it does not address the use of armed drones by governments such as the United States.

The lack of these basic rights “is in part a consequence of political and social breakdowns” that are often “driven by informational deficits,” the report notes. The information transmitted by drones “can chip away at these deficits.”

Indeed, the report concludes, despite understandable privacy concerns created by the proliferation of UAVs, “There is great hope that drones, with the new capabilities they provide, might help protect the most vulnerable among us when their human rights are jeopardized.” The question, of course, is whether political leaders will use them as a tool for good.

The tricky issue of technological or biological enhancement is something military experts and analysts have been wrestling with for years. While the exoskeletons or implants of the future may allow unprecedented capabilities, efforts to use the technology and medicine of the day to enhance the human body date back decades, if not centuries.

“The use of human enhancement technologies by the military is not new. Broadly construed, vaccinations could count as an enhancement of the human immune system, and this would place the first instance of military human enhancement (as opposed to mere tool-use) at our very first war, the American Revolutionary War,” wrote Patrick Lin, coauthor of “Enhanced Warfighters: Risk, Ethics, and Policy,” in The Atlantic in 2012, referring to George Washington’s mandate that troops be vaccinated against smallpox, which the British were suspected of using as a biological weapon.

Caffeine and amphetamines, which are potentially addictive, also have been used to enhance soldiers’ endurance for decades.

But the ethical challenges that need to be addressed are many, writes retired Col. Dave Shunk in “Ethics and the Enhanced Soldier of the Near Future.”

Among the questions: “Do enhanced fighters have to give their consent for any type of enhancement? If so, how much consent? Can a warfighter refuse enhancement based on ethical grounds such as religious beliefs? Are there limits to who should be enhanced?” he writes. “Can service members keep their enhancements after leaving the service? What are the consequences when enhanced soldiers return to civilian life? What are the side effects and unintended consequences of enhancement? What are the long-term effects on the mental, emotional, and physical health of the enhanced soldier?”

And, in the case of experimental enhancements or those that pose long-term health risks, could they violate the basic rights of soldiers by “inhibiting their prospects for leading a normal life” after their term of service is over?