Autonomous weapons aren’t science fiction—they’re here and should be of concern, Mason expert says

Zak Kallenborn

A drone that autonomously attacked soldiers during a civil conflict in Libya last year raises concerns about the global use and spread of such weapons, said Zak Kallenborn, a Policy Fellow at George Mason University’s Schar School of Policy and Government.

According to a United Nations report, the KARGU-2 attack drone contributed to the casualties in the 2020 attack, and the manufacturer, STM Defense, claims the weapon has autonomous attack capabilities aided by artificial intelligence.

Kallenborn said this could be the first time a drone has autonomously attacked human targets.

“We don’t know if the weapon was used autonomously to attack people, but we know the weapon can,” Kallenborn said, adding that verification would be difficult. “Given the facts we agree on, the only real question is: Did one KARGU-2 operator decide to use the autonomous operation against humans?”

As a whole, Kallenborn said the incident shows the relative simplicity of autonomous weapons.

“It also shows that advocacy groups like the Campaign to Stop Killer Robots aren’t talking science fiction—the technology, and the concerns about it, are here,” he said.

Because of definitional complexities, Kallenborn said he doesn’t support a broad ban on autonomous weapons, but he added that the technology carries serious risks that need to be discussed, including the best approach to limiting them.

“The big question is where this technology heads next,” Kallenborn said. “What happens when multiple autonomous weapons are fused together into a massive drone swarm? What happens when autonomous weapons are given control over traditional weapons of mass destruction [such as] chemical, biological, radiological, and nuclear weapons?”

The tragedy in Libya has helped bring that conversation to the forefront.

“The massive media attention this event received is significant in its own right, even if it’s not a particularly spectacular or even clear example of autonomous weapons usage,” Kallenborn said. “However, if it galvanizes global attention to seriously discuss the risks and opportunities of autonomous weapons, that’s a major development.”

Content for this article was compiled from Kallenborn, his publications, and subsequent media interviews.

Zachary Kallenborn is a Policy Fellow at Mason’s Schar School of Policy and Government and an analyst in horrible ways people kill each other: weapons of mass destruction (WMD), WMD terrorism, and drone swarms. He is also a research affiliate with the Unconventional Weapons and Technology Division at the National Consortium for the Study of Terrorism and Responses to Terrorism (START), headquartered at the University of Maryland; an officially proclaimed U.S. Army “Mad Scientist”; and a national security consultant.

For more information, contact Mariam Aburdeineh at 703-993-9518 or

About George Mason

George Mason University is Virginia’s largest public research university. Located near Washington, D.C., Mason enrolls 39,000 students from 130 countries and all 50 states. Mason has grown rapidly over the past half-century and is recognized for its innovation and entrepreneurship, remarkable diversity and commitment to accessibility. Learn more at