Killer drones deciding on their own who lives and who dies might sound like a sci-fi dystopia, but it is already happening.
A recent UN report says militia in Libya, fighting against the internationally recognised government in Tripoli, were "hunted down and remotely engaged" by drones during a battle in March last year.
While drone warfare dates back to the 1990s, drones have always been controlled from afar by human beings. The difference in the Libyan incident is that the Turkish-made drones "were programmed to attack targets without requiring data connectivity between the operator and the munition".
In other words, they weren't being controlled by people: what the report called "a true 'fire, forget and find' capability".
The drone attack was mentioned in a 548-page report written by the UN's Panel of Experts on Libya, released in March. It went unnoticed until earlier this week, when New Scientist picked up on the intriguing detail.
It's not clear from the report whether anyone was actually killed by the truly autonomous drones, though they were certainly deployed alongside electronic jamming that prevented the militia from using drones of their own.
"The unmanned combat aerial vehicles and the small drone intelligence, surveillance and reconnaissance capability of HAF were neutralised by electronic jamming from the Koral electronic warfare system," the report said, referring to Turkish jamming technology that's also reportedly been deployed with great success in Syria.
According to a video showing off its capabilities, the only human input into the operation of the Kargu-2, the drone named in the report, is punching in its target coordinates.
"The drone will travel to those coordinates, identify likely 'targets', and execute a dive maneuver, swooping down on the target and blowing itself up as it detonates a shotgun-like explosive package," Popular Mechanics reported, summarising the video.
The UN report says the Turkish drones were instrumental in the government's victory over the militia, but broke a "totally ineffective" arms embargo on the north African nation.
"Remote air technology, combined with an effective fusion intelligence and intelligence, surveillance and reconnaissance capability, turned the tide... in what had previously been a low-intensity, low-technology conflict in which casualty avoidance and force protection were a priority for both parties."
The Campaign to Stop Killer Robots, an international effort to ban weapons that choose who to kill without human input, said it was "unacceptable".
Countries "must act in the interest of humanity by negotiating a new international treaty to ban fully autonomous weapons and retain meaningful human control over the use of force", founder Mary Wareham told the New York Times.
Another expert, freelance researcher and analyst Zachary Kallenborn, said it wasn't surprising the UN report was so vague.
"The first use of autonomous weapons in war won't be heralded with a giant fireball in the sky and dark words on how humanity has become Death, Destroyer of Worlds," he told Popular Mechanics, alluding to a famous line from the ancient Hindu text the Bhagavad Gita, made famous by J Robert Oppenheimer, who led the development of the atomic bomb.
"First use of autonomous weapons may just look like an ordinary drone. The event illustrates a key challenge in any attempt to regulate or ban autonomous weapons: how can we be sure they were even used?"