Should robots be allowed to take a human life, without direct supervision or command?
Science fiction met reality at the United Nations in Geneva overnight, where this question was debated at a meeting of the Human Rights Council.
UN special rapporteur Christof Heyns told the council that countries are developing armed robots that can kill without human choice or intervention, and urged governments to call a halt before it is too late.
"The possible introduction of LARs (lethal autonomous robots) raises far-reaching concerns about the protection of life during war and peace," Mr Heyns said. "If this is done, machines and not humans, will take the decision on who is alive or dies."
Mr Heyns presented a report on his research and called for a worldwide moratorium on the production and deployment of such machines, while nations figured out the knotty legal and ethical issues.
"War without reflection is mechanical slaughter," he said. "In the same way that the taking of any human life deserves - as a minimum - some deliberation, a decision to allow machines to be deployed to kill human beings deserves a collective pause worldwide."
Mr Heyns warned that if humans are taken "out of the loop" then it could make war more likely.
It was also unclear how these killer robots could be programmed to distinguish the enemy from innocent civilians.
And because robots lack the ability to act "out of compassion or grace" or to understand the bigger picture, they would never decide that a specific situation required greater leniency, even in wartime.
In his report, Mr Heyns said robots will be "the next major revolution in military affairs, on par with the introduction of gunpowder and nuclear bombs".
Officially, governments capable of producing lethal autonomous robots are not currently planning to use them.
Some argue that, "as a matter of principle, robots should not be granted the power to decide who should live and die," the report said – though others say that, used well, they could "even make armed conflict more humane and save lives on all sides".
Mr Heyns acknowledged that future generations of robots might be able to employ less lethal force, causing fewer unnecessary deaths, thanks to a greater ability to immobilise or disarm a target.
"LARs will not be susceptible to some of the human shortcomings that may undermine the protection of life," his report said. "Typically they would not act out of revenge, panic, anger, spite, prejudice or fear.
"Robots also do not rape."
During the debate Pakistan's council delegate Mariam Aftab – speaking on behalf of 56 Islamic states – said the international community should consider a complete ban, not just national moratoria.
Lethal autonomous robots would fundamentally change the nature of war, she said.
Pakistan has been the focus of anti-terrorism drone strikes. "The experience with drones shows that once such weapons are in use, it is impossible to stop them," said Ms Aftab.
Most of the delegates said they found the report interesting and worthy of further debate, though several said it would be better negotiated outside of a human rights forum.
The European Union delegate said the question would be more appropriately dealt with by arms control negotiations between states. Germany supported the idea of an international register for all unmanned systems.
Argentina warned of a potential killer robot arms race, and of possible use by terrorists.
The US delegate pointed out that some systems, such as the Aegis and Patriot surface-to-air missile defence systems, already have an "autonomous mode" that acts when a split-second response is needed.
Last November the US Department of Defense issued a policy directive for autonomous weapon systems, highlighting technical dangers such as "unintended engagements" (ie, killing the wrong person) and "loss of control of the system to unauthorised parties" (ie, enemies hacking your robots and turning them against you).
France said the "role of humans in the decision to fire must be retained", while the UK said existing laws and treaties were sufficient to govern lethal autonomous robotics.
Russia said such machines could "undermine legal order" but did not comment on the report's recommendations.
No 'killer robots' as such are yet known to exist, but precursor technology is already used in the US, UK, Israel and South Korea – and possibly in Russia and China.
Unmanned drones have their weapon systems controlled remotely by humans.
As the potential for autonomous weapons has grown, several organisations have started arguing for a ban or moratorium.
Last year Human Rights Watch issued a report on "Losing Humanity: the case against killer robots".
HRW employee Mary Wareham is co-ordinating the Campaign to Stop Killer Robots. She said this was a "day of firsts", including the first time governments have publicly discussed the issue.
"People have been concerned about this for quite a while now and it's come to fruition ... and it's had a really excellent response," she said. "One of our fears was that they would say 'why are we discussing this, is it really a problem'. But nobody said that. Many were asking how are we taking this forward, who's going to take this forward."
HRW will now campaign for governments that did not take part in the debate, including Australia, New Zealand and Canada, to make their positions clear.
"There is a debate going on between the technology people and the more traditional warriors, and it reflects an unease with the trend towards autonomy in warfare," Ms Wareham said. "There are quite a few military who are not happy about this."
A country needed to "champion" the issue on the world stage to move towards an international treaty, she said.