Heyns warned that autonomous killing machines, not yet in use on battlefields, could blur the lines of command in war crimes cases, and said action must be taken before the technology overtakes existing laws.
"Time is of the essence. Trying to stop technology is a bit like trying to stop time itself - it moves on," he said.
His report says modern technology allows ever greater distance to be put between the users of weapons and the lethal force they project.
That report is backed by the Campaign to Stop Killer Robots, a coalition of groups including Human Rights Watch, Amnesty International and Handicap International, which is calling for a halt to the development of weapons that take the decision to shoot and kill out of human hands.
There has already been heated debate on the ethical implications of pilotless aircraft such as the Predator and Reaper drones, which are controlled from an air force base in Nevada, thousands of kilometres away from the mountains where they unload their bombs and rockets.
Critics say this takes modern warfare too close to the realm of a computer game.
Ground robots now in use include the SGR-1, a machine gun-armed sentry robot that South Korea has installed along its border with North Korea.
While they are not quite the dead-eyed androids wandering the dystopian landscapes of Ridley Scott's Blade Runner, or the metal killing machines of the Terminator films, they are close enough to send a shiver down many a cinemagoer's spine.
"The biggest problem in robotics is that we've seen too much science fiction," said Rich Walker, managing director of the Shadow Robot Company which researches and develops robotics.
He pointed out that robots are already in use on battlefields, performing vital tasks such as bomb disposal, and said some responsive robots were little different from land mines and other booby traps, which are set up by humans and respond to stimuli.
"Autonomous robots should be seen as neither a good thing nor a bad thing," he said. "It's the way they are deployed."
- Independent