Red Cross alarmed over 'killer robots'


Countries must agree on strict rules on "killer robots" - autonomous weapons which can kill without human involvement, a top Red Cross official says.

Semi-autonomous weapons systems, from drones to tanks, have for decades been used to eliminate targets in modern-day warfare - but they all have human control behind them.

With rapid advancements in artificial intelligence, there are fears among humanitarians that it could be used to develop machines which independently decide whom to kill.

Yves Daccord, director-general of the International Committee of the Red Cross (ICRC), said this would be a critical issue in the coming years.

He said it raised ethical questions about delegating lethal decisions to machines and about accountability for their actions.

"We will have weapons which fly without being remotely managed by a human and have enough intelligence to locate a target and decide whether it is the right person to take out," Daccord said.

"There will be no human making that decision, it will be the machine deciding - the world will essentially be delegating responsibility to an algorithm to decide who is the enemy and who is not, and who gets to live and who gets to die."

Daccord said autonomous weapons crossed a moral threshold, as machines lacked human characteristics, such as compassion, that are necessary to make complex ethical decisions.

They lacked the human judgment needed to evaluate whether an attack was a proportionate response, to distinguish civilians from combatants, and to abide by the core principles of international humanitarian law, he added.

But supporters of autonomous weapons argue they will make war more humane.

They will be more precise in identifying and eliminating targets, will not fall prey to human emotions such as fear or vengeance, and will minimise civilian deaths, they say.

Reuters.