29 March 2008

Group Protests Autonomous Military Robots

SWORDS (Special Weapons Observation Reconnaissance Detection Systems) robots are equipped with either the M249 machine gun, which fires 5.56-millimeter rounds at 750 rounds per minute, or the M240, which fires 7.62-millimeter rounds at up to 1,000 per minute.

Anti-landmine campaigners to protest against war robots

London, March 29, 2008 (ANI) -- Reports indicate that an anti-landmine pressure group is going to campaign against military use of armed robots that make their own decisions about when to kill.

According to a report in New Scientist, the concerned group is known as Landmine Action, and is based in London.

This non-governmental organisation wants autonomous robots capable of killing people banned under the same kind of treaty that has outlawed landmines in over 150 countries.

They cite the case of machine-gun-wielding military robots, which are currently remotely controlled by soldiers.

But the US Department of Defense wants them in future to work without supervision, meaning they would have to make their own decisions about when to pull the trigger.

"Such robots are technologically similar to the latest generation of cluster bombs, against which Landmine Action and others are already campaigning," said Richard Moyes, Landmine Action's director of policy and research.

These bombs explode in the air, releasing tens of bomblets that descend by parachute.

Each bomblet uses infrared sensors to scan the ground below for heat sources and only detonates on landing if it finds any. If no heat sources are detected, the bomblet explodes high in the air to avoid creating a post-conflict hazard on the ground.

"But that decision to detonate is still in the hands of an electronic sensor rather than a person," said Moyes. "Our concern is that humans, not sensors, should make targeting decisions. Similarly, we don't want to move towards robots that make decisions about combatants and noncombatants," he added.

"We should not use autonomous armed robots unless they can discriminate between combatants and noncombatants. And that will be never," said Noel Sharkey, a roboticist at Sheffield University, UK.

Other experts agree with Sharkey.

According to Peter Kahn, a researcher on social robots at the University of Washington in Seattle, roboticists should stop taking research funds from the military.

"We can say no," he told delegates at a conference on Human-Robot Interaction in Amsterdam, the Netherlands, earlier this month. "And if enough of us say it, we can ensure robots are used for peaceful purposes," he added. (ANI)

Source.

Military robots are a threat to humanity.
Military robots violate Asimov's first law.
