It's war but not as we know it
It is the stuff that science fiction thrives on. Watch 'I, Robot', 'Terminator' or 'Battlestar Galactica' and you will see the same tale: robots, built by man to fight his wars, that then turn against humanity.
Even going back as far as the 1921 play that coined the word robot, 'Rossum's Universal Robots', Czech writer Karel Capek spun the parable of mechanical, highly intelligent slaves who mount a revolt and kill all humans but one. Most of us would brush those concerns off as nightmarish visions dreamt up by robo-phobes. But are we right to do so?
One man who fears that these cinematic flights of fancy could one day become a harsh reality is UK robot expert and University of Sheffield professor Noel Sharkey. "They pose a threat to humanity," he declares.
Intelligent machines deployed on battlefields around the world - from mobile grenade launchers to rocket-firing drones - can already identify and lock onto targets without human help.
The first three armed combat robots deployed to Iraq last summer - fitted with large-calibre machine guns and manufactured by US arms maker Foster-Miller - proved so successful that 80 more are on order, says Sharkey. Up to now, though, a human hand has been required to push the button or pull the trigger.
"Military leaders are quite clear that they want autonomous robots as soon as possible, because they are more cost-effective and give a risk-free war," he says.
South Korea and Israel both deploy armed robot border guards, while China, India, Russia and the UK have all increased the use of military robots.
Washington plans to spend $4bn by 2010 on unmanned technology systems, with total spending expected to rise to $24bn, according to the Department of Defence's (DoD) Unmanned Systems Roadmap 2007-2032, released in December 2007.
"It is the department's firm belief that the integration of all the unmanned domains - air, ground and sea - are the future of DoD integrated operations, not only from a systems perspective, but also from a joint-service perspective and, in many cases, a coalition perspective," says Dyke Weatherington, deputy director of the Unmanned Aircraft Systems Task Force.
Drone aircraft and ground-based robots are already proving their worth in Iraq and Afghanistan, Weatherington says. The report discusses those successes while pointing out additional requirements cited by combatant commanders.
Ronald Arkin of Georgia Institute of Technology, who has worked closely with the US military on robotics, agrees that the shift towards autonomy will be gradual.
"Robots in warfare are becoming the standard for the United States military of the future," Arkin says. "A congressional mandate requires that by 2010, one-third of all operational deep-strike aircraft be unmanned and by 2015, one-third of all ground combat vehicles be unmanned.
"While reducing the number of our soldiers on the battlefield seems at first an easy decision, there are many questions related to the viability of this approach. One concerns the issue of lethality, i.e., will intelligent robots be allowed to make decisions regarding the application of lethal force against humans in war without requiring direct human intervention (can a robot pull the trigger on its own)?" Arkin asks. "Can robotic soldiers ultimately be more humane (humane-oids?) than actual warfighters by incorporating a means for ensuring that the laws of war are strictly followed?
"Our laboratory is currently exploring these questions for the Army, while concurrently designing complex, multirobot mission software for the Navy."
He continues: "The real issue is whether the robot is simply a tool of the warfighter, in which case it would seem to answer to the morality of conventional weapons, or whether instead it is an active autonomous agent tasked with making life or death decisions in the battlefield without human intervention. To what standards should a system of this sort be held to, and where does accountability lie?
"Is it possible to endow these systems with a conscience that would reflect the rules of engagement, battlefield protocols such as the Geneva Convention, and other doctrinal aspects that would perhaps make them more humane soldiers than humans? I find this prospect intriguing."
Ban the bots
For Sharkey, the best solution may be an outright ban on autonomous weapons systems. "We have to say where we want to draw the line and what we want to do and then get an international agreement," he says.
Sharkey estimates that there are 4,000 to 6,000 military robots currently deployed in Iraq and Afghanistan. They are being controlled by people - but that could change. Sharkey suggests that in conflicts like these, autonomous robots can't differentiate between civilians and combatants.
"Leaving them with the decision about who to kill would come into conflict with the essential ethical principles of a 'fair war' that were established by the Hague and Geneva Conventions, as well as by other guidelines for the protection of civilians, wounded soldiers, the sick, mentally disabled and prisoners," he explains. "There are no systems with the optical or sensory capabilities to master these challenges."
He goes on to say that the further spread of armed robots will change the nature of conflict dramatically, and that this development underlines the need for international guidelines and norms to protect innocent bystanders.
This question is the premise of the preliminary report 'Autonomous Military Robotics: Risk, Ethics, and Design', written by researchers at the Ethics and Emerging Technologies Group at California Polytechnic State University.
"There are significant driving forces towards this trend," says Patrick Lin, one of the authors of the report. "The Congress 2010 deadlines apply increasing pressure to develop and deploy robotics, including autonomous vehicles; yet a 'rush to market' increases the risk for inadequate design or programming."
The preliminary report concentrates on the existing semi-autonomous (unmanned) robots used in war as a foundation for questions surrounding the possible risks of using fully autonomous combat robots that would have to make decisions a human would usually make.
According to the report, thousands of robots are being used today in Iraq and Afghanistan in "dull, dirty, and dangerous" jobs. The known military robots being used, the report says, are all semi-autonomous. This means that robots such as the unmanned Predator air drone can go on reconnaissance missions, but need human authority to fire missiles. In this case, Air Force pilots control the Predators from trailers outside Las Vegas.
The report suggests that these robots would not only save US soldiers' lives, but also protect civilians from soldiers who commit atrocities.
"To the extent that military robots can considerably reduce unethical conduct on the battlefield - greatly reducing human and political costs - there is a compelling reason to pursue their development as well as to study their capacity to act ethically," it reads.
"Those of us who work in robotics, now that we are becoming aware of all the ethical issues in addition to the technology, should take it seriously and try to make sure that our students also take them seriously," Lin says, "so that robots are used for the good of humanity."
"We are at a point of revolution in war, like the invention of the atomic bomb," says Peter Singer, military expert and author of the recently published book 'Wired for War'.
"What does it mean to go to war with US soldiers whose hardware is made in China and whose software is made in India?"
The US military has already made great strides in unmanning the battlefield. The US uses attack drones and bomb-handling robots, and custom war video games have been used as recruiting tools.
"You don't have to convince robots they are going to get 72 virgins when they die to get them to blow themselves up," Singer says. "When a robot dies, you don't have to write a letter to its mother," Singer quotes one unit commander as saying.
Rodney Brooks, chief technical officer at iRobot, is more optimistic. The firm takes its name from Isaac Asimov's 'I, Robot', whose laws of robotics state that robots must never harm humans; it also made the first mass-produced robotic vacuum cleaner. Brooks says there will never be a robot takeover because, by then, people will be part-computer, part-human.
Singer's exhaustively researched book, enlivened by examples from popular culture, ends with a hint that he's also worried.
"We are creating something exciting and new, a technology that might just transform humans' role in their world, perhaps even create a new species," he concludes.
"But this revolution is mainly driven by our inability to move beyond the conflicts that have shaped human history from the very start. Sadly, our machines may not be the only thing wired for war."
New war machine
The company behind the only armed robots in Iraq is rolling out a new model of gun-toting machine, built from the start for combat.
During the early days of the Iraq war, engineers at Foster-Miller, QinetiQ's North American division, modified their bomb-disposal machines to carry machine guns, grenade launchers or rockets.
After years of safety testing and modifications, three of these deadly SWORDS (special weapons observation remote reconnaissance direct action system) and TALON IV robots were recently sent to Iraq.
SWORDS is armed, remotely controlled, and integrated with features like the Telepresent Rapid Aiming Platform and multiple video cameras. The set-up is interchangeable with multiple weapons but currently can only be mounted with the M249 Squad Automatic Weapon, M240 machine gun, or Barrett .50 calibre rifle for armed reconnaissance missions.
The vehicle weighs 196lb (89kg) with the M249 mounted, and its top speed is approximately 5mph (8km/h). The system is also equipped with video cameras offering recording, infrared and zoom capabilities to assist in scouting missions.
SWORDS operates in multiple environments including sand, snow and rain. The system is not autonomous; it is manipulated by an operator controlling a small, portable console/terminal known as the operator control system (enclosed in a pelican case) to remotely direct the device and fire its weapons.
This remotely operated system improves the safety of deployed Joint Service EOD (explosive ordnance disposal) and Special Forces units as they conduct reconnaissance, perimeter security and surveillance operations.
SWORDS also provides sniper capability and improves the safety of US Forces disembarking from their armoured vehicles during patrols. Ultimately, the robot will extend the standoff distance between US forces and the enemy.
In 2006, SWORDS underwent and successfully completed safety certification testing, an operational assessment and further capability assessments at Aberdeen Proving Ground, Maryland, in preparation for its deployment abroad. SWORDS also underwent user training and evaluation with Special Forces in theatre.
In the summer of 2007, the upgraded SWORDS became the first armed robot to deploy in Iraq. As a result of the success of SWORDS in the field, the Army has solicited a future upgrade for additional robots.
With SWORDS, developers hope that advanced technology operated near-autonomously will act as a deterrent to terrorists who threaten US forces. These systems have the potential to save lives by using advanced telerobotics to move armed forces out of harm's way. Future systems may be a force multiplier to supplement already extensive US forces, not just in theatre but in combat zones all over the world.
The new robot, MAARS (Modular Advanced Armed Robotic System), features new software controls which allow the robot's driver to select fire and no-fire zones. The idea is to keep the robots from accidentally shooting a human. A mechanical range fan also keeps MAARS's gun pointed away from friendly positions.
The robot is also equipped with a GPS transmitter, so it can be seen on - and tap into - the American battlefield mapping programs, just like tanks and Humvees. These Blue Force Trackers have been credited with dramatically reducing friendly-fire incidents during the Iraq war.
MAARS comes with an extra fail-safe, which won't allow it to fire directly at its own control unit.
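In principle, the safeguards described above - operator-designated no-fire zones plus a fail-safe around the robot's own control unit - amount to simple geometric checks applied before any weapon release. The following sketch illustrates that idea only; all names, coordinates and the rectangular-zone model are illustrative assumptions, not Foster-Miller's actual MAARS software.

```python
# Hypothetical illustration of MAARS-style fire/no-fire zone checks.
# Zones, coordinates and the safety radius are invented for this sketch.
from dataclasses import dataclass


@dataclass
class Zone:
    """Axis-aligned no-fire rectangle in local grid coordinates (metres)."""
    x_min: float
    y_min: float
    x_max: float
    y_max: float

    def contains(self, x: float, y: float) -> bool:
        return self.x_min <= x <= self.x_max and self.y_min <= y <= self.y_max


def fire_permitted(target, no_fire_zones, control_unit, safety_radius=50.0):
    """Permit fire only if the target lies outside every no-fire zone
    and beyond a safety radius around the robot's own control unit."""
    tx, ty = target
    # Rule 1: never fire into an operator-designated no-fire zone.
    if any(zone.contains(tx, ty) for zone in no_fire_zones):
        return False
    # Rule 2: fail-safe - never fire toward the operators' own position.
    cx, cy = control_unit
    if (tx - cx) ** 2 + (ty - cy) ** 2 <= safety_radius ** 2:
        return False
    return True


# Usage: one no-fire zone around a friendly position, control unit at origin.
friendly = Zone(100, 100, 200, 200)
print(fire_permitted((150, 150), [friendly], (0, 0)))  # in no-fire zone: False
print(fire_permitted((10, 10), [friendly], (0, 0)))    # near operators: False
print(fire_permitted((300, 300), [friendly], (0, 0)))  # clear: True
```

A real system would of course work with GPS fixes, weapon bearing and the mechanical range fan rather than flat grid coordinates, but the layered either/or veto structure - any one check can block firing - is the essential design point.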