Run for cover! Robots with built-in firearms may soon become self-governing devices deployed in massive numbers worldwide. Yet letting robots pick whom to destroy on the battlefield is a recipe for disaster. Killer robots, flying robots… Three specialists tell the Voice of Russia what robots with self-determining weaponry would mean. Is the end of the world nearing?

Autopilot is already a built-in feature on many machines, but not on ones capable of firing weapons. “Here’s an example of a killer robot, a flying robot: go to GPS coordinates X and Y, and if you detect a heat signature there, release your weapon. These are really stupid robots, and that’s what’s scary about it, because one thing is, they use artificial intelligence, but there’s no way for them to discriminate between a military combatant or an insurgent and a civilian,” explained Noel Sharkey, Professor of Artificial Intelligence and Robotics at the University of Sheffield.

The trust of our nation would rest in the hands of machines that know only “if A, then do B” commands. If these robots are so primitive, why are militaries lured into building them with such precision? One clear reason is to decrease the number of troops sent into combat zones, thereby lowering the death rate.

“They’ve developed a plane called the Falcon HTV-2 that’s undergoing testing at the moment. It’s a totally unmanned combat plane, and they tested it at 13 thousand miles per hour. So that’s not supersonic, that’s hypersonic. You can’t have a pilot in that plane at that speed; it would rip them to pieces with the g-forces. At that kind of speed, it would be very difficult for humans to be watching and seeing what’s going on and having any kind of say or control,” Sharkey, who is also chairman of the International Committee for Robot Arms Control (ICRAC), told the Voice of Russia.
The US Department of Defense (DoD) recently released a report titled Autonomy in Weapon Systems, covering important details of both autonomous and semi-autonomous weapons. Yet the failures such systems could incur are listed only at the very end of the report. They include, but are not limited to, hacking, jamming, decoys, and spoofing. So even though today’s technology could eventually lead to the autonomous armed robots of tomorrow, those robots come with their own set of unpredictable flaws.

“Software today is extremely complicated, and it behaves in ways people don’t expect. This is another aspect of the problem, because even if we think we understand how a system behaves while it’s by itself, it then interacts with the other system. And we don’t know what the other system’s going to do; therefore we don’t know what our system will do when it sees the other system behaving in a way that it didn’t expect,” said Mark Avrum Gubrud, PhD, a physicist hired to research and write about this issue by Princeton University’s Program on Science and Global Security.

Bearing in mind the implications of autonomous weaponry, it is disturbing that the final decision is left in the hands of militaries and wealthy corporations. Fortunately, Human Rights Watch, together with the Harvard Law School International Human Rights Clinic, has released a 50-page report called Losing Humanity: The Case Against Killer Robots. The two organizations are calling for an international treaty against the use, production, and development of fully autonomous weapons. In addition, they would like nations to put policies and laws in place that would prevent uncontrollable armed robots from ever becoming operable.
“We don’t think machines, however high-tech they get, will be able to follow international law designed to protect civilians in war. We think they would undermine non-legal checks: for example, they don’t have compassion, which is a significant check on the killing of civilians. And it’s also very difficult to hold fully autonomous weapons, killer robots, accountable for their actions, so it creates an accountability gap which can undermine deterrence,” Bonnie Docherty, Senior Researcher in the Arms Division of Human Rights Watch, stressed to the Voice of Russia. Docherty added that some experts say such weapons could appear in 20 to 30 years, while others say cruder versions could appear within the next few years.

As a society, we have the technology and plenty of bright minds at work in the field of autonomy. But researchers strongly urge engineering teams to steer away from arming robots and instead have them do tasks more useful to humankind. “We ought to use robots to do all the boring and difficult laborious jobs, and you put people to work taking care of other people,” Gubrud told the Voice of Russia, and went on: “Using robots for replacing people where people are expensive, well, people are expensive, but you know, that’s us. We should be in the business of taking care of ourselves and taking care of each other.” Gubrud pointed out that robots could be used in fields such as agriculture, and Sharkey noted that we are already using them for surgical procedures.

Thus, the next step for us as Homo sapiens is to be proactive and create a ban on the use of uncontrollable devices, steering down a different path on which robots help society fulfill more valuable roles.
As for the battlefield, some duties need that special hint of human touch, something robots can only mimic, not understand, at least in the here and now.

Source: Voice of Russia