They Are the LAWS: Killer Robots and Trump
Let’s talk intelligent, killer robots. The United Nations already is. One of the technical terms the UN and partner organizations are using is lethal autonomous weapons systems, or LAWS. And for the time being, let’s set aside Isaac Asimov’s Three Laws of Robotics, because the robots we are looking at are specifically meant to be lethal weapons systems, not assistants who serve humans except when it harms them. Before you start thinking that this is just science fiction alarmism, remember that people think H.G. Wells may have influenced the creation of the atomic bomb, or that Aldous Huxley predicted mood-altering drugs. So we’re going to take this seriously, because you’d better believe that at a time when the United Nations is trying to establish its position, Trump will end up playing a role.
So here are some basic facts about LAWS:
There is no universal definition of what counts as a lethal autonomous weapons system. This will be one of the first issues the United Nations needs to address. For example, a landmine does not need a human’s command to detonate, but not everyone considers it an autonomous weapons system. For the most part, the United Nations will likely focus on robotic weaponry, meaning systems that can carry out a complex series of functions. An autonomous weapons system, for example, must detect and deliberately target something, which a landmine cannot. Human Rights Watch identifies three types of autonomous robotic weapons: human-in-the-loop systems, where robots select targets but deliver force only on a human’s command; human-on-the-loop systems, where robots target and deliver force under human supervision that can override them; and human-out-of-the-loop systems, where robots target and deliver force without any human oversight. We do not know for certain which of these types the United Nations will make the primary focus of an international agreement or treaty. Vague terms such as “meaningful human control” and the difficult distinction between autonomous and automated are also sticking points for delegations trying to decide which weapons systems the United Nations should even be discussing. The phrase “meaningful human control” comes from a UN report on peaceful assemblies that recommended: “Autonomous weapons systems that require no meaningful human control should be prohibited.” Some delegations believe that too concrete a definition will let developers find loopholes to continue building similar systems, while others believe that leaving the definition too vague could impede progress on unmanned devices meant for purposes like exploration.
The effort to ban LAWS is preemptive. Most of the weapons systems the United Nations wants to address do not yet exist, which adds to the difficulty of defining them. In many previous weapons bans, such as those on landmines or nuclear weapons, the devices had been around for decades before the international community developed a treaty banning them. A preemptive ban would not be the first, though: there were efforts to restrict air warfare before the first military plane was ever commissioned, and certainly before Giulio Gavotti carried out the first aerial bombardment from a plane, dropping grenades that injured no one.
The United Nations is not the only body concerned about killer robots. Other international organizations have released reports and statements urging action to ban these systems. Additionally, technology figures like Elon Musk and Stephen Hawking have urged the UN to ban killer robots before it is too late. Musk and Hawking are particularly worried that an arms race over artificial intelligence could spark World War Three.
Trump’s people aren’t all on board with a ban. Steven Groves served as Nikki Haley’s chief of staff before recently signing on as a Deputy White House Counsel, and he also worked as a senior fellow at the Heritage Foundation. That adds up to a highly impressive résumé, and it should signify that he is capable of putting serious thought into his positions. In 2015, he wrote a piece for the Heritage Foundation urging America to continue working on LAWS and to oppose any ban on them. The crux of his argument can be seen in this excerpt:
“The U.S. is a leader in the development of LAWS, and should continue to be so. That's the only way U.S. armed forces can retain a tactical and strategic advantage over its enemies in future conflicts. The U.S. delegation should block any effort to ban LAWS or regulate them out of existence.”
Groves also notes the lack of any single, concise definition of LAWS, the fact that other countries may continue working on LAWS even if the United States signs onto a ban, and the need for the United States military to remain competitive, all of which ties strongly to the Realist school of international relations theory. Groves’ proximity to Haley and Trump should be a strong indicator that this sentiment carries weight in the White House.
Quite frankly, Groves’ piece is well written and accounts for the possibility that LAWS could be programmed, in some ways, to conform to international laws of warfare. However, there is still a distinct possibility that, with the incorporation of artificial intelligence or through cyber-attacks on these systems, the robots would lose this conformity and begin breaking international law on their own initiative. Programming can fail, and there is no guarantee that LAWS produced by American companies would not end up harming American soldiers or civilians. In an even more likely scenario, there is no guarantee that when America deploys LAWS, the systems will be able to distinguish civilians from irregular and guerrilla fighters, who are becoming the new norm for combatants and may very well be wearing the same clothing.
Many of the concerns that Groves cites are the same ones that proponents of nuclear weapons use to justify their arguments. As the United Nations continues to work toward a ban, it will likely run into the same issues that have plagued the new nuclear weapons ban from earlier this year. One of the biggest is likely to be that the President of the United States will oppose any ban the United Nations tries to implement. Of course, the United Nations cannot explicitly compel any state, certainly not the United States, to adopt, sign, or ratify any ban it may produce, and it is just as likely that the international community will create a treaty outside the United Nations, as it did in Ottawa with landmines. Still, Trump may choose to attack a ban as an affront to national sovereignty, or to paint the United Nations as panicked fear-mongers and himself as a man of vision leading the way to a technological future. I beg you not to buy into this. The United Nations is working to prevent the proliferation of a technology that could wreak serious havoc and bring tremendous violence to a world already struggling to reduce the intensity of armed conflicts. That is what is at stake here.
The United Nations is meeting in Geneva as I write this piece. The Convention on Certain Conventional Weapons (CCW) does not yet cover killer robots in its text, and the Campaign to Stop Killer Robots, coordinated by Human Rights Watch, has criticized the CCW for “dragging its feet” at the international level. This does not mean there has been no progress at the national or corporate level. If you are interested in learning more about the Campaign to Stop Killer Robots, visit its website, StopKillerRobots.org, and its social media for updates. The Campaign is made up of 64 organizations in 28 countries, including Nonviolence International.
I genuinely believe this ca