Lethal Autonomous Weapon Systems, also known as “killer robots”, are capable of making combat decisions without human intervention. Autonomous weapons have been developed by many militarily advanced states, ranging from the US and Russia to Israel and South Korea, yet there is currently a dearth of international regulation governing their use. In general, proponents of autonomous weapons argue that they could reduce the number of soldiers on the battlefield, saving lives, and could be deployed in situations where human response may not be rapid enough. Yet it is undeniable that autonomous weapons raise a number of important security, moral and ethical questions. Can states guarantee that autonomous weapons are able to distinguish between combatants and non-combatants? Who should take responsibility for a misstep by an autonomous weapon? And perhaps most importantly, should the decision to take a human life ever be delegated to machines?
The idea of a Weapons of Mass Destruction Free Zone (WMDFZ) in the Middle East has been floating around since 1990. As conflicts in the Middle East draw in more and more parties, both from the region and outside it, there is growing hope that a WMDFZ agreement could mitigate unnecessary loss of life, especially among innocent civilians. However, resolutions on non-proliferation and the establishment of a WMDFZ have been indefinitely deferred, with no signs of progress. Making matters worse, Israel has been uncompromising on its arsenal of nuclear weapons, and Iran’s nuclear deal is on the brink of collapse. Delegates will have to debate a multitude of issues, from the definition of weapons of mass destruction, to the size of the zone, to measures ensuring the longevity of such an agreement.
To be announced!