In his latest Times op-ed entitled "Smart Drones" (http://www.nytimes.com/2013/03/17/opinion/sunday/keller-smart-drones.html?pagewanted=all&_r=0), Keller weighs the morality of "fully autonomous weapons" with a special focus on drones. Not surprisingly, Keller points an accusing finger at Israel:
"Israel is the first country to make and deploy (and sell, to China, India, South Korea and others) a weapon that can attack pre-emptively without a human in charge. The hovering drone called the Harpy is programmed to recognize and automatically divebomb any radar signal that is not in its database of 'friendlies.' No reported misfires so far, but suppose an adversary installs its antiaircraft radar on the roof of a hospital?"Israel's Harpy? Sorry, but it only carries a 70-pound high-explosive warhead, which will cripple a radar facility, but not bring down Keller's "hypothetical" hospital. Moreover, unbeknownst to Keller, the issue is anything but hypothetical:
- In its last two engagements with Israel, the Hamas leadership has hidden in the basement of Al-Shifa Hospital in Gaza.
- During Israel's Operation Pillar of Defense in November, the IDF attacked a Hamas intelligence headquarters in the Al-Showa Media Building in Gaza, where Agence France-Presse and Al Jazeera had offices.
- Over the past decade, most of the rockets and missiles fired at southern Israel from Gaza have been launched from heavily populated residential areas.
- Much of Hezbollah's arsenal of missiles for launch against Israel is housed in villages throughout southern Lebanon.
- In the past, Hezbollah has fired rockets at Israel from positions in close proximity to UN bases so as to avoid counterstrikes.
In a nutshell, neither Hamas nor Hezbollah has hesitated to use human shields, in violation of the Geneva Convention.
How did Israel's smart weapons fare in the asymmetric conflict of Operation Pillar of Defense? According to the Israel Defense Forces, 57 civilians and 120 Hamas and Islamic Jihad combatants were killed, a ratio of roughly one civilian for every two combatants. Compare that with the estimated three civilians killed for every combatant in Afghanistan and an estimated four-to-one ratio in Iraq.
Keller concludes his opinion piece by observing:
"If war is made to seem impersonal and safe, about as morally consequential as a video game, I worry that autonomous weapons deplete our humanity. As unsettling as the idea of robots’ becoming more like humans is the prospect that, in the process, we become more like robots."Keller worries "that autonomous weapons deplete our humanity," but consider how Hamas fired more than 10,000 unguided rockets and missiles at Israel's civilian population over the past decade. Was this somehow more humane? And on the subject of morality, where were the protests from Keller, executive editor of the Times from 2003 until 2011 and his op-ed pundits, while this was happening?
Bottom line: War is inhumane; however, Israel has demonstrated that smart weaponry can significantly reduce civilian casualties.
Dear Mr. Grossman,
As a strong supporter of a ban on autonomous weapons, and also a critic of Israeli policies as well as those of the United States and other countries, I nevertheless must agree with you that it is unnecessary and unfair to "point an accusing finger" at Israel as having been "the first country to make and deploy (and sell...)" a fully autonomous weapon system.
Whether Harpy is in fact the first such system is highly debatable and depends strongly on where you draw the boundaries. My view would be that it is simply not true that Harpy is uniquely the first, and singling it out could be interpreted as a gratuitous jab at Israel, even if Mr. Keller is hardly noted as anti-Israel.
I do believe, with many others, that Harpy is one example of an autonomous weapon that should be banned, but since such a ban is only now beginning to be discussed, Israel can hardly be cited for not abiding by it.
I also would not be so sanguine about the effects of a 70-lb warhead on a hospital, or on civilians in other scenarios where the weapon would not be able to determine their presence or distinguish them from combatants.
Harpy is not itself the problem, but only one example, a tip of the iceberg and not even the only tip. I have little doubt that Israel is one of the countries poised to lead in the development of "killer robots," and it is therefore important that Israelis and supporters of Israel not feel that the initiative to ban such weapons is targeted at them. It is not.