To the Editor:

Re “Robot Weapons: What’s the Harm?” (Op-Ed, Aug. 17): Jerry Kaplan argues that smart robotic weapons are likely to have far lower costs in lives and other resources than conventional weapons, and that we have a duty to secure the greatest safety possible for our troops in battle. Moreover, he contends, because of their accuracy, the use of robotic weapons would result in far fewer noncombatant casualties.

The assumption made by Mr. Kaplan and most others who write on the morality of war is that war is inevitable, so the safest and most effective weapons are also the most ethical to use.

For a small number of us, however, war is unjust, and none of the arguments even for a defensive war are sound. Thus, any weapons that make war easier to engage in are themselves unethical.

Perhaps wars should cost lives, perhaps wars should be morally complex, and perhaps we should look our “enemies” in the eyes before we attempt to take their lives. Perhaps we should confront the worst in ourselves if we are ever to develop the best in ourselves.

For those reasons, I oppose developing and deploying robotic weapons.

JOHN DOUARD

Montclair, N.J.

To the Editor:

Jerry Kaplan includes among his criteria for the use of robotic weaponry: “It has to be fully controllable.” But these smart machines will no doubt be reprogrammable, and thus hackable. Consider your July 24 Business Day article “The Web-Connected Car Is Cool, Until Hackers Cut Your Brakes.”

Technology is susceptible to the law of unintended consequences. We have repeatedly seen dumb weapons repurposed; it can be even worse with smart weapons.

RICHARD E. PATTIS

Irvine, Calif.

The writer teaches computer programming at the University of California, Irvine.