As a combat veteran and, more recently, an industry technologist and university professor, I have observed with concern the increasing automation, and with it the dehumanization, of warfare. Sarah Underwood's discussion of autonomous weapons in her news story "Potential and Peril" (June 2017) highlighted this trend and also reminded me of the current effort to update the ACM Code of Ethics, which says nothing about the responsibilities of ACM members in defense industries who build the software and hardware in weapons systems. Underwood wrote that understanding the limitations, dangers, and potential of autonomous and other warfare technologies must be a priority for those designing such systems, in order to minimize the "collateral damage" of civilian casualties and destruction of property and infrastructure.
Defense technologists must be aware of and follow appropriate ethical guidelines for creating and managing automated weapons systems of any kind. Removing human control and moral reasoning from weapons will not make wars less likely or less harmful to humans.