I think often of Ender's Game these days. In this award-winning 1985 science-fiction novel by Orson Scott Card (based on a 1977 short story with the same title), Ender is being trained at Battle School, an institution designed to turn young children into military commanders against an unspecified enemy (http://bit.ly/2hYQMDF). Ender's team engages in a series of computer-simulated battles, eventually destroying the enemy's planet, only to learn that the battles were very real and that a real planet had been destroyed.
I got involved in computing at age 16 because programming was fun. Later I discovered that developing algorithms was even more enjoyable. I found the combination of mathematical rigor and real-world applicability to be highly stimulating intellectually. The benefits of computing seemed intuitive to me then and now. I truly believe that computing yields tremendous societal benefits; for example, the life-saving potential of driverless cars is enormous!
Like Ender, however, I realized recently that computing is not a game—it is real—and it brings with it not only societal benefits, but also significant societal costs. Let me mention three examples. I have written previously about automation's adverse impact on working-class people—an impact that has already had profound political consequences—with further such impact expected as driving gets automated (http://bit.ly/2AdEv8A). It has also become clear that "frictionless sharing" on social media has given rise to the fake-news phenomenon, which is now widely accepted to have had a serious impact on both the 2016 U.K. Brexit referendum and the 2016 U.S. presidential election. Finally, a 2017 paper in Clinical Psychological Science attributes the recent rise in teen depression, suicide, and suicide attempts to the ascendance of the smartphone (http://bit.ly/2zianG5).
The recent recognition of the adverse societal consequences of computing has been accompanied by a dramatic drop in the public view of Tech, a term I use to refer both to computing technology and to the community that generates it. This decline is well exemplified by Peggy Noonan, a Wall Street Journal columnist, who, trying to explain (dubiously, IMHO) why Americans own so many guns, recently wrote: "Because all of their personal and financial information got hacked in the latest breach, because our country's real overlords are in Silicon Valley and appear to be moral Martians who operate on some weird new postmodern ethical wavelength. And they'll be the ones programming the robots that'll soon take all the jobs!"
The question I'd like to pose to us in Tech is this: we have created this technology, so what is our social responsibility? Of course, not all of us sit in Silicon Valley, and not all of us make product-deployment decisions. But much of the technology developed by high-tech corporations is based on academic research and is built by students educated in academic institutions. Whether you like it or not, if you are a computing professional, you are part of Tech!
Computer Professionals for Social Responsibility (CPSR), founded in the early 1980s, was an organization promoting the responsible use of computer technology. The triggering event was the Strategic Defense Initiative (SDI), a proposed missile-defense system intended to protect the U.S. from attack by ballistic strategic nuclear weapons. CPSR argued that we lacked the technology to develop software reliable enough for the purpose of SDI. Later, CPSR expanded its scope to other tech-related issues. The organization was dissolved in 2013 (see Wikipedia, http://bit.ly/2zvZsZb). With the benefit of hindsight, the issues that CPSR pursued in the 1980s appear remarkably prescient today.
One could argue that CPSR is no longer needed; there are now numerous organizations and movements focused on various aspects of the responsible use of technology. But our society is facing a plethora of new issues related to the societal impact of technology, and we, the people who are creating that technology, lack a coherent voice. ACM is involved in many of these organizations and movements, by itself or with others, for example, the ACM U.S. Public Policy Council, the ACM Europe Policy Committee, the ACM Code of Ethics and Professional Conduct, the Partnership on AI, and more. Yet these efforts are dispersed and lack coordination.
I believe ACM must be more active in addressing social responsibility issues raised by computing technology. An effort that serves as a central organizing and leadership force within ACM would bring coherence to ACM's various activities in this sphere, and would establish ACM as a leading voice on this important topic. With great power comes great responsibility. Technology is now one of the most powerful forces shaping society, and we are responsible for it!
I wholeheartedly agree with this simple, direct, and soberly argued statement.
But history has unfortunately shown that money dictates, regardless of the consequences.
This time, however, a form of singularity is drawing nigh.
I believe nonetheless that computing professionals have a duty of conscience and possess the required skills to address and contain the issues at hand, despite the challenges. Information networking with sentient (human or AI) nodes and multi-way connections is the nascent stage of this evolution. The context is, in effect, isomorphic to an information theory applied to the physics of information, as it were.
At any rate, this was a great note for me to read. I sincerely hope it will be read, discussed, and debated everywhere in our community and, by transitivity, throughout its wider social reach. Let us network this train of thought. Our future depends on it.
It seems entirely credible to hold that specialization, distinction, and difference are essential components of our human existence, from genetic DNA algorithms, through species development, evolution, and demise. Early systems scientists (Bertalanffy and others in the 1950s) formalized the growing awareness that the sciences and fields of human knowledge had become so specialized that communication across disciplines had become difficult, if not nearly impossible.
While this article has nice emotional appeal, it fails to adequately explore the context and overall system within which CS operates. Automation is eliminating jobs and putting us all through adjustment challenges, while political and social decisions are failing to adapt to technology and automation. It would be simple to adjust tax policies to incentivize the preservation and sustainment of human jobs and work, and to keep these in balance with, or optimized over, automation where it displaces or segregates living humans and citizens. The problem addressed here is a social and political problem. The ethical obligation for CS and technology is not to build systems that harm humans or the environment; when such things are done, they should be resisted. One way is for technologists to move on to political careers, so that science and technology can be introduced into our political processes instead of tribal values and the lowest of human instincts.