This column aims to build on and extend the field's understandings of the nature of ethics and equity in computing. Specifically, we argue that issues related to systems of power, which are often absent from conversations around ethics in computing, must be brought to the foreground in K-16 computing education. To this end, we argue for a justice-centered pedagogy5 that centers power by explicitly acknowledging the ethical and political dimensions of computation and builds learning conditions so that everyone—including, but not limited to, students on computer science (CS) or engineering pathways—can understand, analyze, critique, and reimagine the technologies that shape everyday lives.
A power-conscious approach to ethics in computing highlights the sociopolitical and sociocultural contexts in which technologies are developed and deployed. To respond to the highly complex sociotechnical problems of the 21st century and beyond, future computer scientists and engineers need educational opportunities that prepare them to understand and care about the far-reaching ethical and sociopolitical implications of new technologies. Yet we must also fundamentally rethink who computing education is for. Serious efforts should be made at the K-12 and undergraduate levels to make available to all the knowledge, skills, and tools needed to critically examine the relationships among power, ethics, and technology. Given rapidly evolving innovations and contexts of computing, we argue for two changes in our approach to ethics and equity in K-16 computing education: centering power in how ethics is taught and learned, and rethinking who computing education is for.
In recent years, the role of equity in CS education has increasingly become a topic of discussion. Much of this dialogue has centered around the creation of inclusive learning environments in computing, particularly with regard to marginalized students and their communities.3 Yet, often missing from these well-intentioned conversations has been a robust consideration of equity in CS as it pertains to issues of ethics and power. In particular, the ways in which computational tools and technologies have multiple, complex, and profound implications for the lived experiences of nondominant communities have been largely ignored (for example, how machine learning is changing law enforcement practices in communities of color, how automation technologies are reshaping welfare eligibility,1 or how commercial search engines reinforce racist and sexist bias).4 Leaving these power imbalances unexamined precludes deep engagement with issues of equity. In our view, because these complicated interactions of technologies and society shape how nondominant groups experience and negotiate daily life and broader social systems, substantive discussions of equity in CS must intentionally include dynamics of power and ethics.
While there have been a number of important calls and initiatives to integrate ethics into computing education, the tendency has been to ignore how ethics are situated within larger political and ideological contexts. As a result, discussions of ethics are primarily framed as a matter of personal choice and responsibility. For example, the current ACM Code of Ethics and Professional Conduct notes principles such as "Be honest and trustworthy" and "Know and respect existing rules pertaining to professional work." We have no bone to pick with universally accepted traits such as honesty and respect, but we contend that organizing discussions of ethics around the good or bad decisions/values of individual actors obscures more complex interactions between ethics and technology.
Moreover, an honest assessment of ethical behavior (for individuals as well as systems) must include analysis of how people's behaviors contribute to, resist, or otherwise intersect with structures of inequality and hierarchy in society. For example, say an engineer works at a firm where she is instructed to write code for handheld and helmet-mounted imaging systems designed for the military. The engineer does her job faithfully as an honest, hard-working employee. Her code is elegant, original, and well documented. Yet, by helping to produce this slick and sophisticated technology, she also contributes to the project of militarism around the world. Is she acting ethically? Or we might ask: How do broader ethical and ideological values guide innovation in companies like the one this engineer works for? Does the current and emerging landscape of new technologies (and the institutions and industries creating these technologies) collectively contribute to a more just and ethical society? Centering power in discussions of ethics does not mean answers to these questions are provided for students, but it does mean opportunities are intentionally created for students to discuss, debate, and analyze what others have called the "macro-ethics" of technological systems.2
A focus on power entails providing opportunities for students to decode how computational systems, which we define as coordinated networks of digital tools and devices (for example, the Internet, blockchain technology, surveillance systems), intersect with and are intertwined with sociopolitical systems (for example, racism, neoliberalism, militarism, the U.S. immigration system). Decoding requires careful study of these different systems and the ways in which they interact. An unprecedented level of public debate has recently underscored the urgency of attending to these intersections in discussions of ethics and computing. How does racial bias shape artificial intelligence (AI) algorithms? How do theoretical advances in cryptography lay the foundation for mass surveillance? Why are engineers at Google and Microsoft raising concerns about their companies' entanglements with the Pentagon and Immigration and Customs Enforcement (ICE)? Addressing these highly complex questions requires a deeper understanding of how these technological systems interact with sociopolitical systems. For example, exploring racial bias in AI algorithms demands an understanding of visual cognition systems and systems of race and hierarchy. Developing a moral stance on war-related technologies, and evaluating those of others, requires understanding not just how technologies may be used for unethical purposes, but also how the politics of war and empire shape the technologies that are developed in the first place. These are fraught intersections, where ethical dilemmas arise and thrive; where technology and society collide to simultaneously create challenges and opportunities for education and social action.
Focusing on power in discussions of computing and ethics foregrounds justice and equity, and is thus a critical practice that can benefit all members of society. Democratic societies are shaped, filtered, enhanced, and circumscribed by computing technologies and the algorithms driving them, yet these interactions between society and technology are often difficult to discern. Full social and political participation hinges on the ability to perceive and interrogate these interactions. Today's and tomorrow's civically engaged actors must have access to technology and opportunities to develop technical skills, but they must also possess the knowledge, conceptual frameworks, and vocabularies to make sense of these interactions and to vote, protest, design, and advocate for socially desirable configurations of technology and society. Centering power in discussions of ethics prepares people to recognize how various forms of injustice may be contested or reproduced in the interactions between technology and society.
Engaging the ethics and politics of computing demands an unprecedented and vigorous transdisciplinary dialogue between CS and the social sciences and humanities. Computer science instructors will need to move beyond decontextualized modules on ethics or individual courses on social impact that deemphasize moral and political questions. Universities will need to create learning pathways where students gain knowledge and skills to build the technologies of the future as they simultaneously develop the sensibilities and intellectual integrity to question, modify, or reimagine these technologies.
Toward these ends, there are encouraging cross-disciplinary developments on the horizon that the field should support and continue to foster. Several universities with highly ranked CS programs are expanding CS learning opportunities in interesting ways (for instance, Northwestern's joint Ph.D. program in Computer Science and the Learning Sciences, and the new interdisciplinary College of Computing at MIT). The digital social sciences and humanities have started to examine the intersections of computational tools and methods in fields such as history, literature, film studies, political science, philosophy, and sociology. Liberal arts colleges are beginning to introduce technology requirements and offer specializations in areas such as artificial intelligence and data science. Much of this work aims to unite computational and humanistic questions in novel ways and inspire new ways of seeing and thinking about computation and its place in our society and lives. In middle and secondary computer science education, however, ethical and political dimensions of computing tend to be sidelined, including within introductory courses such as Exploring Computer Science (ECS) or CS Principles.5 A pedagogical focus on power and ethics in K-12 CS education has the exciting potential to forge new disciplinary bridges between the goals and practices of CS and parallel efforts to engage youth in civics and social justice. Additionally, intentionally broadening the intellectual and social purposes of CS could invite a wider range of student identities into the field.
For computing education as a field, rethinking ethics and equity in the ways called for here will undoubtedly require a hard (and perhaps uncomfortable) epistemological and pedagogical pivot. We would do well, though, to remember a rich intellectual history of thinkers in our field who have laid a foundation upon which we may build. For instance, mathematician, philosopher, and pacifist Norbert Wiener advanced a view of ethics rooted in the fundamental relationships between science and power. Especially in his later writings, he urged the field to take seriously how machines might alter society in ways that challenge the very meaning of human life.6 More recently, Jeannette Wing's contention that computational thinking is "a universally applicable attitude and skill set [that] everyone, not just computer scientists" can learn and use7 helped spark an enduring debate about computation's transdisciplinarity and its untapped potential to inspire new ways of seeing the world. We see much value in these early formulations, particularly with regard to their emphasis on the power of computing to transform society. Highlighting power as a conceptual and pedagogical approach locates learning about computing within a justice frame that both complements and challenges previously articulated visions for computing education.
Robust understandings of power, ethics, equity, technologies, and society—as called for in this column—are key for the design of future tools and artifacts rooted in deep notions of the public good and social welfare. Future generations must possess the ability to critically analyze the affordances and constraints of technological advancement, as well as the moral imagination and technical skill to create with compassion and ethical integrity.
1. Eubanks, V. Automating Inequality: How High-Tech Tools Profile, Police, and Punish the Poor. St. Martin's Press, New York, NY, 2018.
2. Herkert, J.R. Ways of thinking about and teaching ethical problem solving: Microethics and macroethics in engineering. Science and Engineering Ethics 11, 3 (Mar. 2005), 373–385.
3. Margolis, J. Stuck in the Shallow End: Education, Race, and Computing. MIT Press, Cambridge, MA, 2010.
4. Noble, S.U. Algorithms of Oppression: How Search Engines Reinforce Racism. NYU Press, New York, NY, 2018.
5. Vakil, S. Ethics, identity, and political vision: Toward a justice-centered approach to equity in computer science education. Harvard Educational Review 88, 1 (Jan. 2018), 26–52.
6. Wiener, N. Some moral and technical consequences of automation. Science 131, 3410 (May 1960), 1355–1358.
7. Wing, J.M. Computational thinking. Commun. ACM 49, 3 (Mar. 2006), 33–35.