

BLOG@CACM

Tech Ethics at Work


Robin K. Hill, University of Wyoming

A couple of commentaries on Tech ethics in the latest issue of this publication, along with the call for comments on the latest version of the ACM Code of Ethics [ACM Code], support the drive not only to identify the ethical problems of Tech, but to address them. (Our reference to the scene as "Tech" follows Moshe Vardi [Vardi].) A browse through a few syllabi for tech-ethics courses reveals some impressive study plans and great references.

Do teaching and codifying ethics effectively improve Tech? Consider a new software engineer, Ardentia, thrilled with her gleaming new workspace, her new refreshments bar, and her clever new colleagues. She is well prepared by her Computing Ethics course to condemn lethal drones, explain trolley problems, expose identifiable data aggregation, and promote the goal of her profession as "computing for the public good."

She is assigned to a product that Marketing describes as "a full suite of data-driven value-added diversity apps for college admissions." Her first task is a function that returns the value SUCCPOT, derived from an attribute in the applicant database, current-college, as follows: if it is on the list of Ivy League institutions, return the value High; if it has the word "university" in its name, return Medium; otherwise return Low. She learns that SUCCPOT stands for "success potential." After a few days of coding confusion as she tries to find the other factors that determine success potential, it turns out that current-college is the only criterion examined.
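In rough Python, under the hypothetical names used here (succpot, current_college, and an Ivy League list supplied from elsewhere), the rule she is asked to implement amounts to no more than this sketch:

def succpot(current_college, ivy_league_names):
    # "Success potential," derived solely from the applicant's current college.
    if current_college in ivy_league_names:
        return "High"
    if "university" in current_college.lower():
        return "Medium"
    return "Low"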

Ardentia:

What? That's not fair.

Possible responses:

...?

This is for advisory purposes, remember? Says so right there.

Ha, ha. Good one. At least they can't see the algorithm.

Well, it finds good students while cutting staff time, and that's what those offices want.

You know HiTechJinx is announcing their own academic recommender suite? Really stinks.

Let's shoot that thought up to the corporate Community and Citizenship group in Canada.

Everybody knows that business ethics problems don't have real solutions.

Super! If you can code and test better selection by next Monday, we'll commit it.

Deep learning is in the plan, as soon as we get our hands on some data.

Wow, you're right. We won't do it. Thanks. Here's a promotion.

Those of you with experience may agree that the likelihood of the last option is negligible. Ardentia discovers that the Community and Citizenship Concern process is an online form that offers "Subject:" choices of "security," "privacy," "intellectual property," and "customer misunderstanding." Even if Ardentia could put a label on her objection, she lacks the words that will compel her company's attention to an issue it ducks. The robust ethical dialectic that she had trusted would emerge, does not.

Later on she is dismayed to find out that the next version aims at college admissions offices known to perform conscientious and thorough evaluation, and pushes those offices to exploit their reputations via customization services that quantify their criteria for program implementation: "High regard means high return! Automate your value proposition!"1

We can talk all day about the right thing to do, but that doesn't help Ardentia. She did talk all day about the right thing to do, back in her Computing Ethics course. But the mode of so doing was not clearly defined. Specifics of time, place, agent, and delivery depend on the situation; they are the details where the devil lives. Ardentia would have been helped, at any rate, by foregrounding the difficulties in realistic scenarios.

Computing ethics programs play out as awareness followed by assessment, stopping short of what comes after assessment. Awareness, we trust, will foster better recognition and recourse. But that acknowledges that the current recourse is not enough. How can realistic ethical training and guidance for professional computer scientists take the next step, into actual mechanisms, actual modes of so doing? What are the topics to address?

Maybe high-tech firms could voluntarily exercise oversight through some respected collaborative body. From what would this body derive its authority? What would be the ethical oversight criteria? How would such oversight be instituted, granted, monitored? Maybe some social impact assessment like an Environmental Impact Statement could be routinely filed before rollout of new products. Who would write that statement, and what decision-making process would use it? What are the mechanisms of certification in other professions, and do any of them apply? What are the medium- and long-term dangers that Tech poses to the public good, as best we understand them, and how will that assessment be revised when the need arises? And what are the prospects for voluntary compliance with self-governance in various forms?

If the only answer is government regulation, participants should have opportunities to study that area. Who should formulate regulations? How can they be enforced? These questions lead out of philosophical ethics and into more pragmatic studies. Computing ethics classes and codes would be strengthened by paths to political science and economics, an integration of subjects manifest in the traditional academic Philosophy, Politics, and Economics discipline initiated at Oxford a century ago, but perhaps not the rage in modern Tech (a lowly status shared by its milieu, civil service).

The ACM Code of Ethics reads as guidance for autonomous professionals, which is quite useful, but the consultant's perspective offers an employee only a stay-or-go choice. Certainly, the resolution always available to Ardentia is flat refusal to abet what she sees as immoral, a point also made in a letter to the editor in the January 2018 CACM by N. Poor [Poor]. Quitting is the extreme option. High-tech ethics programs, however, seem to aim at constructive action short of that. We must address that need and reject the expectation that Tech will be purified simply by the introduction of ethics courses. Review the ACM Code of Ethics at https://code2018.acm.org/discuss and contribute your own thoughts to the discussion.

1. The author is making this all up, and knows of no such software or enterprise.

References

[ACM Code] ACM Code of Ethics. 2018. https://ethics.acm.org.

[Poor] Poor, Nathaniel. 2018. "I Am Not a Number." Letter to the Editor, CACM 61:1 (January 2018), page 10.

[Vardi] Vardi, Moshe. 2018. Computer Professionals for Social Responsibility. CACM 61:1 (January 2018), page 9. DOI: 10.1145/3168007.

 

Robin K. Hill is adjunct professor in the Department of Philosophy, and in the Wyoming Institute for Humanities Research, of the University of Wyoming. She has been a member of ACM since 1978.


 
