
Communications of the ACM

Viewpoint

When Technology Goes Awry


[Illustration: person's wrist handcuffed to mobile phone. Credit: Sararwut Jaimassiri]

I begin my book, Digital Minimalism,2 by quoting an essay by the journalist Andrew Sullivan. "An endless bombardment of news and gossip and images has rendered us manic information addicts," he wrote. "It broke me. It might break you, too."5

When I talk to people about their relationship with their digital devices, many report experiences that echo Sullivan's. They look at screens constantly: not just at work, but at home, with their children, in bed, even in the bathroom. Some jump from Hacker News, to email, then over to Twitter to share a take no one requested, then back to email. At best, this is needlessly distracting; at worst, it might break some of you, too.

So I wrote a book that attempted to untangle the forces that pushed so many people toward this place of diminished autonomy, and then to provide ideas about how we might reduce this bombardment of our attention. Given the Communications readership, however, it seems to me the details of what is in this book are less important than the question of why someone like me—a computer science professor who primarily studies the theory of distributed systems—is tackling these comparatively woollier, public-facing issues in the first place. My answer not only provides insight into my specific path, but more importantly underscores a critical need for engineers in general to get more involved in resolving the increasingly thorny issues generated at the intersection of technology and culture.

To better articulate my call to action for engineers, some brief historical background will prove useful. As an area of inquiry, the philosophy of technology has a long pedigree that stretches from Aristotle's Physics, through Francis Bacon's New Atlantis, to, much more recently, Kevin Kelly's What Technology Wants. The field seems to have coalesced into a more consistent area of inquiry around the late industrial revolution, and in the century and a half since, it has produced two competing approaches for understanding the role of tools in human affairs: technological determinism and technological instrumentalism. Roughly speaking, the former holds that the features and properties of a given technology can drive human behavior and culture in directions that are often unplanned and unforeseen, while the latter holds that tools are neutral, and that what matters in understanding their impact is the cultural context and motivations of the people who develop and use them for specific purposes.

The determinist philosophy received a lot of attention in the second half of the 20th century, when a loosely organized group of philosophers, historians, and critics, including Lewis Mumford, Jacques Ellul, Lynn White Jr., William Ogburn, and Neil Postman, published big-think idea books about the ways in which technology sparks surprising and powerful consequences. A famous example of this thinking is the historian Lynn White Jr.'s 1962 classic, Medieval Technology & Social Change,6 which argues that the arrival of the horse stirrup in medieval Europe accidentally sparked the rise of feudalism. (In case you are wondering how this connection works, it goes something like this: The stirrup made it possible to put armored knights on horses, as it kept knights in the saddle after they absorbed the blows of lance strikes; this new class of armored shock troops provided an immense warfare advantage that, once introduced, became necessary to maintain power, but they were also expensive and complicated to support; the division of land into feudal fiefdoms, each supporting a small number of knights, proved to be an efficient economic configuration to solve this problem.)

In recent years, however, the pendulum of power in the formal study of the philosophy of technology, especially within academia, has swung in favor of the technological instrumentalists. This shift is well captured by the rise to prominence of a theory known as the Social Construction of Technology (often abbreviated as SCOT), an instrumentalist philosophy that understands a technology's development and impact primarily from the perspective of the underlying social forces influencing the technologists. One of the most well-cited examples of this approach—in some sense, the constructivist response to Lynn White's armored knights standing in their stirrups—is a careful study by Trevor Pinch and Wiebe Bijker of the shifting cultural trends that helped the safety bicycle become more popular than the big-wheeled penny-farthing that preceded it.4 To the SCOT theorist, technology is not so interesting on its own: like the iron filings a physicist uses to reveal a magnetic field, it should mainly be observed to help highlight the underlying power dynamics these theorists believe matter more. (For a more nuanced take on these dual frameworks, I point the interested reader toward Doug Hill's excellent 2016 survey book, Not So Fast: Thinking Twice About Technology.)1

I am reviewing this split because I have come to believe the shift toward instrumentalism, though intellectually interesting and often quite illuminating, is ill-suited on its own to tackle some of the more pressing issues we face in our current moment of rapid technological innovation. As I will describe, to prevent the onslaught of technology (especially in computing) from diminishing our lives and culture, we should be willing in some circumstances to deploy a more determinist view of these tools—a move that will require engineers to get involved.

Engineers are instinctively skeptical of technological determinism. The idea of our tools acting autonomously from human intention seems suspiciously mystical, and given our love of optimization, there is great appeal in the instrumentalist notion that if a tool is impacting you negatively, it is because you are using it wrong. Based on my close study of these issues, however, I think we often hubristically overestimate our degree of control when dealing with certain innovations.


We should be willing in some circumstances to deploy a more deterministic view of these tools—a move that will require engineers to get involved.


To provide an illustrative example that I have written about before, consider the introduction of an internal email system at IBM in the early 1980s.a Because computing power was expensive, the team tasked with introducing this service first conducted a study to determine how much employees were already communicating through memos and phone calls, with the idea that the bulk of this messaging would move to email once it was introduced. Based on their findings, they provisioned a $10 million mainframe that should have had no trouble handling the expected load. Almost immediately, the mainframe overloaded.

"Thus—in a mere week or so—was gained and blown the potential productivity gain of email," joked Adrian Stone, an engineer who was part of the original IBM email team.b When I interviewed Stone about these events, he told me the mere presence of this new tool radically changed how people worked. Not only did they send more messages than they ever had before, they began cc'ing messages to many more people. Within days, the workflow at IBM had transformed from one of occasional messaging to constant communication.

The technological instrumentalist would try to find a social force that explains this change—some group, for example, that realized it could gain advantage by pushing for more frequent communication—but Stone remembers the shift in behavior as much more haphazard, and more recent research backs up this assessment. In her careful study of interactions at the Boston Consulting Group, for example, Harvard Business School professor Leslie Perlow documented a process she calls the "cycle of responsiveness": a culture of non-stop emailing emerged from an unstable feedback loop, in which fast responses engendered even faster responses, until the consultants blindly converged on a set of organizational norms for email that no one liked.3 When Perlow introduced new policies that tamed these norms, employee satisfaction and productivity, as measured by surveys, increased significantly.

This is a useful case study of technological determinism: the properties of low-friction digital communication destabilized the social dynamics surrounding communication, leading to a new style of work—ceaseless electronic chatter—that no one planned, and that ended up making employees less happy and less productive. When Perlow interviewed the consultants she was studying, they assumed that someone must have intentionally introduced the culture of hyper-connectivity under which they suffered, but as with the IBM example, no one had. The technology, in some sense, made the decision for them.
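For readers who prefer to see the mechanism in code, the following toy simulation is my own construction, not drawn from Perlow's study; it simply assumes that each employee periodically adjusts their email response-time norm partway toward the fastest reply they recently received. Under that single assumption, the group ratchets toward near-instant responses that no individual ever chose.

    # Toy model of the "cycle of responsiveness" (an illustrative assumption, not Perlow's data):
    # everyone starts with a comfortable response time; each round, each person observes a few
    # colleagues' replies and shifts their own norm halfway toward the fastest one they saw.
    import random

    random.seed(42)
    hours = [random.uniform(4.0, 24.0) for _ in range(20)]  # initial response norms, in hours

    for round_num in range(1, 11):
        for i in range(len(hours)):
            fastest_seen = min(random.sample(hours, 5))   # fastest reply observed this round
            hours[i] += 0.5 * (fastest_seen - hours[i])   # drift toward matching it
        print(f"round {round_num:2d}: mean norm = {sum(hours) / len(hours):5.2f}h, "
              f"fastest = {min(hours):5.2f}h")

No one in this model decides to answer email faster; each person only matches what they observe, yet the mean response norm collapses within a few rounds, which mirrors the unplanned convergence Perlow describes.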

To provide a more grandiose example, consider the impact of the social media "like" button. Facebook was the first major social media platform to add this feature. As the engineers who developed it reported in contemporaneous blog posts, their goal was to solve a simple technical problem. Many Facebook posts were attracting large numbers of comments that offered generic positive approval: "nice!", "great!", "beautiful!". The engineers worried these short comments were displacing longer, more interesting ones, so the "like" button was conceived as a way for users to signal basic approval without needing to leave a comment.

This simple optimization, however, generated an unexpected and profound effect: people began looking at their accounts much more than ever before.c The "like" button, it turns out, transformed the social media experience. In their original incarnation, these platforms provided an easy way for you to post things about yourself and occasionally check on things your friends posted. The "like" button added something new: an incoming stream of social approval indicators. Now you had a reason to keep tapping on the Facebook app throughout the day: to check in on this stream of evidence that other people are thinking about you—a reward that is significantly more appealing than simply catching up on your friends' activities. To make matters worse from the perspective of the user's attention, this stream of indicators is unpredictable: sometimes when you check you receive a lot of feedback, and sometimes you receive very little. As the behaviorists uncovered in their famed experiments with animals pressing levers to dispense food, this style of intermittent reinforcement fosters compulsion.


The "like" button added something new: an incoming stream of social approval indicators.


This small change helped spark a massive transformation not only of the social media experience but of our relationship with our smartphones. We used to check social media websites occasionally when bored and deployed our smartphones for specific uses, such as looking up directions or playing music while walking across town (I am ignoring here the early business power users who were already addicted to email on their BlackBerrys at this point—a different phenomenon). In the post-"like" world, our phones became constant companions that we check incessantly throughout the day, craving the next hit of reward as we become conditioned to fear any downtime. Though I am obviously eliding some other relevant details in this story,d it is reasonable to claim that, much like the horse stirrup accidentally sparking the rise of feudalism, a small tweak meant to improve the quality of social media comments significantly altered the daily routines of hundreds of millions of people.

We can now return to my proposal that engineers get more involved in our culture's ongoing struggle to react to technological change. In the examples here, tools that were introduced for narrow, often bland purposes—such as making memos more efficient or consolidating comments—ended up creating major impacts that caught many people off guard and did not necessarily serve their best interests. I call these impacts complex side effects, as they are often best understood through the lens of complex systems theory: the interaction between humans and machines is complex, and seemingly small changes, like eliminating the friction in intra-office communication through the introduction of email, can create large and hard-to-predict shifts in the system's behavior. My examples focus on my narrow area of expertise in the study of technology and culture: network systems and their impact on personal and professional productivity. These side effects, however, are relevant to many other topics within this general space, such as AI and automation, data privacy, and algorithmic bias—all subjects where new tools have the potential to create unexpected consequences.

Complex side effects are not well handled by the current academic emphasis on technological instrumentalism. When we view these impacts through the lens of social construction, we are either reduced to the role of the detached observer or face the daunting challenge of somehow re-engineering social dynamics, an effort that historically sways uneasily between condescension and authoritarianism.


I do not mean to disparage the contributions of existing social scientists thinking about technology and society.


When we instead adopt the perspective of technological determinism, these side effects are stripped of their implicative power and can become yet another aspect of performance that needs to be measured and addressed as needed. It is here that engineers have a role to play. We are the ones who build these systems, and once they are deployed, we evaluate them on factors such as their efficiency and security. When shortcomings are revealed, we iterate, either trying to improve the system or proposing a new approach. Complex side effects should be included in this iterative engineering process.

This applies to systems we directly help create. If you were an engineer on the IBM team that introduced internal email in the 1980s, the fact that your servers created wild and sudden swings in user behavior should have been just as much a concern as lagging performance or dropped packets. This approach also applies to systems created by others. The engineers who introduced the "like" button at Facebook would have had a difficult time trying to tame the excesses it instigated, as those excesses turned out to be highly profitable to their employer, but there was nothing stopping engineers outside of Facebook from highlighting the negatives of this complex side effect and suggesting alternative ways to build these systems. (Indeed, this is what former Google engineer Tristan Harris did when he appeared on 60 Minutes in 2017, held up a smartphone, and told Anderson Cooper: "This is a slot machine."e The nonprofit he subsequently co-founded, the Center for Humane Technology, proposes design principles that better respect user attention; see https://humanetech.com/.)

I do not mean to disparage the contributions of existing social scientists thinking about technology and society. However, given the accelerating rate and increasing impact of technological change, and the antipathy toward technological determinism in the fields that traditionally study these issues, engineers need to join this conversation. Our systems often create powerful complex side effects that are independent of specific human intentions, and we are particularly well situated to rapidly notice and address them. Meticulously researched SCOT analyses are not sufficient by themselves to tame the consequences of the momentous technological innovations that define our current moment.

To return to where I began this Viewpoint: my colleagues and mentors have often wondered why I maintain "two careers" as a writer and an engineer, but I no longer see it that way. Exploring complex side effects in my writing is as integral to my scientific obligation as proving theorems about these systems. To adapt the message Samuel Morse prophetically sent during his public introduction of the telegraph, engineers should keep asking, "What have we wrought?" and then add the crucial follow-up prompt: "And what should we do about it?"


References

1. Hill, D. Not So Fast: Thinking Twice About Technology. University of Georgia Press, Athens, GA, 2016.

2. Newport, C. Digital Minimalism: Choosing a Focused Life in a Noisy World. Portfolio, New York, 2019.

3. Perlow, L. Sleeping with Your Smartphone: How to Break the 24/7 Habit and Change the Way You Work. Harvard Business Review Press, Boston, MA, 2012.

4. Pinch, T. and Bijker, W. The social construction of facts and artifacts: Or how the sociology of science and the sociology of technology might benefit each other. In W.E. Bijker, T.P. Hughes, and T. Pinch, Eds., The Social Construction of Technological Systems: New Directions in the Sociology and History of Technology. MIT Press, Cambridge, MA, 1987, 17–50.

5. Sullivan, A. I used to be a human being. New York (Sept. 18, 2016); https://nym.ag/2UjArw6

6. White Jr., L. Medieval Technology & Social Change. Oxford University Press, London, 1962.


Author

Cal Newport ([email protected]) is Provost's Distinguished Professor in the Department of Computer Science at Georgetown University, Washington, D.C., USA.


Footnotes

a. I previously cited this example here: C. Newport, "A Modest Proposal: Eliminate Email," Harvard Business Review Online, February 18, 2016; https://bit.ly/33w0Uus

b. See Adrian Stone's response, posted June 27, 2014, in the following Quora thread: https://bit.ly/399Naac

c. For more on the ways in which the "like" button was developed and its consequences, I recommend the following two resources: Victor Luckerson, "The Rise of the Like Economy," The Ringer, February 15, 2017, https://bit.ly/33xL9Dy; and Adam Alter, Irresistible: The Rise of Addictive Technology and the Business of Keeping Us Hooked, Penguin Press, New York, 2017.

d. I provide a more detailed accounting of this transformation in Chapter 1 of Digital Minimalism. In this richer account, the "like" button helped Facebook learn the economic value of transforming its service into a source of social approval indicators, after which, in more instrumentalist fashion, the company invested heavily in optimizing this effect (a process called "attention engineering").

e. Tristan Harris, CBS "60 Minutes" interview with Anderson Cooper: https://cbsn.ws/2vzncip


Copyright held by author.