Persuasion has always been part of the human experience. From ballads to bible stories, parents to personal trainers, people have always sought to influence others' attitudes and behaviors. Although many of us resist the idea of being persuaded, most of us seek skilled persuaders for ourselves and our significant others.
Can computers persuade? Yes, they can. And like the human persuaders in our lives, persuasive computing technologies can bring about constructive changes in many domains, including health, safety, and education. In the process, computers can help us improve ourselves, our communities, and our society. But persuasive computers can also be used for destructive purposes; the dark side of changing attitudes and behaviors leads toward manipulation and coercion.
In order to achieve the potential and avoid the pitfalls of persuasive computing, a small but growing group of ACM members has been exploring the theory, design, and analysis of computers as persuasive technologies, an area we call "captology" (based on an acronym derived from Computers As Persuasive Technologies; see www.captology.org).
To be sure, not all technologies are persuasive; in fact, only a small subset of today's computing technologies fit this category (see Figure 1). As we see it, a persuasive computing technology is a computing system, device, or application intentionally designed to change a person's attitudes or behavior in a predetermined way. This point about intentionality may be subtle but is not trivial. Intentionality distinguishes between a technology's side effect and its planned effect. Captology focuses on the planned persuasive effects of computer technologies.
As you'll see in this special section, examples of persuasive technologies include a computerized doll designed to motivate responsible sexual behavior, a CD-ROM that persuades kids to eat fruits and vegetables, and a virtual social environment that increases safety by motivating responsible drinking. One thing to note from these and other examples is that persuasive computers function in three basic ways: as tools, as media, or as social actors, each affording different pathways to persuasion (see the sidebar below).
Because the study of computers as persuasive technologies is such a new endeavor, many key questions remain unanswered.
During the past few years, captologists in universities and industry have increased our knowledge of key issues in this area, but one thing is clear to those of us close to the domain: Not only do we, as a scientific community, need to understand more about the persuasive technologies that already exist; we also need more insight into what could exist and, perhaps more important, what should exist. This special section is a step toward answering these questions and inviting others into the discussion.
Providing a backdrop for the subsequent articles, King et al. describe and analyze the persuasive interactive technologies that already exist, including applications, users, form factors, and strategies. Their article reviews the current landscape of persuasive technologies, offering glimpses of what's coming just over the technological horizon, as well as several promising commercial applications that have already found a market.
The article by Tseng et al. focuses on issues of credibility as they apply to computing systems, defining credibility and outlining its importance in computing systems. Surprisingly, there is little public research on computers and credibility. To raise awareness and inspire additional work, this article suggests new frameworks for understanding the dynamics of computer credibility.
What follows is perhaps the most controversial article in the section, addressing "seductive computing." Not only is seduction a controversial type of persuasion, but Khaslavsky et al. push the limits of scientific tradition by drawing on personal insights from industrial design and popular culture to detail the potential for computing experiences that seduce.
The section concludes with an examination of the ethics of persuasive technologies. Berdichevsky et al. first lay the foundation for discussing ethics in this domain, then boldly articulate their guiding principles for designing ethical persuasive technologies.
I don't expect readers to agree with all the ideas put forth here, but I hope these articles provoke and inspire you to raise key questions and discuss how they relate to these technologies. Whether or not we address these questions, we will soon see more examples, good and bad, of computers designed to change human attitudes and behaviors. Increasingly, we will see computers in new roles: motivating health behaviors, promoting safety, encouraging eco-friendly behavior, and selling products and services. Still other persuasive technologies will emerge in areas we can't yet predict.
This forecast may sound like bad news: a world full of inescapable computer technology constantly prodding and provoking us. While such a technological environment could develop, in most of the important cases we'll choose the technologies we want to persuade us, just as we choose a personal trainer at the gym or a tutor for our children. And even though certain types of persuasive technologies will be imposed upon us, we will learn to recognize and respond appropriately to their persuasive appeal. In extreme cases we, as an ACM community, will need to help create the public policy that influences the design and uses of computers as persuasive technologies.
But to effectively shape the future landscape of persuasive technologies, we first have to educate ourselves and others about the related potential and pitfalls. By understanding persuasive computing, designing responsible computing technologies, and discussing and acting on ethical issues in this domain, we create for ourselves the opportunity to leverage the power of persuasive computing to improve our lives, our communities, and our society.