Permission marketing requires consumers' consent before a Web site can track them with cookies, send them marketing email, or sell their data to another company. Yet a study by Cyber Dialogue found that 69% of U.S. Internet users did not know they had given their consent to be included on email distribution lists. Here's how it's done: using the right combination of question framing and default answer, an online organization can almost guarantee it will get the consent of nearly every visitor to its site. And although lists of people who have supposedly opted in to permission marketing schemes are valuable sources of revenue for Web sites, high response rates alone do not mean these lists contain valuable customers.
We systematically explored the influence of question framing and response defaults on consumers' apparent privacy preferences in two online experiments detailed in [1]. The participants in these experiments were members of the Wharton Virtual Test Market, an online panel of over 30,000 Internet users representative of the U.S. Internet population. The results of our experiments highlight the need for all online consumers to pay close attention to what they agree to when they send responses to a Web site.
If consumers had fixed policies about the privacy of their data, then asking them to opt out or opt in to a Web site's privacy policy would make no difference to their answer. However, evidence suggests most consumers decide how much of their private information to release to a site on a case-by-case basis. The problem with making up your mind on the spot is that the answers you give are often influenced by the way questions are asked, as a long history of decision-making research shows [2]. We found that simply framing the question as an opt-out instead of an opt-in changes privacy preferences. Privacy policy questions are also often displayed with a "yes" response checked by default. Default answers take advantage of inattention, of cognitive and physical laziness, and of the tendency of decision-makers to treat the default as the standard of comparison, or as the popularly endorsed or correct answer [3]. We found that if marketers want most people to say "yes" to their privacy policy, all they have to do is make "yes" the response recorded when a consumer takes no action.
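To make the mechanism concrete, here is a minimal sketch in TypeScript (the names and structure are hypothetical, not the code of any particular site) of how question framing and the default state of a checkbox jointly determine what a visitor who takes no action is recorded as agreeing to.

```typescript
type Framing = "opt-in" | "opt-out";   // positive vs. negative phrasing of the same question

interface ConsentQuestion {
  framing: Framing;
  defaultChecked: boolean;             // state of the checkbox if the visitor never touches it
}

// Consent recorded for a visitor who takes no action at all.
function consentOnNoAction(q: ConsentQuestion): boolean {
  // Under opt-in framing a checked box means "contact me";
  // under opt-out framing an unchecked box means "contact me".
  return q.framing === "opt-in" ? q.defaultChecked : !q.defaultChecked;
}

// The same blank box records no consent under opt-in framing,
// but records consent from pure inaction under opt-out framing.
console.log(consentOnNoAction({ framing: "opt-in",  defaultChecked: false })); // false
console.log(consentOnNoAction({ framing: "opt-out", defaultChecked: false })); // true
```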
Figure 1 shows two variations of a question posed to 134 respondents in an online survey about whether they wanted to be contacted regarding health-care surveys. Both questions used the checkbox format commonly employed by Web sites when asking consumers whether they want to opt-in or opt-out of permission marketing schemes. Individual respondents saw only one version of the question, either asked positively (opt-in: "Notify me about more health surveys") or negatively (opt-out before data is used: "Do NOT notify me about more health surveys"), with the checkbox blank. Next to each question we've shown what the result of accepting the default answer to the question would be, and the percentage of respondents who saw that question and agreed to receive further email messages.
It is obvious the way the question is asked makes a substantial difference: the percentage of people agreeing to be contacted for future surveys is far from the same across the two questions. The opt-in question, where the no-action default is not to participate, produces a participation rate (48.2%) half that of the opt-out question (96.3%), where the no-action default is to participate. Interestingly, we achieved these effects even though our question was set in the same large typeface as the rest of a form on which our participants had to answer every question. Defaults and framing are likely to have even more impact when, as is often the case, the question is set in a miniature font, answering most questions is optional, or the implications of answering are buried in a large privacy policy document.
We also investigated the less commonly used radio-button input format, which allowed us to measure both options ("yes" and "no") for a question framed either positively (opt-in) or negatively (opt-out). It also allowed us greater flexibility when manipulating the default option. For example, we could set both options to be blank, so that neither agreement nor disagreement with the question could occur by default. We could then compare the rate of participation when people were given the option of doing nothing (accepting a "yes" or "no" answer checked by default) with the situation where people were forced to make an active response (the health questionnaire could not be submitted if both "yes" and "no" were left blank). Figure 2 lists the questions we asked a further 235 respondents and their resulting participation rates.
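As an illustration, the following TypeScript sketch (hypothetical names, not the code used in our study) shows the forced-response variant: with neither button preselected, the questionnaire cannot be submitted until the respondent actively answers, so no answer is ever recorded by default.

```typescript
type RadioAnswer = "yes" | "no" | null;   // null = neither radio button selected

interface SurveyForm {
  participateInFutureSurveys: RadioAnswer;
}

// Forced response: block submission while the privacy question is unanswered.
function canSubmit(form: SurveyForm): boolean {
  return form.participateInFutureSurveys !== null;
}

console.log(canSubmit({ participateInFutureSurveys: null })); // false: respondent must choose
console.log(canSubmit({ participateInFutureSurveys: "no" })); // true: an explicit answer, either way
```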
Again, the form of the question produced sizable differences in participation. Looking first at the more considered responses, where neither "yes" nor "no" was checked by default and respondents were forced to actively indicate their preference, we again see an effect of question framing. With the radio-button format, more people agree to participate under the positive (opt-in) framing than under the negative (opt-out) framing (88.5% vs. 70.8%). When no-action default responses are allowed, participation goes down about 20% when doing nothing results in no participation, for both the positive and negative framings (to 44.2% and 59.9%, respectively). In contrast, when doing nothing results in participation, participation increases by 6.1% for the negative framing, but by less than 1% for the positive framing (Figure 3).
Our experiments show the format of privacy questions can influence a consumer's apparent agreement with privacy policies. Opting in does not equal opting out, and answers are influenced by the default option. Our research has implications for the privacy regulation currently being considered in the U.S. and for the implementation of the opt-in policy stipulated by the European Union Data Privacy Directive (EUDPD). Regulation that genuinely aims to protect consumers from privacy infringement should also stipulate the form of the question asking for a consumer's consent. If the goal of policymakers and marketers is to separate interested from uninterested consumers, the best way of controlling the sizable effect of no-action defaults is to neutralize them as much as possible, that is, to use a radio-button format with no defaults. Preferably, no data collection or use should occur until a definite answer has been received from the consumer. Web sites could get answers immediately by forcing a response, as we did in our second study. The question of which frame is most appropriate is more difficult. No-action defaults did not substantially increase participation in our second study, suggesting that participating in future surveys (for which prizes would be awarded) was already very popular with our participants. For less popular outcomes, such as receiving marketing email, the frame used could artificially inflate participation.
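One way to implement this recommendation, sketched below in TypeScript with hypothetical names, is to treat a missing answer as "undecided" rather than silently substituting a default, and to gate any collection or use of the data on an explicit "yes."

```typescript
type ConsentState = "yes" | "no" | "undecided";

// Map the raw form value to a consent state without ever substituting a default.
function consentFromForm(answer: "yes" | "no" | null): ConsentState {
  return answer ?? "undecided";
}

// Only a definite, actively given "yes" authorizes collection or use of the data.
function mayUseData(consent: ConsentState): boolean {
  return consent === "yes";
}

console.log(mayUseData(consentFromForm(null)));  // false: no answer yet, so no data use
console.log(mayUseData(consentFromForm("yes"))); // true
```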
While this research does not conclusively identify the single best way of asking privacy questions, it does suggest better ways of doing it. More important, perhaps, it illustrates that how a question is asked matters. The opt-out policy likely to be introduced as a baseline privacy law in the next session of the U.S. Congress conflicts with the opt-in policy employed by the EUDPD. Our research shows this is not just a political difference, but one that will make a substantial difference to the number of people who participate in the kinds of activities (such as cross-selling, customization, and email marketing) that are expected to be the primary sources of e-commerce profitability.
1. Johnson, E.J., Bellman, S., and Lohse, G.L. Defaults, framing and privacy: Why opting in ≠ opting out. Working Paper (2000). Columbia Business School, Columbia University, New York, NY.
2. Kahneman, D. and Tversky, A. Choices, values, and frames. Amer. Psychol. 39, 4 (1984), 341-350.
3. Samuelson, W. and Zeckhauser, R. Status quo bias in decision making. J. Risk and Uncertainty 1, 1 (Mar. 1988), 7-59.
Figure 1. Checkbox format questions for participation in health surveys.
Figure 2. Radio-button format questions for participation in health surveys.
Figure 3. The influence of framing and no-action default settings on participation rate.