
Communications of the ACM

Privacy

Metrics for Success: Why and How to Evaluate Privacy Choice Usability


[Article illustration: figures in work-related tasks or settings. Credit: Visual Generation]

Privacy regulations around the world frequently require websites and apps to obtain informed consent from users before collecting, processing, or sharing their personal information, or to provide easy opportunities for users to opt out of certain uses of their data. This has led to a proliferation of privacy choice and consent interfaces, many of which provide consent opportunities that are hardly informed, frequently difficult to find and use,5 and all too often deceptive.7

Examples of bad privacy choice and consent mechanisms are easy to find, and figure prominently in recent regulatory actions. Cookie consent banners frequently nudge users to accept all cookies by making that choice most prominent and requiring users to follow a link to a secondary interface if they want to take any other action.2 In December 2022, the U.S. Federal Trade Commission secured two settlements with Fortnite video game creator Epic Games, totaling $520 million in fines and refunds due to a number of violations, including some related to deceptive interface design. Among other problems, the FTC explained, "Fortnite's counterintuitive, inconsistent, and confusing button configuration led players to incur unwanted charges based on the press of a single button."3

Privacy choice mechanisms are understandably not a high priority for companies, and in some cases companies may even have an incentive to make them difficult to find and use so that users are more likely to consent to the use of their data and less likely to opt out. However, choice and consent mechanisms with poor usability signal that a company cares little about its customers and their privacy, and they increasingly may violate the law.


A Need for Usable Privacy Choice Metrics

While usability metrics and evaluation methods abound, there are no standard approaches to assessing the usability of privacy choice and consent mechanisms. Many common metrics and evaluation methods are applicable here, but it is important to keep in mind that when users interact with privacy choice interfaces, they typically do so as part of a secondary task. For example, users may sign up for a social media platform with the goal of keeping in touch with their friends. When presented with a privacy choice interface during the onboarding process, they may face conflicting goals: get through the onboarding quickly and start chatting with their friends, or take the time to review privacy choices carefully and configure their settings to match their privacy preferences. Interfaces that take the time to explain privacy choices and make sure users understand them may be great for privacy, but they may have a negative impact on the usability of the platform if they lengthen the onboarding process. Thus, in some cases there may be a tension between improving the usability of choice interfaces and overall system usability. Choice interface designers may also need to overcome habituation from similar privacy choices users have encountered on other platforms, as well as privacy interface fatigue.




Organizations often employ user engagement as a proxy for usability; the assumption is that if you make a user interface change and more users engage with your services, then the change must have made your website more usable, or at least in some way more appealing or satisfying to users. However, if you change your consent interface and more users consent, does that mean the interface has improved and users are more satisfied? Or does it mean you tricked more users into consenting to something they did not really want? It is difficult to know unless you can get to the bottom of what users actually understood and intended.
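
As a rough illustration of why raw consent rates make a poor usability metric, the sketch below (in Python, with entirely hypothetical field names and threshold) contrasts a raw consent rate with a stricter rate that also requires users to answer comprehension questions correctly. A change that raises the first number while lowering the second has likely nudged users rather than helped them.

```python
# Illustrative sketch only: contrasts a raw consent rate with an "informed
# consent" rate that also requires a passing comprehension score.
# The session fields and the passing threshold are assumptions, not a standard.

def consent_rates(sessions, pass_threshold=3):
    """sessions: list of dicts with hypothetical keys
    'consented' (bool) and 'quiz_correct' (int, questions answered correctly)."""
    total = len(sessions)
    consented = sum(1 for s in sessions if s["consented"])
    informed = sum(1 for s in sessions
                   if s["consented"] and s["quiz_correct"] >= pass_threshold)
    return {
        "consent_rate": consented / total,
        "informed_consent_rate": informed / total,
    }


if __name__ == "__main__":
    example = [
        {"consented": True, "quiz_correct": 4},
        {"consented": True, "quiz_correct": 1},   # consented but misunderstood
        {"consented": False, "quiz_correct": 3},
    ]
    print(consent_rates(example))
    # approximately {'consent_rate': 0.67, 'informed_consent_rate': 0.33}
```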

In our research at Carnegie Mellon University, we have been studying the usability of privacy choice and consent interfaces for many years. To take a step toward a more systematic evaluation approach, we set out to define a set of relevant usability metrics and to develop a framework of methods for assessing them.4

We reviewed usability definitions in both the human-computer interaction and privacy literature and identified seven requirements for usable consent interfaces.6 Briefly, a usable consent interface should:

  • address user needs;
  • require minimal user effort;
  • make users aware of what choices exist and where to find them;
  • convey choices and their implications so users understand them easily;
  • satisfy users and engender trust;
  • allow users to change a decision, whether to correct an error or because they changed their mind; and
  • avoid nudging users toward less privacy-protective options.

Digging deeper into these seven requirements, we derived 27 evaluation criteria. For example, to determine whether an interface conveys choices and their implications so users understand them easily, we identified four comprehension-related criteria. We can ask users to interact with a consent interface as they normally would and then, when they are done, quiz them on factual questions about the choices and their implications (unfocused attention). Because normal interactions with consent interfaces are often hurried, some users may hold misconceptions simply because they did not pay much attention. We might therefore also encourage users to read the information on the interface carefully and then quiz them to see whether the information is understandable to users who actually take the time to read it (focused attention). We might also ask users how easy or difficult it was to understand the choices or to identify aspects they found easy or difficult (perceived effort). Finally, experts might review the interface and use heuristics to estimate how difficult it would be for users of various types to understand (estimated effort).
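
As a hedged illustration of how comprehension data from such quizzes might be summarized, the sketch below groups quiz scores and perceived-effort ratings by attention condition. The condition labels, the number of quiz questions, and the five-point effort scale are assumptions made for this example rather than part of our framework.

```python
# Minimal sketch: summarizing comprehension-quiz results per attention condition.
# Condition labels, the quiz length, and the 1-5 effort scale are illustrative
# assumptions.
from statistics import mean

def summarize_comprehension(responses):
    """responses: list of dicts with hypothetical keys
    'condition' ('unfocused' or 'focused'),
    'correct' (questions answered correctly, e.g., out of 4),
    'perceived_effort' (1 = very easy ... 5 = very difficult)."""
    by_condition = {}
    for r in responses:
        by_condition.setdefault(r["condition"], []).append(r)
    return {
        cond: {
            "n": len(group),
            "mean_correct": mean(r["correct"] for r in group),
            "mean_perceived_effort": mean(r["perceived_effort"] for r in group),
        }
        for cond, group in by_condition.items()
    }
```

A large gap between the focused and unfocused groups suggests an interface that is understandable in principle but loses users in practice.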


Methods for Evaluating Privacy Choice Usability

We looked for examples of how usability requirements had been assessed in prior usable privacy studies and derived guidelines about which assessment methods would be most useful. Looking at the comprehension criteria again, we see that each criterion provides a different type of insight. Estimated effort assessments by experts can be useful for quickly finding particularly problematic interfaces without the time and expense of a user study. Focused attention assessments with users can help measure comprehension in a best-case scenario; if users do not understand their choices when they are actually paying attention, there is little hope they will understand when they are rushing to get past a consent interface. These types of studies can often be done quickly and inexpensively, for example, with online crowd workers. Unfocused attention and perceived effort assessments with users in a real or realistic context provide insights on what actually happens in practice, although they sometimes reveal that even when an organization makes a concerted effort to communicate, it is difficult to overcome users' prior misconceptions.8 Depending on context, these studies may require more resources to conduct.
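
For example, a small focused-attention study comparing two candidate consent interfaces might be analyzed along the lines sketched below. The variant data are invented, and the choice of a Mann-Whitney U test is one reasonable option for small samples of ordinal quiz scores, not a prescribed procedure.

```python
# Illustrative analysis sketch for a small focused-attention study comparing
# comprehension scores on two hypothetical consent-interface variants ("A", "B").
from scipy.stats import mannwhitneyu

scores_a = [4, 3, 4, 2, 4, 3, 4, 4]  # questions correct per participant, variant A
scores_b = [2, 1, 3, 2, 2, 3, 1, 2]  # questions correct per participant, variant B

stat, p_value = mannwhitneyu(scores_a, scores_b, alternative="two-sided")
print(f"U = {stat}, p = {p_value:.3f}")
# A small p-value suggests the comprehension difference between variants is
# unlikely to be due to chance alone (with the usual caveats about sample size).
```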




As our discussion of comprehension criteria suggests, there are many methods that might be used to assess usable consent criteria, including expert evaluation and user studies of various forms.9 The time and resources required can vary considerably depending on method and scale of study. While resource-constrained organizations might dismiss the idea of conducting a usability assessment, even a small assessment can prove useful for spotting some of the most egregious problems and improving the consent experience. A variety of factors may be considered when selecting an assessment strategy, including available time and resources, development stage (are you evaluating an already deployed interface or one still in the early design phase?), and whether the privacy choice interface is interruptive (it pops up and interrupts the user's task) or on-demand (users have to find it when they want to use it).

Assessments involving user studies can be grouped into three categories: those with no assigned task, those in which participants are assigned a privacy task, and those in which participants are assigned a distraction task.

Studies with no assigned tasks can be further categorized into self-reported and observational. Self-reported studies include interviews or surveys about participants' privacy-related perceptions. Observational studies rely on observing users (generally through instrumented interfaces and web logs) as they interact with deployed privacy choice interfaces, sometimes as part of an A/B test. While interviews and surveys can provide rich information about users' needs and perceptions, observations provide insights into what users actually do but may not explain why unless paired with an exit interview or survey.
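
As a rough sketch of the observational approach, an instrumented consent interface typically emits per-interaction events that can be aggregated by A/B variant; the event schema below is hypothetical.

```python
# Rough sketch of aggregating logged consent-interface events from an A/B test.
# The event schema (variant, action, seconds_to_decision) is hypothetical.
from collections import defaultdict
from statistics import median

def summarize_ab_events(events):
    """events: list of dicts with hypothetical keys
    'variant' (e.g., 'control' or 'redesign'),
    'action' ('accept_all', 'reject_all', 'customize', 'dismiss'),
    'seconds_to_decision' (float)."""
    by_variant = defaultdict(list)
    for e in events:
        by_variant[e["variant"]].append(e)
    summary = {}
    for variant, evs in by_variant.items():
        n = len(evs)
        actions = defaultdict(int)
        for e in evs:
            actions[e["action"]] += 1
        summary[variant] = {
            "n": n,
            "action_rates": {a: c / n for a, c in actions.items()},
            "median_seconds_to_decision": median(e["seconds_to_decision"] for e in evs),
        }
    return summary
```

Without a paired exit survey or interview, such logs show what users did but not whether the outcome matched what they understood or intended.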




Studies with assigned privacy tasks allow for observation of how users will interact with a privacy choice interface in a hypothetical scenario. Participants may be given a scenario and asked to exercise a particular privacy choice (for example, figure out how to opt-out of marketing email from this company), or asked to make a personal privacy choice (for example, configure the social media platform's privacy settings according to your personal preferences). A privacy task study can help assess usability for users who are specifically seeking out a privacy interface, but may not be helpful for assessing interruptive choice interfaces or assessing awareness of choices. Instead, studies may include distraction tasks to focus user attention on a typical primary task (for example, making an online purchase, creating a social media post, playing a video game) and observe what happens when a privacy choice prompt interrupts the task or whether users notice the availability of privacy settings that might be relevant to the task.
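
The sketch below condenses these study types into a simple selection helper. The goal labels and the mapping itself are our own simplification for illustration and would need to be adapted to the questions a particular evaluation is trying to answer.

```python
# Simplified sketch mapping evaluation goals to the study types described above.
# The goal labels and the mapping itself are illustrative assumptions.

def suggest_study_type(interface_style, question):
    """interface_style: 'interruptive' or 'on_demand'
    question: 'findability', 'comprehension', 'awareness', or 'real_world_behavior'"""
    if question == "real_world_behavior":
        return "no-task observation of the deployed interface (e.g., A/B test)"
    if question == "awareness" or interface_style == "interruptive":
        return "distraction-task study (observe reactions to the prompt mid-task)"
    if question == "findability":
        return "assigned privacy-task study (e.g., 'opt out of marketing email')"
    return "focused-attention survey or interview with comprehension questions"


print(suggest_study_type("on_demand", "findability"))
```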


Put Privacy Choice Interfaces to the Test

We have talked with industry practitioners who are interested in improving their organizations' privacy choice interfaces but are unsure how to pick the most usable interface. We have also talked with regulators who wonder how they should assess whether companies are complying with usability requirements. While there is not a recognized industry standard in this space, academic researchers have been conducting and publishing user studies of privacy choice interfaces for over a decade. Our privacy choice evaluation framework builds on a review of prior work and offers a roadmap for organizations that want to assess their own privacy choice interfaces with the goal of making them more usable. Numerous academic papers offer assessment protocols that organizations might use for inspiration. Organizations that want to support their users in making informed privacy choices should do some testing.

Regulators should also consider using our privacy choice evaluation framework as they hold companies accountable. Demands for privacy choice interface usability improvements should be accompanied by requirements for rigorous testing to demonstrate that interface changes have actually improved usability.

Finally, it is important to recognize that even the best privacy choice interfaces are a burden to users. We would like to see more comprehensive privacy regulations, standardized privacy choice interfaces, and "personal privacy assistant" tools1 that can automatically make privacy choices based on a user's preferences. But even with more privacy regulations and user-oriented tools, some privacy choices are likely to require user attention and therefore a user interface—hopefully one that has been tested for usability.


References

1. Colnago, J. et al. Informing the design of a personalized privacy assistant for the Internet of Things. In Proceedings of the Conference on Human Factors in Computing Systems (CHI). ACM, 2020.

2. Cranor, L.F. Cookie monster. Commun. ACM 65, 7 (July 2022), 30–32; https://bit.ly/3wgHbyi

3. Fair, L. $245 million FTC settlement alleges Fortnite owner Epic Games used digital dark patterns to charge players for unwanted in-game purchases. Federal Trade Commission Business Blog. (Dec. 19, 2022); https://bit.ly/3IYdCZV

4. Habib, H. and Cranor, L.F. Evaluating the usability of privacy choice mechanisms. In Proceedings of the Eighteenth Symposium on Usable Privacy and Security (SOUPS'22). (USENIX Association, USA, 2022), 273–289.

5. Habib, H. et al. It's a scavenger hunt: Usability of websites' opt-out and data deletion choices. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). Association for Computing Machinery, New York, NY, USA, 2020, 1–12; https://bit.ly/3ZDL3Xo

6. Habib, H. et al. Okay, whatever: An evaluation of cookie consent interfaces. In Proceedings of the CHI Conference on Human Factors in Computing Systems (CHI '22), (Apr. 29–May 5, 2022, New Orleans, LA); https://bit.ly/3Wg9d7u

7. Nouwens, M. et al. Dark patterns after the GDPR: Scraping consent pop-ups and demonstrating their influence. In Proceedings of the 2020 CHI Conference on Human Factors in Computing Systems (CHI '20). Association for Computing Machinery, New York, NY, USA, 2020, 1–13; https://bit.ly/3GQNjC7

8. Pearman, S. et al. User-friendly yet rarely read: A case study on the redesign of an online HIPAA authorization. In Proceedings on Privacy Enhancing Technologies, 2022(3), 2022; https://bit.ly/3HfSzAM

9. Schaub, F. and Cranor, L.F. Usable and useful privacy interfaces. In An Introduction to Privacy for Technology Professionals. Travis D. Breaux, Executive Ed. (2020); https://bit.ly/3CZ15Bp


Authors

Lorrie Faith Cranor ([email protected]) is Director and Bosch Distinguished Professor in Security and Privacy Technologies, CyLab Security and Privacy Institute, and FORE Systems Professor, Computer Science and Engineering & Public Policy, Carnegie Mellon University, Pittsburgh, PA, USA.

Hana Habib ([email protected]) is Associate Director of Privacy Engineering Academic Programs in the Software and Societal Systems Department at Carnegie Mellon University, Pittsburgh, PA, USA.


Copyright held by authors.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2023 ACM, Inc.


 
