Privacy

Is a Privacy Crisis Experienced, a Privacy Crisis Avoided?


Project Amelia promotional image. Credit: Bricolage

In 2019, in a former railroad terminal in Pittsburgh, PA, a small tech start-up unveiled its groundbreaking AI technology. It was met by a company whistleblower and an untimely zero-day hack on its systems. This sequence of events replayed for audiences of roughly 50 people over eight weeks, each night with a slightly different ending.

The immersive theater production that staged these events, Project Amelia,a was written by a technologist (the first author) and supported by a group of Carnegie Mellon University researchers to expose the public to potential crises of our technical future and to learn from their reactions. Project Amelia set out to confront the incongruence between people's privacy preferences and their behaviors. People's beliefs about privacy and data sharing are often not informed by lived experiences that align risk perceptions with reality. Our hypothesis was that a performed encounter with a plausible privacy crisis might push people to act before facing a real one: an identity theft, an arrest, a stalker, or the stark realization that their data may be baked into a new product that strikes them as dystopian.




Our project brought privacy risk to participants in a safer way: through immersive theater.b Delayed precaution and passivity are not unique to privacy; they also plague climate change, epidemiology, social equity, and other "wicked problems" where we struggle to effect meaningful change. This article describes our attempt to create, and study the impact of, an experiential narrative that gave people an embodied role and the agency to explore a crisis involving individual privacy and autonomy. It expands on a growing body of work on experiential futures, which generates impactful experiences to help people bridge the "gulf"3 between what they see day-to-day and what they might experience if they were someone else, somewhere else, or in a time yet to come.

This performance also served as a testbed for ongoing research. Our team sought to explore whether this immersive experience of a privacy crisis changed the behavior of those who attended. To do this, we observed performances, gathered interaction data, and asked attendees and actors to complete surveys and interviews.


A Primer on Project Amelia

Unlike traditional performances, immersive theater invites audiences to engage with actors in a less-structured format. Instead of taking assigned seats, each member of the audience can freely explore the space, be a "fly on the wall" as scenes are performed, or interact with cast members one-on-one.1 In some cases, the audience can even influence how the narrative unfolds. The result is that each person walks away with an experience unique to how they chose to navigate the event, and each night is particular to the audience that shared it.

Our research team had the opportunity to work closely with the large-scale immersive production Project Amelia,c written and conceived by team member Michael Skirpan and produced by Bricolage Production Company in partnership with Probable Models. The production was a technology-enabled immersive theater experience that invited audiences into the R&D labs of Aura, an imagined near-future tech giant, to participate in the launch of a groundbreaking AI product: Amelia. Performances ran for eight weeks between September and November 2019, six nights per week, typically hosting 50 to 60 people per night (see Figure 1).

f1.jpg
Figure 1. Amelia (center) is introduced to the audience at Aura's project launch event by the CEO (left) and director of research (right) during a performance of Project Amelia.

Prior to each performance, Aura's marketing director reached out to "invited attendees" of the product launch (that is, ticket holders). This initial communication included an online survey used to assign each person a "role" within the performance. Attendees were also invited to link their social media accounts to an audience-specific database that was made available to technology installations and provided to actors to weave into interactions during the performance.

On arrival, guests were checked in, asked to surrender their electronic devices, provided with a dedicated smartphone to take part in the social network of the narrative world, and given an RFID wristband that could be scanned to unlock their real-world data for experiences with several fictitious products.

The show began with tours of Aura's research labs. Each tour was framed for the role of the audience group—for instance, a group of board members would meet the CEO, whereas journalists were shown key products and sold on the company's successes (see Figure 2). After the tours, everyone was called to an auditorium for the launch of the AI product: a sophisticated humanoid android named Amelia. Halfway into the presentation, a former employee arrived to disrupt the company's big debut with a whistleblowing revelation that the company was unethically experimenting on users. While Aura executives scrambled to save face, a hacker planted in the audience tampered with Amelia. These two frictions set the stage for the rest of the show, which had seven unique endings determined by the audience's choices in the interactive scenes that followed (see Figure 3).

f2.jpg
Figure 2. Audience members (middle, right) assigned the roles of journalists do some sleuthing with an actor playing a journalist (left) during a Project Amelia performance.

f3.jpg
Figure 3. Audience members (seated) have an up-close view of an interaction between the whistleblower (left) and Amelia (right) in Aura's laboratory during a performance of Project Amelia.


Themes of Privacy and Computing Ethics

This plot simulated a moment where a company's lack of concern for privacy and ethics led to real human impacts. Through the story itself, private moments between audience and actors, and interactions with technology, an array of themes related to privacy and computing ethics emerged, both to raise audience awareness and to provide an experience with real (albeit fictional) stakes.

One of the key story elements was the revelation by the company whistleblower, Felicia, that Aura was running experiments on its customers' mental health and emotional well-being without any kind of oversight or consent. Felicia slowly leaked hints of an A/B test targeting the mental health of a group of users that ended with dire consequences for some. Those who followed Felicia's storyline participated in 30 minutes of actor-supported sleuthing to learn the details of an experiment eerily similar to the controversial Facebook Emotional Contagion Study.2 The audience's commitment to uncovering the whistleblower's truths and bringing the company to justice was one of the main factors determining the ending. This thread allowed for both interactive learning and deeper reflection on the role of whistleblowers in our society.

Another moment from the story showcased emerging issues in AI privacy and security: a hacker feeding an adversarial input to Amelia. While the Aura executives were distracted, an actor planted in the audience approached Amelia and played an odd set of sounds. The attack was inspired by the infamous DolphinAttack, which uses inaudible voice commands.4 It left Amelia in a state of confusion, unable to identify contexts, emotions, and solutions as the android had previously demonstrated. This moment was designed not only as a teaching tool for a relatively new and complex security threat vector, but also to provoke questions about how AI and other autonomous systems should be governed.
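For readers curious about the mechanics behind that scene, the core idea of the DolphinAttack is that an ordinary voice command can be amplitude-modulated onto an ultrasonic carrier; nonlinearities in a microphone then recreate an audible-band copy of the command that a voice assistant can recognize even though humans hear nothing. The sketch below illustrates only that modulation principle; the sample rate, carrier frequency, and the placeholder tone standing in for a recorded command are our own illustrative assumptions, not details from the show or the original paper.

```python
import numpy as np

# Minimal sketch of the DolphinAttack modulation principle (illustrative only).
fs = 192_000          # assumed sample rate, high enough to represent ultrasound
carrier_hz = 25_000   # assumed carrier above the ~20 kHz limit of human hearing
t = np.arange(int(fs * 1.0)) / fs   # one second of sample times

# Placeholder for a recorded voice command (here simply a 400 Hz tone).
command = np.sin(2 * np.pi * 400 * t)

# Amplitude modulation: the audible command rides on an inaudible carrier.
modulated = (1 + 0.8 * command) * np.sin(2 * np.pi * carrier_hz * t)

# A microphone's nonlinearity acts roughly like squaring the input, which
# recreates a baseband copy of the command for the speech recognizer to hear.
demodulated = modulated ** 2
```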

The themes of privacy and ethics ran deeper than the storyline. Audience members were further engaged in interactive moments or throughlines that gave them time to consider the questions being raised. For instance, two audience members per night were invited by the hacker to join "Cicada," an elite hacking organization trying to bring down Aura. Those who agreed were taken on a parallel shadow track of the show that played out more like an escape room.d They searched for poorly hidden passwords, identified cameras living on insecure internal networks, and picked a lock to potentially shift the show's ending in favor of a hacker's revolt.

Other audience members were given moments of small-group time with Amelia. These often led to fascinating displays of people's inner hopes, fears, and misunderstandings of AI. Patrons would regularly ask Amelia, "[w]hat data are you accessing right now?", which gave Amelia an opening to play with the inferential possibilities of AI and predictive privacy (for example, claiming to infer from a person's body language and tone that they were 'threatening' or 'disturbed'). Sometimes Amelia asked the group whether AI was about to revolutionize the world. These moments often sparked impromptu discussion and debate among the audience.

Cast as board members or key stockholders, some attendees were brought into a boardroom to speak with the CEO following the whistleblower's accusations. They were tasked with deciding how aggressive the company should be toward Felicia. Their decisions impacted what happened in the coming scenes.

Nearly a dozen technology installations and fictitious "beta products" were placed around the performance space. One product, Own Up, challenged people to take part in an entertaining privacy experiment. Audience members could check in to a machine that pulled from their recent social media history and displayed anonymized quotes from players on a large public projection screen. Those who decided to "own up" to what they had said would walk to the center and press a button in front of everyone. Unclaimed quotes sat in a graveyard on the screen for all to consider. Another product, Aura Vision,e made a variety of inferences (often of questionable accuracy) about individuals using only their face, including age, gender, emotion, attractiveness, responsibility, and other attributes.


Audience Interactions and Reactions

Our research team developed an IRB-approved protocol for collecting audience data that we could use to evaluate the impact of the experience. We found immersive theater to be a challenging environment for conducting research that respects participant consent—how do you convey to an audience that has lived in an imaginary world for the evening that our request for consent to use show data for research purposes is actually real? Technical difficulties with the show's Wi-Fi network and low-end smartphones posed additional challenges to our planned automated data collection. We nevertheless managed to collect some data in the form of interviews and surveys after the performances. Only a small fraction of audience members answered in-depth questions about their privacy intentions and behaviors. While the show did not give most participants specific skills for managing their privacy, our data suggests it was highly successful in equipping participants with motivation and broader frameworks for discussing privacy and making decisions. For example, one audience member said, "I walked away from Project Amelia with more awareness of my presence on the Internet, determined I wanted to manage the information that circulates [online]."




The inclusion of many different stakeholder characters and open dialogue equipped some audience members to pause and consider decisions that others might view as less privacy protective. Several weeks later, we asked some audience members to recall any privacy-related actions they took as a result of Project Amelia. One audience member explained how they decided to acquire a smart device: "I had always been kind of creeped out by smart speakers. But, I had the opportunity to get one for free and I thought I would try it out. Project Amelia brought up the ways technology can make life easier. While I don't completely trust the smart speaker algorithm, I decided I was okay with giving up that portion of privacy to have the connectivity."


Privacy Narratives Open the Door

Immersive theater and other narrative approaches can open new doors in online privacy education. Project Amelia afforded audiences and actors the opportunity to safely try on roles, behaviors, and opinions not available to them in everyday life. While how-to workshops and resources certainly have their place, technologists ought not to underestimate the power of narrative and open-ended dialogue in effecting behavior change. In a survey, one audience member illustrated how the show shaped discussions with friends: "Project Amelia gave me a new way to initiate and frame conversations with family/friends. If you're talking about a theatrical production, people don't shut down as quickly. Once you're into the conversation, you can turn it toward reality."

Reflecting on the experience, our research team believes we could have been more successful in collecting data had we prioritized a smaller set of data targets and relied less on automated collection methods. Considerations for future researchers include:

  • Offer space for the audience immediately after the show, in the form of talk-backs and other dialogues, to help them unpack the experience and to understand and participate in further research and conversation. We implemented nightly talk-backs halfway through the production, which increased research engagement.
  • Low-tech collection methods can be as powerful as high-tech ones—a traditional interview booth or paper or email surveys may gather data with fewer points of failure.
  • Build in research touch points beyond the day of the performance, such as surveys and sign-ups for conversations days or weeks after the event.

Our team hopes to see more collaborations similar to Project Amelia, in which artists and writers produce interesting, engaging, and thought-provoking content; researchers and industry players explore important questions at the heart of societal life; the audience actively participates in exploring an invented world; and all collectively enjoy and learn from the experience. Project Amelia focused on pressing privacy and ethical dilemmas, but the approach can be borrowed and applied to many other areas, using fictional worlds to thoughtfully and positively advance our real world.


References

1. Biggin, R. Immersive Theatre and Audience Experience. Palgrave Macmillan, Basingstoke, 2017.

2. boyd, D. Untangling research and practice: What Facebook's "emotional contagion" study teaches us. Research Ethics 12, 1 (2016), 4–13.

3. Candy, S. and Kornet, K. Turning foresight inside out: An introduction to ethnographic experiential futures. Journal of Futures Studies 23, 3 (2019), 3–22.

4. Zhang, G. et al. DolphinAttack: Inaudible voice commands. In Proceedings of the 2017 ACM SIGSAC Conference on Computer and Communications Security (CCS '17). ACM, New York, NY, 2017, 103–117; DOI: https://doi.org/10.1145/3133956.3134052


Authors

Michael Skirpan ([email protected]) is Executive Director at Community Forge and Special Faculty in the Institute for Software Research, Carnegie Mellon University, Pittsburgh, PA, USA.

Maggie Oates ([email protected]) is a creative technologist and privacy researcher working on OnlyBans, a game by sex workers about sex work and technology at onlybansgame.com. Pittsburgh, PA, USA.

Daragh Byrne ([email protected]) is an Associate Teaching Professor at the School of Architecture, Carnegie Mellon University, Pittsburgh, PA, USA.

Robert Cunningham ([email protected]) is the Vice Chancellor for Research Infrastructure and a Professor in the Department of Electrical and Computer Engineering and the School of Computing and Information, University of Pittsburgh, Pittsburgh, PA, USA.

Lorrie Faith Cranor ([email protected]) is Director and Bosch Distinguished Professor in Security and Privacy Technologies, CyLab Security and Privacy Institute and FORE Systems Professor, Computer Science and Engineering & Public Policy, Carnegie Mellon University, Pittsburgh, PA, USA.


Footnotes

a. See https://bit.ly/3nCzRZX

b. Immersive theater is in the family of dramatic methods that includes experiential learning, legislative theater, and futures work. Other technology-related works include Arizona State University's "Emerge: A Festival of Futures" (https://bit.ly/32aEdjg) and Science Gallery Dublin's Grow Your Own (https://bit.ly/3FA3tgw). Those who are curious might consult Dunne and Raby's 'Speculative Everything' (MIT Press, 2013) as well as the work of Superflux (https://bit.ly/33QpUB3), the Near Future Laboratory (https://bit.ly/3nCCg6V), the Situation Lab (https://bit.ly/3ADqzSF), the Institute for the Future (https://bit.ly/3qDMkOD), and the Center for Science and the Imagination (https://bit.ly/3qF6ojJ).

c. This episode of the PBS television series "Immersive World" offers a detailed look at the performance: https://to.pbs.org/33MP8QC

d. Escape rooms are group puzzle games that are usually done in staged environments where participants are locked in a room and must complete all puzzles, often related to an underlying story, in order to get out and ultimately "win."

e. A remake of the computer vision exhibit "Biometric Mirror" by Niels Wouters; see https://bit.ly/3GCls7k


Copyright held by authors.



 
