
Communications of the ACM

Privacy and security in highly dynamic systems

Legal Programming


Internet and online technologies, such as automated trading platforms, shopping bots, and Web services, are seen as a stage toward a new technological paradigm: the highly dynamic system (HDS). HDS are evolving, complex, dynamic, and data-intensive systems characterized by autonomous components (agent technologies), constant growth (data is pervasively and persistently collected, stored, and communicated), and dynamic negotiation among representations of multiple participants and stakeholders (with possibly conflicting requirements). In supply chain management systems such as those of retail chains, proto-HDS are evolving to use sensor-enhanced products, packaging, and transportation methods to automatically trigger transactions among participants (such as shelf or warehouse monitoring and repositioning) that are not personally authorized. In contrast with standard shopping, or even online shopping as provided by Peapod.com or Amazon.com, it is the automated and unconscious nature of the new HDS processes' interactions with the parties that raises increasingly challenging legal and social issues. Moreover, providing these services involves processing a large amount of personal or legally relevant data, including physical position, personal profile (history of purchases, weekly or current shopping list, family details), shopping cart contents, credit card information, and contact details.

In the online realm, however, as in HDS, many basic commercial online transactions do not comply with the law, especially in the European context, where digital transactions are more heavily regulated than in the U.S. with respect to privacy, electronic communications, and online contracting. Even transactions that stick to the letter of the law generally fail to meet industry codes of conduct (commercial emailing, banners, and pop-ups) and users' expectations of fair dealing. Transaction terms are unequal, as the (technically) weaker party, usually the buyer, has to accept the terms of the other. Table 1 lists some potential infringements.

The characteristics of HDS—change, autonomy, growth, negotiation—exacerbate the potential for misuse and illegality. What's more, the degree of semiautomated and even fully automated processing in HDS can generate further risks due to programming errors, data inaccuracy, or security failures. A formalized complement to the technical world is necessary to address these legal hurdles, as will be shown here in the particular context of enhanced shopping and privacy.


In-Store Enhanced Shopping

To identify some of the barriers facing HDS, we examine a scenario that, given its nature, poses legal issues common across HDS [10]. In a personalized, information-intensive shopping experience, customers within a supermarket interact with and purchase RFID-enhanced products via wireless devices and automated software applications. Merchants also use wireless communications, "active" products, and automated technologies for improved store management and for selling processes based on the customer's shopping cart. Specific processes in the research scenario include RFID-based customer recognition and access to enterprise systems (or logging on via the supermarket's consumer interface), monitoring of the customer's position and shopping cart contents, profile updating, offers of new products or services based on shopping cart content data combined with this profile, and automated replenishment and repositioning. Different sensor systems may increase the quantity of data and the number of process "triggers," including CCTV combined with face recognition software, and RFID- or video-based control of shopping cart movements. If such a scenario becomes technically feasible, socially acceptable, and legally compliant, there are many potential customer benefits, including a personalized experience, better searching in the real world, less out-of-stock merchandise, lower costs due to more efficient supply chain management practices, and new value-added intermediaries.

Particular outcomes, if appropriate safeguards are not implemented in this scenario, could include:

  • A competitor store, picking up the consumer's personal RFID tag from nearby (distance matters), may contact the customer via SMS or telephone claiming its products are less expensive.
  • Undesired products and services can be "pushed" at the consumer via the customer interface, based on data mining of the customer profile and real-time monitoring of the customer's position and shopping cart contents, regardless of whether the customer has consented to or refused such advertisements.
  • Acceptance of on-screen offers may lead to unknowingly concluding contracts to purchase undesired products, through wireless security leaks or "on-the-go" identity theft. Automated renewal of a weekly shopping list may lead to the same result.

This scenario is comprehensive: the number of potential actors involved (store, advertisers, manufacturers, third-party service providers, digital notaries, financial intermediaries, and so forth), the openness of the environment, and the complexity of the transactions together involve most of the players and raise most of the issues found in HDS. This grocery-based scenario also poses the same challenges faced in other HDS, such as financial markets, advertising processes, logistics, and transportation, so the analysis here is generalizable, given the pervasive need to comply with underlying contract, privacy, intellectual property rights (IPR), tax, and other laws.


Legal Programming: Approach and Methodology

The concept of legal programming encompasses both technical and non-technical actions and methods to prevent a computer system such as an HDS from behaving illegally. While traditional programming often incorporates legal compliance as an afterthought (a privacy policy, an online contract) or in an ad hoc way, subservient to the principal technical and business processes, our approach systematically embeds legal constraints higher up in the design, in the process model.

The first step in this methodology is to carry out a business process analysis of transactions [7]. This analysis involves clarifying, within the firm's overall "system," the different agents involved in these processes and establishing their roles and relationships—providing a systematic and comprehensive description of the agents/roles, objects, and relationships of the processes in the system. Table 2 briefly illustrates this description for a set of processes in the scenario described here.

In the process model, activities are linked together with indications of the flow from one activity to the next. These indications specify the conditions and constraints on carrying out one process and executing the next. Examples include structural constraints that determine the static aspects of the model and action constraints that define its dynamic aspects. Within this model, constraints can be expressed in a variety of ways: through formal logic, as static process constraints embedded within the business model, or more dynamically as explicit rules (for example, workflow in XPDL, PIF, or another workflow language, or in a business rules language). These include legal constraints, such as rules for negotiation, notification and consent, contracting, or payment functionalities: which data can be collected, under what conditions, providing identification or obtaining consent, and so forth.
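To make this more concrete, consider the following minimal sketch (our own illustration, not tied to any particular workflow standard; names such as ActionConstraint and ProcessStep are hypothetical) of how an action constraint carrying a legal rule might be attached to a process step and checked before the step is allowed to execute:

```python
from dataclasses import dataclass, field
from typing import Callable, Dict, List

@dataclass
class ActionConstraint:
    description: str                   # human-readable legal rule
    legal_basis: str                   # regulation or code of conduct it derives from
    predicate: Callable[[Dict], bool]  # test evaluated against the transaction context

@dataclass
class ProcessStep:
    name: str
    constraints: List[ActionConstraint] = field(default_factory=list)

    def may_execute(self, context: Dict) -> bool:
        # The workflow engine fires this step only if every legal constraint holds.
        return all(c.predicate(context) for c in self.constraints)

# Hypothetical example: updating the customer profile requires recorded consent.
profiling = ProcessStep(
    name="update_customer_profile",
    constraints=[ActionConstraint(
        description="Customer consent obtained for profiling",
        legal_basis="Directive 95/46/EC, Art. 7(a)",
        predicate=lambda ctx: ctx.get("consent", {}).get("profiling", False),
    )],
)

print(profiling.may_execute({"consent": {"profiling": False}}))  # False: step must not fire
```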

This approach helps design and build an architecture and/or workflow that integrates legal compliance and other requirements for both human and automated processes, as well as for third-party services interacting with the system, even before programming starts. It also ensures not only that the resulting systems are legal, but also that they can remain independent from (or specifically incorporate) laws and codes of conduct from different jurisdictions and sectors. For example, in the area of privacy, Europe's laws are generally applicable to all sectors, while in the U.S. privacy regulation tends to be sector specific (for example, finance and health). Establishing a method for embedding constraints at the process level enables a business to differentiate where and in what context processes are carried out and to apply the relevant rules accordingly.

This approach is particularly useful in HDS, where computers gain in "intelligence" through increased data collection and automation and can, to a certain extent, substitute for human involvement. In the example scenario here, whereas previously the store may have promoted goods through displays and signs, the same processes are now carried out by software that analyzes and processes the customer's position, profile, and the store's or third parties' knowledge bases. Modeling HDS processes enables us to apply regulations regardless of whether a process is carried out by human or machine.

Process modeling can also apply to a wider organizational form, the business network [4]. The HDS transactions within this scenario are not just in-store, but also occur between retail supply chain participants, customers, and other Web merchants. In this multiparty environment, the legal risks illustrated here increase. "Networked business" process modeling that minimizes these risks can multiply the benefits of single-company modeling.


Privacy Challenges

While the processes in the research scenario illustrated in Table 2 raise concerns in several areas of law (such as contract or IPR), the focus here is on privacy issues, which are a major cause of user uncertainty and of a perceived lack of control. It is also important to note that, while the focus here is on these specific issues, security mechanisms are fundamental to bear in mind when considering privacy-compliant HDS [1].

The European legal framework (the 1995 European Data Protection Directive, 95/46/EC) requires any computer system design to incorporate high levels of privacy protection in terms of information, security, confidentiality, and integrity. Most of the data processed within the scenario is personal data as defined by law, which regulates the collection, storage, processing, and transmission of this data, and applies particularly to data storage and mining, customer profiling, and automated decision making. These processes must comply with the principles of purpose limitation, data quality and proportionality, social justification, transparency, and security, and the implementing laws severely restrict the processing of sensitive data (such as religious or political views).

Beyond the now traditional privacy-invasive activities and processes in online commerce and even real-world shopping (such as CCTV, cookies, and credit card details), the HDS described here presents certain specific risks in relation to personal data processing. While the risks posed by the autonomy of certain software systems and, more so, the capacity for multiplied interactions with IT systems through the "augmented reality" scenarios may have been overblown,1 there are a number of legitimate concerns [6]. These include:

  • Wider coverage (through interaction with RFID-enhanced everyday artifacts) at home, in the office, and in shops;
  • Loss of awareness, while data is collected from "disappearing computers";
  • More data collected from these sources, 24/7;
  • New data types (physical location, biorhythms); and
  • More processing (profiling, data mining).

This change in personal data processing heightens risks, as it provides greater scope for breaching privacy regulation, including collecting, storing, and processing data without appropriate consent and/or for undisclosed purposes; unsupervised automated decision making; unauthorized communications to third parties; and security breaches. Note that governments are now requesting that sophisticated personal data be made available by information providers (for example, the U.S. government subpoenaed Google for its international search records, highlighting the need to clarify what can and cannot be obtained and by whom). More specifically, using the process-based approach described here, the privacy issues—including those set out in the scenario described—are shown in Table 3.

Two issues are particularly interesting due to the pervasive characteristic of HDS: automation and profiling.

  • Automation is based on delegation: users must place a degree of confidence in the HDS—a belief that it will operate as programmed and will not be attacked or leak information. While operating autonomously, the system may be required or forced to reveal an increasingly wide variety and amount of information about the user that this person may not wish to share (contracting or credit card details, personal lifestyle data, and so forth). A mismatch between intention and actual processing leads to mistrust, increasingly so as ubiquitous systems gain sophistication and reach, and the processes within them evolve toward greater autonomy.
  • Based on this automation and additional data from sensors, HDS may use increasingly detailed profiles to offer customized services or differentiated prices [9] and to improve supply management. While users normally have a certain amount of discretion about how much personal data to reveal, today's data mining capacity creates the potential for far more significant data collection and exploitation outside the subject's control (including illegal transfers to third parties), touching on the most sensitive personal matters, including finances, relationships, illnesses, insurance, and employment.

The multiplicity of parties involved in HDS also raises difficulties: damage or liability may result not just through the activity of the enterprise HDS but through interactions with data, files, hosts, or systems belonging to other entities. Determining the party ultimately responsible for causing the damage or apportioning responsibility is increasingly complex, especially where a number of causes could be equally responsible for the resulting damage.


Privacy Compliance in HDS Systems

First-line privacy compliance. Most privacy issues outlined here may not be difficult to solve in a closed environment where "Data Subject" notification and consent to the processing should not be problematic:

  • The customer can be notified and his/her consent obtained in relation to the processing through express notice and consent or, potentially, through the automated expression of policies [11].
  • Inference systems processing customer profiles and behaviors (for extrapolating rules of behavior and general customer profiles and determining new goals) may do so working on anonymous or pseudonymous data.
  • Appropriate firewalls, passwords, and other standard digital security management processes can guarantee internal security.

However, there may be practical consequences relating to privacy policies and obtaining consent that would reduce HDS dynamics and automatism. Published privacy policies suffer from at least two problems. First, it is likely that in HDS there is no reliable way of proving that a certain privacy policy was declared. Second, there is no way to ensure that any privacy policy is actually carried out in relation to the data in question. Solutions for the first include mechanisms for maintaining policy data integrity, authenticity, and non-repudiation. The second can only be checked via privacy auditing, until personal data is tagged in a manner that allows the "Data Subject" to monitor the data processing [3].
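As a rough illustration of the tagging idea (a sketch of our own, not the XACML mechanism cited in [3]; the record layout and field names are hypothetical), personal data could carry its governing policy together with an integrity digest, so that any alteration of the data or of the declared policy becomes detectable:

```python
import hashlib
import json
from dataclasses import dataclass, asdict

@dataclass
class StickyPolicy:
    purposes: tuple = ()        # purposes the Data Subject consented to
    recipients: tuple = ()      # parties the data may be disclosed to
    retention_days: int = 0     # how long the data may be kept

@dataclass
class TaggedRecord:
    subject_id: str
    payload: dict               # the personal data itself
    policy: StickyPolicy
    digest: str = ""

    def _body(self) -> bytes:
        return json.dumps({"payload": self.payload, "policy": asdict(self.policy)},
                          sort_keys=True).encode()

    def seal(self) -> "TaggedRecord":
        # Bind the data to its governing policy; in practice a digital signature
        # by the data controller would replace this plain hash.
        self.digest = hashlib.sha256(self._body()).hexdigest()
        return self

    def verify(self) -> bool:
        return hashlib.sha256(self._body()).hexdigest() == self.digest

record = TaggedRecord(
    subject_id="customer-42",
    payload={"cart": ["milk", "bread"], "location": "aisle 7"},
    policy=StickyPolicy(purposes=("checkout",), recipients=("store",), retention_days=30),
).seal()
print(record.verify())  # True until either the data or its attached policy is altered
```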

Consent for personal data processing, such as that needed from the consumer in the preceding scenario, must be an unambiguous, freely given, specific, and informed indication of a person's wishes. With RFID-based systems, any automated consent inferred from the use of such technologies could be inadequate, even if the HDS processes the shopper's data automatically in response to certain prespecified circumstances (similar to P3P applications, with proposed use of automated and/or RFID-based identification and privacy negotiations [2]). The granularity of this process may be adversely affected by any automated learning processes, whereby the evolution between initial programming and evolved system may break any links of notification, consent, and causality.
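For illustration only (this is not the P3P matching algorithm of [2]; the preference and policy structures below are invented for the sketch), an HDS could at least refuse to infer consent whenever the store's declared practices exceed the preferences the shopper has expressly registered:

```python
# Consent is inferred only when every purpose the store declares for a data
# category is covered by the shopper's stated preferences for that category.
shopper_preferences = {
    "location": {"checkout"},
    "purchase_history": {"checkout", "loyalty"},
}

store_policy = {
    "location": {"checkout", "third_party_marketing"},
    "purchase_history": {"checkout"},
}

def consent_can_be_inferred(preferences: dict, policy: dict) -> bool:
    return all(purposes <= preferences.get(category, set())
               for category, purposes in policy.items())

if not consent_can_be_inferred(shopper_preferences, store_policy):
    # Fall back to express notice and consent rather than silent processing.
    print("Explicit consent required: declared purposes exceed stated preferences")
```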

Certain "Privacy Enhancing Technologies" (PETs)—tools and processes of digital processing to reduce or remove the privacy threats, such as anonymizers, data strippers, authentication systems, or interfacing with trusted third parties—have been suggested for ensuring confidential or secure processes within these systems. However, these technologies do not deal with the fundamental issue that it is the processes themselves that must be privacy compliant—not just their technological support and implementation. To ensure this, an approach that combines both technologies and business and legal methods is needed.

Legal programming for privacy compliance. Given the issues raised by automation (including automated, inferred, or dynamically negotiated consent, system evolution through learning processes, and the potential gap between personal intentions and actual processing), a process-based approach should cause the HDS to incorporate at a high level, in its process model, constraints that ensure processing complies with specifically determined privacy rules (regulation or codes of conduct). These constraints are applied to the processes that cause the problems mentioned earlier: obtaining the consumer's consent for the collection and storage of personal data and for its disclosure or transfer to third parties, restricting the scope of automated decision making on the basis of the client profile, and verifying that any use of the data—including transfer to third parties—complies with the purposes set out in the corporate or specific policy. This holds regardless of the eventual evolution of system capacities: the constraints themselves cannot evolve until the legal framework changes.
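A minimal sketch of such a process-level gate (hypothetical names; the legal mapping is indicative, not a compliance recipe) might combine a purpose-limitation check with a restriction on fully automated decisions that significantly affect the customer:

```python
from dataclasses import dataclass

@dataclass
class ProcessRequest:
    purpose: str              # declared purpose of this processing step
    categories: tuple         # personal data categories it touches
    automated_decision: bool  # True if no human reviews the outcome
    significant_effect: bool  # True if the outcome significantly affects the customer

def compliant(req: ProcessRequest, consented_purposes: dict) -> tuple:
    # Purpose limitation: every category touched must carry consent for this purpose.
    for cat in req.categories:
        if req.purpose not in consented_purposes.get(cat, set()):
            return False, f"no consent for '{cat}' under purpose '{req.purpose}'"
    # Fully automated decisions with significant effects are routed to human review
    # (in the spirit of Art. 15 of Directive 95/46/EC).
    if req.automated_decision and req.significant_effect:
        return False, "requires human review before execution"
    return True, "ok"

consented = {"purchase_history": {"checkout", "loyalty"}, "location": {"checkout"}}

print(compliant(ProcessRequest("loyalty", ("purchase_history",), True, True), consented))
print(compliant(ProcessRequest("third_party_marketing", ("location",), False, False), consented))
```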

This approach should also help ensure that enterprise systems—whether traditional computing or HDS—comply with corporate privacy policies and with data subject preferences attached to certain personal data. If the corporate privacy policy is modeled, the rules and constraints set out in the model and applicable to certain identified data (personal data) can be applied to all processes within the general corporate IT architecture. This ensures that even autonomous processes that reason about and negotiate their (commercial) behavior do so in accordance with the (corporate and legal) principles and constraints embedded in the corporate data model—a form of legalizing or internal policing of the HDS processes.

In addition, the data protection principles embodied in the legal texts themselves—along with other legal rules such as IPR and consumer protection principles—could be modeled and translated into computer-understandable languages, such as UML and workflow representation languages. This model can then be applied to the corporate business process model, as outlined here, to verify its degree of compliance, independently of how the latter is implemented through technology—whether traditional computing, software agents, or HDS.

Finally, modeling these privacy processes and embedding the legal constraints in the architecture should also increase interoperability between business applications (automated trading platforms, for example), to the extent that these applications share common policies and constraints (that can be matched), or can integrate third-party constraints into their own process model.


Conclusion

HDS for augmented shopping scenarios like the one used in this article are technically possible today and have the potential to provide significant value to the customer as well as cost reductions over alternative solutions. However, a distinct challenge created by the HDS technology paradigm is generating trust and confidence, especially with regard to personal data, confidentiality and security, and the protection of property rights in virtual systems [8]. User mistrust is heightened by the potential for privacy infringements outlined here (autonomous, data-intensive processes), whereby the user is uncertain about what the system is doing, and even whether the system is acting according to specification. It therefore becomes essential to find mechanisms such as legal programming to develop trusted relationships both between participants and between users and the "machine" [5], and to incorporate legal compliance—specifically privacy compliance—at design time. Note that although the focus here has been at a particularly granular level, on a specific transaction in an HDS retail scenario, this approach may be usefully applied in similar projects, even if only to raise awareness of legal and social risks.


References

1. Basin, D., Doser, J., and Lodderstedt, T. Model-driven security for process-oriented systems. In Proceedings of the Eighth ACM Symposium on Access Control Models and Technologies (SACMAT 2003); doi.acm.org/10.1145/775412.775425.

2. Cranor, L., Langheinrich, M., Marchiori, M., and Reagle, J. The Platform for Privacy Preferences 1.0 (P3P1.0) Specification. W3C Candidate Recommendation; www.w3.org/TR/P3P/.

3. eXtensible Access Control Markup Language (XACML) Version 2.0. OASIS Standard, February 2005; www.oasis-open.org/committees/tc_home.php?wg_abbrev=xacml.

4. Giaglis, G.M., Papakiriakopoulos, D.A., and Doukidis, G.J. An analytical framework and a development method for interorganizational business process modeling. International Journal of Simulation 2, 2 (2002), 5–15.

5. Kenny, S. and Borking, J. The value of privacy engineering. The Journal of Information, Law and Technology 2002; elj.warwick.ac.uk/jilt/02-1/kenny.html.

6. Langheinrich, M. Privacy by design—Principles of privacy-aware ubiquitous systems. In Proceedings of Ubicomp 2001, Springer-Verlag; www.inf.ethz.ch.

7. Malone, T.W. et al. Tools for inventing organizations: Toward a handbook of organizational processes. Management Science 45, 3 (1999), 425–443.

8. Sackmann, S., Strüker, J., and Accorsi, R. Personalization in privacy-aware highly dynamic systems. Commun. ACM 49, 9 (Sept. 2006).

9. Strüker, J., Sackmann, S., and Müller, G. Case study on retail customer communication applying ubiquitous computing. In Proceedings of the 2004 IEEE International Conference on E-Commerce Technology (CEC'04), 42–48.

10. Subirana, B. and Bain, M. Legal programming—Designing legally compliant RFID and software agent architectures for retail processes and beyond. Integrated Series in Information Systems, Vol. 4, 2005.

11. Yee, G. Using privacy policies to protect privacy in UBICOMP. In Proceedings of the IEEE 19th International Conference on Advanced Information Networking and Applications (AINA 2005—Volume II). (Mar. 2005, Tamkang University, Taiwan).


Authors

Brian Subirana ([email protected]) is an associate professor of Information Systems at IESE Business School in Barcelona, Spain and a visiting professor at the MIT Auto-ID Laboratories in Cambridge, MA.

Malcolm Bain ([email protected]) is a partner at LegisTICs law firm in Barcelona, Spain.


Footnotes

1See Ashton, K. Testimony to California State Senate Subcommittee on New Technologies, August 2003. RFID Privacy Workshop, MIT, Nov. 2003; Albrecht, K., RFID: Privacy and Societal Implications; and Kumar, R. Interaction of RFID Technology and Public Policy; www.rfidprivacy.org/agenda.php.


Tables

Table 1. Infringements in online transactions.

Table 2. General process analysis of transactions within the scenario.

Table 3. Identification of privacy risks per process.



©2006 ACM  0001-0782/06/0900  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2006 ACM, Inc.


 
