Communications of the ACM

Internet Privacy Concerns Confirm the Case For Intervention



Cyberspace is invading private space. Controversies about spam, cookies, and the clickstream are merely the tip of an iceberg. Behind them loom real-time person-location technologies including intelligent transportation systems, geo-location, biometric identification, hard authentication techniques, and miniaturized processors embedded in plastic cards, anklets, watches, rings, products, product packaging, livestock, pets, and people.

It's small wonder that lack of public confidence is a serious impediment to the take-up rate of consumer e-commerce. The concerns are not merely about security of value, but about something much more significant: trust in the information society.

Conventional thinking has been that the Internet renders laws less relevant. On the contrary, this article argues that the current debates about privacy and the Internet are the harbingers of a substantial shift. Because the U.S. has held off general privacy protections for so long, it will undergo much more significant adjustments than European countries.

Privacy is often thought of as a moral right or a legal right. But it's often more useful to perceive privacy as the interest that individuals have in sustaining a personal space, free from interference by other people and organizations.

Personal space has multiple dimensions, in particular, privacy of the person (concerned with the integrity of the individual's body), privacy of personal behavior, privacy of personal communications, and privacy of personal data. Information privacy refers to the claims of individuals that data about themselves should generally not be available to other individuals and organizations, and that, where data is possessed by another party, the individual must be able to exercise a substantial degree of control over that data and its use. (Definitional issues are examined in [6].)

Information privacy has been under increasing threat as a result of the rapid replacement of expensive physical surveillance by what I referred to in Communications over a decade ago as "dataveillance:" the systematic use of personal data systems in the investigation or monitoring of people's actions or communications [2].

Intensive data trails about each individual provide a basis for the exercise of power over them by public and private sector organizations. Profile data can be combined with sender-driven technologies, to push customized information at each individual, and thereby exercise significant influence over their behavior, and reduce their freedom of thought and action.

Along with its dramatic impacts for social and economic good, the mainstreaming of the Internet has accelerated the privacy-negative impacts of information technology. Specific Internet privacy issues are examined in [7]. Marketing, technological and provider imperatives are creating tendencies for hitherto anonymous events to be converted into identified transactions, resulting in yet more data trails available to be trawled and mined. To facilitate data consolidation, governments and corporations make spasmodic attempts to impose multipurpose human identifiers [3]. During 1998, for example, U.S. government agencies were actively seeking to establish two multipurpose national identification schemes, one building on consolidation of state driver licenses, and the other in the healthcare sector.


Privacy Protection and the Crisis in Public Confidence

An important implication of the definition of privacy as an interest is that privacy has to be balanced against many other, often competing, interests, of the individuals themselves, of other individuals, of groups, and of society as a whole. The balancing process is political in nature, involving the exercise of power deriving from authority, markets, or any other available source.

Against the ravages of technology-driven privacy invasion, natural defenses have proven inadequate. Data is increasingly collected and personalized. Storage technology ensures that it remains available. Database technologies make it discoverable. And telecommunications enables its rapid reticulation. Organizations are only faintly restrained by professional and industry association codes.

Cost constraints continue to diminish rapidly. In any case, economic limitations have simply not acted as a constraint. In the case of public sector data matching programs, for example, cost/benefit analysis is seldom performed voluntarily. There are many serious deficiencies in the few analyses that have become publicly available, and programs are continued even after they have clearly been demonstrated to be financially unjustifiable. In the private sector, many highly privacy-invasive practices are routinely undertaken by corporations precisely because they are economic from the organization's own, limited perspective. In the natural state, companies' interest in efficiency dominates consumer interests, and hence the exercise of corporate power over people is endemic.

Exercise of countervailing power by individuals has had limited impact. Perhaps the most effective campaign to date has been the public's inaction. Business and governments in most advanced countries have attributed the slow adoption of e-commerce to a severe lack of trust by consumers and small business in corporations and governments. Trust in e-commerce is dependent on multiple, interacting complex factors including consumer rights, freedom of expression, and social equity. This article directly addresses only the central element of privacy.


Should We Abandon Privacy as a Social Value?

Some government and corporate executives depict the increasing problems, and the inadequacies in the protective framework, as evidence that privacy is not only dead, but ought to be dead. They perceive the public is not to be trusted, and that the benefits of modern society must only be granted in return for substantially enhanced organizational access to personal data. The argument has been put forward in such contexts as the prevention of fraud on the public revenue, on credit grantors, and on insurance companies; the efficient gathering of taxes; and the efficient marketing of goods and services. Law enforcement and national security are also recurring themes.

Claims of this kind seem to be out of touch with the realities of the networked world. The influence of the nation-state is under serious threat because of the power of multinational corporations, regionalism, globalism, the Information Infrastructure, and the new patterns of information society and information economy.




In the past, only the wealthy had ready access to transjurisdictionality (whereby transactions are contrived in such a manner that key elements occur in different countries or states) and extrajurisdictionality (whereby a key element of a transaction occurs in a "regulatory haven"). In the developed world, the Internet has dramatically changed the cost profile of such maneuvers, and with it their accessibility by organizations and people of lesser means.

The Internet is also enabling effective suprajurisdictionality, whereby acts are subject to no jurisdiction at all. The high seas have always required special legal treatment, and the law of space (such as the apportionment of liabilities for events that occur in earth-orbit) has added new challenges. The Internet creates the ability to contrive acts to take place in undefinable or undiscoverable geographic space, such that no courts (even of a powerful and bold country) could convincingly claim jurisdiction. Governments are thereby losing their existing power to impose their desires on their governed populations, and the threshold has slid down well below megacorporations and the seriously rich.

Meanwhile, corporations are disintegrating (in accordance with various fashions including outsourcing, downsizing, telecommuting, and virtualization), in order to take advantage of the economies of small-organization flexibility and adaptability, and owner-manager tendencies to underquote and overwork. This is less evident than it might be because ongoing concentration of financial control tends to mask the changes in organizational processes and structures. If these trends continue, power within western economies may come to be exercised through the scale and persuasiveness of alliances rather than of single corporations.

Meanwhile, the Internet offers new opportunities for consumers and citizens to exchange information about the behavior of organizations, and to organize actions against them. Corporations that seek to sustain collusive arrangements may be increasingly capable of withstanding government pressure, but less able to hold off consumer groups increasingly well organized through constructive use of the Internet.

In short, rationalist economic precepts may be convincing to government officials, and the business efficiency criterion may be axiomatic to corporate executives; but they may not be convincing to their citizen-consumer adversaries. If a powerful populace of the mid-21st century demands privacy, it might be quite capable of getting it. There is no irresistible force toward dehumanization. We can choose.


Is Ubiquitous Transparency A Better Form of Protection?

A less extreme attack on the importance of privacy is the argument for greater transparency, expressed most lucidly in Brin [1]. Based as much on rampant, uncontrolled growth in visual surveillance as on the Internet, Brin argues the technological imperative is irresistible; and that privacy protections are futile. He contends that privacy can only be sustained by focusing instead on freedom of information for everyone: to achieve privacy, rely on freedom, not secrecy.

Brin's argument can be succinctly expressed as:

  • Q: Who will keep a watch on the watchers?
    A: The watched.

His antidote is ubiquitous openness, with the powerful just as subject to visual and data surveillance as everyone else. Police will be judged by the viewers who (using the Internet) watch them watching others.

Brin's argument is based on the premise that the watchers will not exercise political power in order to preclude others from watching them. The history of societies suggests there have always been uneven distributions of power, and that the powerful have had incentives, and in most cases the ability, to exercise their power and to resist its diminution. It would appear that Brin's transparent society will be achieved only if patterns repeated across millennia of human experience can be overturned in short order.

So his argument is undermined by the implicit presumptions that the less powerful are more powerful than the more powerful, that no one will succeed in establishing enclaves of privilege, and that the actions of all will really be able to be monitored by all. Brin's counterargument1 is that the powerful will only be as successful in avoiding observation as they already are in resisting privacy laws that offend their own interests.

This (necessarily cursory) examination of alternatives to privacy protection concludes that they cannot deliver what is needed right now, which is a solution to the crisis in public confidence in IT that is being brought to a head by the rapid growth and far-reaching impact of the Internet.


Can Corporate Innovation Solve the Problem?

To address the unwillingness of consumers to transact electronically, some industry associations have established codes of conduct and trademarks, supplemented by audits and amended terms of contract. A cross-industry initiative of importance is TRUSTe (see Benassi in this section). Another is the WebTrust initiative of North American accounting associations.

In parallel, technological measures have been proposed. Some anonymous and pseudonymous mailers and Web-surfing aids have been motivated by a desire to enable people to protect themselves against organizations. Others, however, have been driven by corporate appreciation that lack of confidence is bad for business, and undermines effective government. Examples of such tools are featured throughout this section.

An even more substantial standard has been developed by the business-funded World-Wide Web Consortium (W3C). The Platform for Privacy Preferences (P3P) is an especially important architectural innovation (see Reagle and Cranor in this section).
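The architectural idea behind such platforms is mechanical enough to sketch. The toy matcher below is illustrative only — the dictionary fields are invented, not P3P's actual vocabulary — but it captures the pattern: a site declares its data practices in machine-readable form, and the user's agent compares that declaration against the user's stated preferences before any data flows.

```python
# Illustrative only: a toy policy matcher in the spirit of P3P.
# The real protocol defines a formal vocabulary; these field names are invented.

site_policy = {"purposes": {"current", "admin", "marketing"},
               "retention": "indefinite"}

user_prefs = {"forbidden_purposes": {"marketing"},
              "allow_indefinite_retention": False}

def acceptable(policy, prefs):
    """Return True if the site's declared practices satisfy the user's preferences."""
    # Reject if the site declares any purpose the user has forbidden.
    if policy["purposes"] & prefs["forbidden_purposes"]:
        return False
    # Reject indefinite retention unless the user permits it.
    if policy["retention"] == "indefinite" and not prefs["allow_indefinite_retention"]:
        return False
    return True

print(acceptable(site_policy, user_prefs))  # False: the site declares marketing use
```

A user agent applying such a check could warn the user, or withhold personal data, before a transaction proceeds — which is the essential shift from after-the-fact complaint to before-the-fact negotiation.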

What these various initiatives add up to is an emergent movement to recognize a form of intellectual property (IP) rights in personal data, vested in the individual, which that individual can trade. The nature of such IP rights will need to be significantly different from existing models like copyright, patent, trademark, and designs. It will need to establish a default of individual control, but then envisage means whereby licences to use data may be granted by the individual, subject to qualifications of the individual's choosing. It will also have to permit measured and explicit compromise of those IP rights by legal authority.
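To make the shape of such a regime concrete, the sketch below models a personal-data licence as a default-deny record. Everything here is hypothetical — the class, fields, and purposes are invented for illustration — but it shows the key inversion: no use is permitted unless the individual has expressly licensed it, and the individual retains the power to revoke.

```python
from dataclasses import dataclass
from datetime import date

@dataclass
class DataLicence:
    """Hypothetical licence an individual grants over an item of personal data."""
    licensee: str                      # organization receiving the data
    purposes: frozenset = frozenset()  # only these uses are licensed
    expires: date = date.max           # licence lapses after this date
    revoked: bool = False              # the individual may withdraw consent

    def permits(self, purpose: str, on: date) -> bool:
        # Default-deny: a use is permitted only if expressly licensed,
        # not revoked, and within the licence period.
        return (not self.revoked
                and purpose in self.purposes
                and on <= self.expires)

# A licence granted for billing only, expiring at the end of 2000.
lic = DataLicence("ExampleCo", frozenset({"billing"}), date(2000, 12, 31))
print(lic.permits("billing", date(1999, 6, 1)))    # True
print(lic.permits("marketing", date(1999, 6, 1)))  # False: never licensed
```

The "measured and explicit compromise by legal authority" mentioned above would appear in such a model as a narrowly defined statutory override, visible on the face of the record rather than buried in practice.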

The idea of tradeable IP rights offends the purist notion of an inviolate individual, because it implicitly acknowledges the dominance of the economic model of humankind over the social perspective. On the other hand, it may well establish a more effective basis for the protection of individual rights than has previously been available. Moreover, although targeted at economic relationships, such property rights may quickly migrate toward governmental contexts as well. Public confidence in governments is under serious challenge because of their increasing capability and capacity to submit their populations to data surveillance. Property rights would reverse the onus, forcing governments to be explicit about precisely what compromises to the IP were required by law; which would in turn bring into public focus the justification offered for each legal incursion.

Much of the developed world has progressively legislated broad Fair Information Practices (FIPs). Two decades ago, these were codified in the OECD's 1980 Guidelines (see sidebar and [10]). The European Union recognizes the need for a tightening of the provisions, and a Directive came into effect in October 1998 [8]. In most economically advanced countries, particularly within Europe, OECD-compatible laws apply to the public sector, and in many they apply to the private sector as well [9].

Regarding the virile U.S. private sector, however, successive governments have heeded the demands of business, and resisted calls from the public for effective privacy protection laws. That stance has been based on the presumptions that economic efficiency is the greater good; that the market is capable of solving all problems; that administrative efficiency is a protector of privacy; and that what economists call "moral suasion" is all that is needed to encourage corporations to act responsibly.

In the absence of a unifying framework, issues that catch public attention lead to sporadic, knee-jerk legislation. The U.S. has a vast number of statutes that impose privacy regulation, both at the federal level, and in most states [12], and inconsistencies abound. For example, video-rental records are subject to significant protections, whereas a vast array of more significantly sensitive personal data is not.


The Necessary Regulatory Framework

Market forces are subject to many imperfections. It is in the economic self-interest of individuals and corporations to exploit those imperfections, and to generate new ones. Not surprisingly, self-regulation has continually demonstrated itself to be inadequate, and of value only when instituted within a broader regulatory context. Although the Internet creates the prospect of coordinated consumer and citizen action, it would be premature to anticipate that the present imbalance of power between organizations and individuals will be overturned soon. Hence, it is unrealistic to expect privacy to be adequately protected in the absence of intervention into government agency and marketplace behaviors.

Corporate marketing activities on the Internet have brought this need sharply into focus. During 1995–98, the U.S. Federal Trade Commission examined the behavior of corporate Web sites. In June 1998, the FTC concluded "the Commission has not seen an effective self-regulatory system emerge." It recommended "that Congress develop legislation to require commercial Web sites that collect personal identifying information from children 12 years of age and under to provide actual notice to the parent and obtain parental consent." The FTC had no difficulty finding a Senator to sponsor the Bill, and it made clear this was very likely to be merely the first installment.2 Meanwhile, the Clinton Administration has been adapting its stance.3 Advocating statutory privacy protections in the U.S. has long been politically risky; during 1998, it rapidly mutated into a politically desirable stance.

It might seem incongruous to advocate legislative intervention at a time, and in a context, in which the capacity of nation-states to enforce laws is decreasing. But the power of nation-states is decreasing, not vanishing. In the information society and economy, law, like location, will still matter. In what may well be turbulent times in the early 21st century, individuals and corporations alike will seek to operate in locations where they and their staff are relatively secure and relatively confident in the rule of law.

Privacy protections demand a multitier approach, involving individuals, organizations, industry associations, and governments, operating within a legislative framework. Comprehensive organizational, procedural and technical measures are necessary, back-ended by mechanisms that exercise control over non-compliant organizations. Legal stiffening is needed behind self-regulatory arrangements in order to encourage compliance (by creating economic and social incentives for "good citizenship") and to discourage non-compliance (by creating social and economic disincentives, such as higher cost-profiles, and formal sanctions). The model, outlined in the sidebar, has been usefully described as "coregulatory." An effective implementation is to be found in New Zealand.4

The features outlined are not alternatives, but are mutually interdependent. All need to be in place, and need to be in place before netizens and small-business people will be confident about their electronic transactions with large corporations and with government.


Beyond Fair Information Practices

FIP principles, which originated in the late 1960s, appeared appropriate to the expanded data processing capabilities of the time. However, they have proven unable to adapt to the ravages of technological advance, and entirely inadequate for the dramatically more powerful, network-based IT of the 2000s. The precepts on which privacy-protective infrastructure for the 21st century needs to be built must transcend the limited FIP principles.

Although the OECD is currently discussing the application of its 1980 Guidelines to the Internet, it asserts they do not require revision [11]. The OECD's motivation was always primarily economic rather than social, and it is under pressure from several quarters to ease the restrictions on access to personal data, rather than to enhance protections.

The current OECD stance is untenable. The information privacy principles on which regulatory regimes are based must be extended beyond mere FIP in the following ways:

  • Organizations must provide publicly available justification for privacy-invasive information systems, purposes and uses of data;
  • Choice must be offered among anonymous services (except in those rare circumstances where they are impractical, and for transactions that a relevant legislature votes to preclude from being conducted anonymously), pseudonymous services (with organizational, technical, and legal protections against discovery of the person behind the pseudonym), and identified services [5];
  • Multiple usage of identifiers must be precluded, despite the apparent sacrifice in economic efficiency, in order to render the collation of information about individuals from multiple sources much more difficult [3];
  • Control over identification and authentication tokens (such as chip cards and digital signature keys) must be exercised by individuals, and choice must be available as to which organization issues them; and
  • The scope of privacy protections must be broadened to include all dimensions of privacy, because personal space as a whole is threatened by visual and data surveillance and by such Internet realities as stalking, locator services, and biometric schemes.
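The pseudonymous option in the second principle can be sketched technically. The fragment below is an assumption-laden illustration, not a deployed scheme: it derives a stable pseudonym with a keyed hash, so a service can maintain a continuing relationship under the pseudonym, while linking it back to the real identifier requires a key that, per the principle, would be held under organizational, technical, and legal safeguards.

```python
import hashlib
import hmac

def pseudonym(real_id: str, key: bytes) -> str:
    """Derive a stable pseudonym from a real identifier using a keyed hash (HMAC)."""
    return hmac.new(key, real_id.encode(), hashlib.sha256).hexdigest()[:16]

# Hypothetical key, held by a trusted custodian rather than by the service itself.
key = b"escrowed-custodian-key"

p1 = pseudonym("[email protected]", key)
p2 = pseudonym("[email protected]", key)
assert p1 == p2                                         # stable across transactions
assert p1 != pseudonym("[email protected]", b"other")  # unlinkable without the key
```

The design choice worth noting is that the protection here is only as strong as the custody of the key — which is precisely why the principle demands legal protections against discovery of the person behind the pseudonym, not merely technical ones.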




Conclusions

The concept of privacy is traditionally traced to an article by Samuel Warren and Louis Brandeis (the latter a future U.S. Supreme Court justice) at the close of the 19th century. It has come into sharp focus since the mid-20th century because of information-intensive practices supported by rapidly evolving information technologies.

At the close of the 20th century, the Internet is having a profound impact on aspects of our lives as diverse as our places and patterns of work, the means whereby we interact, with whom we interact, and the cultures within which we live. It should not be surprising that the Internet's impacts and effects on freedoms, and on the concepts underlying our laws, are profound as well.

Privacy is one of several interests in information that are greatly affected by the Internet. These interests need to be reconsidered in the context of the now well-established notions of information economics, and the emergent concept of information law. A form of intellectual property rights in data about oneself needs the opportunity to mature very quickly.

Privacy has always been about trade-offs, and information law will involve the formalization of balancing processes between ownership and access, and between freedoms to know, to publish, and to express on the one hand, and freedoms to be, to hide, and to deny on the other. The information economy is dependent on trust. Trust must be earned, and intrusion-permissive and intrusion-enabling arrangements preclude trust.

Privacy is both sustainable and a necessary focal point of the information society, first as a means of resisting the commoditization of human beings, and secondly as a means of enabling e-commerce and electronic service delivery. Industry self-regulation and the development and application of privacy-enhancing technologies are necessary, but they are not sufficient. This article has outlined the necessary privacy-protective framework.

It has been argued that the principles around which this framework revolves must extend well beyond the outdated set codified in the 1980 OECD Guidelines in order to cope with the last quarter-century's dramatic enhancements to IT capabilities and capacity.

The Internet is continuing to release a great deal of pent-up energy in areas as diverse as interpersonal relationships, the processes (and even the very concepts) of community, and the processes of commerce. The threats it embodies for individuals' interest in sustaining a private space are severe. The dam wall is breaking: Americans must now join the rest of the world in accepting that legislation and a publicly funded watchdog are essential elements within a privacy-protective framework for the information society and economy.


References

1. Brin, D. The Transparent Society. Addison-Wesley, Reading, Mass., 1998.

2. Clarke, R. Information technology and dataveillance. Commun. ACM 31, 5 (May 1988); www.anu.edu.au/people/Roger.Clarke/DV/CACM88.html.

3. Clarke, R. Human identification in information systems: Management challenges and public policy issues. Info. Tech. & People 7, 4 (Dec. 1994); www.anu.edu.au/people/Roger.Clarke/DV/HumanID.html.

4. Clarke, R. Privacy and dataveillance, and organizational strategy. In Proceedings of EDPAC'96. (Perth, Australia, May 28, 1996); www.anu.edu.au/people/Roger.Clarke/DV/PStrat.html.

5. Clarke, R. Identification, anonymity and pseudonymity in consumer transactions: A vital systems design and public policy issue. In Proceedings of the Conference on Smart Cards: The Issues. (Sydney, Australia, Oct. 18, 1996); www.anu.edu.au/people/Roger.Clarke/DV/AnonPsPol.html.

6. Clarke, R. Introduction to Dataveillance and Information Privacy and Definitions of Terms. (Aug. 1997); www.anu.edu.au/people/Roger.Clarke/DV/Intro.html.

7. Clarke, R. Information Privacy on the Internet: Cyberspace Invades Personal Space. Telecomm. J. Australia 48, 2 (May/June 1998); www.anu.edu.au/people/Roger.Clarke/DV/IPrivacy.html.

8. European Commission. The directive on the protection of individuals with regard to the processing of personal data and on the free movement of such data. (Brussels, July 25, 1995); www2.echo.lu/legal/en/dataprot/directiv/directiv.html.

9. Global Internet Liberty Campaign. Privacy And Human Rights: An International Survey of Privacy Laws and Practice. (Sept. 1998); www.gilc.org/privacy/survey/

10. Organization for Economic Cooperation and Development (OECD). Guidelines on the Protection of Privacy and Transborder Flows of Personal Data. (Paris, 1980); www.oecd.org/dsti/sti/it/secur/prod/PRIV-en.HTM.

11. Organization for Economic Cooperation and Development (OECD). Implementing the OECD Privacy Guidelines in the Electronic Environment: Focus on the Internet. Committee for Information, Computer, and Communications Policy. (Paris, May 1998); www.oecd.org/dsti/sti/it/secur/news/.

12. Smith, R.E. Compilation of State and Federal Privacy Laws. Privacy Journal, Providence, RI, 1997.


Author

Roger Clarke ([email protected]) is Principal, Xamax Consultancy Pty Ltd., Canberra, and Visiting Fellow in the computer science department of Australian National University.


Footnotes

1Private communication, June 30, 1998

2www.ftc.gov/reports/privacy3/toc.htm

3www.epic.org/privacy/laws/gore_release_5_14_98.html

4Privacy Act 1993; www.knowledgebasket.co.nz/privacy/legislation/legislation.html



©1999 ACM  0002-0782/99/0200  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 1999 ACM, Inc.
