Communications of the ACM

Inside Risks: Risks of Content Filtering


The Internet and World Wide Web may be the ultimate double-edged swords. They bring diverse opportunities and risks. Just about anything anyone might want is on the Net, from the sublime to the truly evil. Some categories of information, such as what is obscene or harmful, could provoke argument forever; others, such as hate literature, direct misinformation, slander, and libel, are more easily categorized.

Proposed legal sanctions, social pressures, and technological means of preventing or limiting access to whatever is considered detrimental all appear to be inadequate, and all carry risky side effects.

Web self-rating is a popular notion, promoted at the recent Internet Content Summit as an alternative to government regulation. The ACLU believes both government intervention and self-rating are undesirable: self-rating schemes will cause controversial speech to be censored, and will be burdensome and costly. The ACLU also points out that self-rating will encourage rather than prevent government regulation, by creating the very infrastructure necessary for government-enforced controls. There is also concern that self-rating schemes will turn the Internet into a homogenized environment dominated by large commercial media operations [1]. Furthermore, what happens to sites that refuse to rate themselves, or whose self-ratings are disputed?

The reliability of third-party filtering is notoriously low. As noted in the Risks Forum, sites such as middlesex.gov and SuperBowlxxx.com were blocked simply because of their domain names. Commercial site-censoring filters have blocked NOW, EFF, Mother Jones, HotWired, Planned Parenthood, and many others [1]. The Privacy Forum was blocked by a popular commercial filter when one of its raters equated discussions of the social issues of cryptography with prohibited "criminal skills." Sites may not even know they are blocked (there is usually no notification), and procedures for appealing a blocking decision are typically unavailable or inadequate.
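To see why such domain-name blocking misfires, consider a minimal sketch of a substring blocklist (in Python; the keyword list and the third domain are illustrative assumptions, not drawn from any actual filtering product):

    # Hypothetical sketch of a naive substring blocklist, of the sort
    # implicated in the middlesex.gov and SuperBowlxxx.com blockings.
    BLOCKED_SUBSTRINGS = ["sex", "xxx"]  # illustrative keyword list

    def is_blocked(domain: str) -> bool:
        """Block a domain if any listed substring appears anywhere in it."""
        name = domain.lower()
        return any(term in name for term in BLOCKED_SUBSTRINGS)

    for domain in ("middlesex.gov", "superbowlxxx.com", "essex.ac.uk"):
        print(domain, "->", "BLOCKED" if is_blocked(domain) else "allowed")

All three domains come back BLOCKED: a filter that matches substrings anywhere in a hostname cannot distinguish "Middlesex" or "Essex" from the terms it is screening for.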

In a survey comparing a traditional search engine with a popular "family-friendly" search engine, the Electronic Privacy Information Center searched for phrases such as "American Red Cross," "San Diego Zoo," "Smithsonian," "Christianity," and "Bill of Rights." In every case, the "friendly" engine blocked at least 90% of the relevant material that would have been available without filtering, and in some cases 99% [1].

The Utah Education Network (www.uen.org) used filtering software that blocked public schools and libraries from accessing the Declaration of Independence, the U.S. Constitution, George Washington's Farewell Address, the Bible, the Book of Mormon, the Koran, all of Shakespeare's plays, standard literary works, and many completely noncontroversial Web sites [1]. Efforts to link federal funding to the mandatory use of filters in libraries, schools, and other organizations are clearly coercive and counterproductive.

With respect to children's use of the Internet, there is no adequate universal definition of "harmful to minors," nor is such a definition ever likely to be satisfactory. Attempts to mandate the removal of vaguely defined "harmful" materials from the Internet (and perhaps the next step—bookstores?) can result only in confusion and perhaps the creation of a new class of forbidden materials that will become even more sought after.

Parents need to reassert guidance roles that they often abdicate. Children are clearly at risk today, but not always in the manner that some politicians would have us believe. Responsible parenting is not merely plopping kids down in front of a computer screen and depending on inherently defective filtering technology touted as allowing them to be educated while "protecting" them.

As always in considering risks, there are no easy answers, despite the continual stampede to implement incomplete solutions that address only tiny portions of particular issues while creating all sorts of new problems. Freedom-of-speech issues are particularly thorny, and seemingly among the first to be subordinated to commercial interests and to seekers of simplistic answers. Filters and Freedom [1] is an extraordinary collection of information on these topics and should be required reading.

We must seek constructive alternatives, most likely nontechnological in nature. However, we may ultimately find few, if any, truly workable alternatives between total freedom of speech (including its dark side) and the specter of draconian censorship. With the Net still in its infancy, we haven't begun to understand the ramifications of what will certainly be some of the preeminent issues of the next century.


References

1. Sobel, D.L., Ed. Filters and Freedom: Free Speech Perspectives on Internet Content Controls. Electronic Privacy Information Center; www.epic.org/filters&freedom/.


Authors

Lauren Weinstein moderates the Privacy Forum Digest.

Peter G. Neumann moderates the ACM Risks Forum.


©1999 ACM  0002-0782/99/1100  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 1999 ACM, Inc.


 
