The World-Wide Web is a powerful business technology that has attracted the attention of organizations ranging from the largest multinational corporations to the smallest mom-and-pop local businesses. Led by promises of anytime-anyplace contact with customers and suppliers, dissemination of information worldwide through a modest investment of resources, and the ability to conduct e-commerce efficiently and reliably, many organizations have sought to deliver an improved level of service or competitive advantage by way of services delivered on the Web.
But organizations setting out to create a Web service first need a thorough understanding of the related costs and performance issues in developing such services. Especially in smaller organizations, developing an appropriate business plan often presents problems. Because Web technology is relatively new, owners and managers frequently feel they lack the perspective and experience to accurately assess these parameters [5]. At the same time, the technology is relatively easy to work with; many an organization's technical and programming staff has already developed a fledgling Web service that may look attractive. But in part because these services are so easy to set up and use, many organizations create and introduce them to the public without a thorough business analysis. As a consequence, many Web services are incomplete, fail to meet their organization's goals, and cost more money than expected for development and maintenance [3, 9, 11].
Organizations that want to understand their business choices need a good understanding of the expected costs and benefits of developing their Web services. Here, we describe a set of support tools to help assess these important factors and discuss our experience using them in a number of public-sector organizations. Each of these tools is designed for simple and straightforward use; they might even be characterized as crude. In fact, initially, many of the organizations we worked with expressed skepticism about whether the tools would support effective decision making and yield accurate estimates. However, based on follow-up interviews with these organizations and feedback from others who have used the model in other settings, we have determined that the tools do indeed support appropriate decision making about Web development.
Other findings from the interviews include the following development, performance, and management guidelines:

• The expected costs of human resources were consistently higher than the costs for technical infrastructure and other technology factors.

Applying the tools to a planned project should give smaller organizations a new and valuable perspective on the process of developing a useful Web service. A more complete description with case studies is available, along with a more comprehensive set of guidelines for supporting the Web-service decision-making process [2].

The model we describe is primarily for organizations that have not yet decided whether to deliver a Web service. It is also appropriate, regardless of size, for organizations considering expanding their Web sites with additional services. Although the project that led us to develop these tools involved public-sector organizations, the approach applies equally well to the private and not-for-profit sectors.
The model, which consists of the three tools, is designed to help organizations narrow the range of their options in developing Web services (see Figure 1). The tools are part of a framework that examines specific, discrete levels of service. The first tool, the system features and functionality worksheet, helps identify the business goals a Web service has to serve, as well as the delivery mechanisms that will be used to support the service. This tool provides a framework for making the decisions supported by the model. The second tool, the performance worksheet, identifies the important benefits and performance factors that would be affected by the services defined through the first tool. These factors are fleshed out as performance variables, measures, and targets, giving an organization a method for defining in detail the goals of a service, as well as a framework for measuring whether the service meets these goals after implementation. The third tool, the cost worksheet, helps address a comprehensive set of cost areas and calculates a rough estimate of the system's costs. Though any one of the three tools can be used alone or be customized to fit a specific organization's process of system development, the three together are complementary, providing a comprehensive perspective on the planned system.
An organization planning a Web service has to choose among a variety of potential services to offer its intended users. The primary goal of the planning process is to identify a range of choices, then select the most promising one for development. In the model, the planners are asked to define three levels of service (modest, moderate, and elaborate) in order to lay out a suite of assessment options.
A modest level of service would represent a minimum investment for an initial Web service covering a few organizational goals. It might, for example, include a set of information pages describing the services the organization offers users by way of other channels. At the moderate level, the plan might include additional features and a wider range of internal and external information sources. An elaborate level would correspond to a very ambitious project, the most the organization could hope for, possibly including a range of Web-based services with technically sophisticated design goals. All three levels should be consistent with the resources that would realistically be available for the project.
Deciding what goes into the three levels of service depends on an organization's goals and resources and is always a subjective process. At this point in an organization's decision process, there is no need to analyze the service levels; that comes later. An organization may also define more than three levels, but we feel it is important to start with at least these three. Exploring different levels of service puts an organization in a more favorable position to make appropriate choices.
The model applies the three tools in three steps, described in the following paragraphs:
Identifying system features and functionality. The first step in making a decision is to be as explicit as possible about the features and functionality of the three levels of potential Web-based service the organization is examining. The system's developers use the system features and functionality worksheet (see Table 1) to develop a simple functional specification for the three levels of system services, which helps make the estimates developed in subsequent steps more concrete. This specification is a simple tool that may be supplemented with the method the organization usually uses for specifying services or with an alternate method for understanding and specifying Web services [7].
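For teams that prefer to keep the worksheet electronically rather than on paper, the structure of Table 1 can be captured in a few lines of code. The sketch below is only a hypothetical illustration: the ServiceLevel class, the feature lists, and the delivery mechanisms are invented examples of ours, not part of the published worksheet.

# A minimal sketch of the system features and functionality worksheet.
# The feature lists are invented examples; a planning team would fill in
# its own entries for each candidate level of service.
from dataclasses import dataclass, field


@dataclass
class ServiceLevel:
    name: str                      # "modest", "moderate", or "elaborate"
    features: list = field(default_factory=list)
    delivery_mechanisms: list = field(default_factory=list)


levels = [
    ServiceLevel(
        name="modest",
        features=["static pages describing existing services",
                  "contact information and office hours"],
        delivery_mechanisms=["outsourced Web hosting"],
    ),
    ServiceLevel(
        name="moderate",
        features=["searchable catalog of agency publications",
                  "downloadable forms"],
        delivery_mechanisms=["in-house Web server"],
    ),
    ServiceLevel(
        name="elaborate",
        features=["online ordering of agency materials",
                  "database-backed status lookup"],
        delivery_mechanisms=["in-house server integrated with agency databases"],
    ),
]

for level in levels:
    print(f"{level.name}: {len(level.features)} features, "
          f"delivered via {', '.join(level.delivery_mechanisms)}")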
Assessing and measuring performance. The second step is to characterize the major benefits that would result from developing the Web service. Tangible "process," or "service," benefits typically fall into three performance categories (better, cheaper, and faster) compared to alternative delivery mechanisms. In addition to these direct process benefits, indirect benefits may include increased public visibility for the organization and improved staff morale.
The performance worksheet we developed focuses on process benefits in order to simplify the analysis and concentrate attention on the service, rather than on the mechanisms used to deliver it. If possible, an organization should characterize the benefits of a Web service in terms of "outcomes" and "results," rather than "outputs." For example, it might think in terms of how its future Web site and service will change the lives of its business partners and customers, rather than how many hits its Web pages will receive. While outcome measurements are often more difficult to quantify, especially in the complex world in which a service will operate, focusing on end results can help clarify objectives and sharpen efforts.
It is important to define the organization's expectations explicitly so that, after a service is operational, the organization can determine whether its expectations were met. Concrete statements of how customers will be served help build realism into these plans. Moreover, involving a variety of members of the organization in the analysis gives the project a better foothold in the organization. Through this involvement, ideas are subjected to further scrutiny and discussion, clarifying expectations and fine-tuning the organization's efforts.
Explicit, objective measures are preferred when it is possible to think about an effect in concrete terms. Some measures, such as customer satisfaction, are more difficult to assess; on the other hand, if an organization does not already collect opinions from customers or constituents, doing so in a Web survey might be a good starting point. Because the Web service may affect a variety of stakeholders, including customers, employees, and cooperating organizations, effects on all of them should be kept in mind.
In the planning stage, it is useful to collect baseline data about the present situation, as well as to forecast the results expected after the service is in place. In our use of the tools, we asked the various user organizations to discuss and agree on targeted measures of performance for their own service objectives. These measures could then be used to determine how well the service works after it is developed. In our experience, no one felt entirely comfortable with their forecasts, but all agreed the forecasts represented their most informed judgment at the time. For such estimates, informed, consensual judgment from a group of knowledgeable managers and technical experts is probably the best forecasting approach available.
Targeted measures of performance are developed for each of the functionality levels: modest, moderate, and elaborate. It is usually sufficient to identify only the most important performance variables; many of the important effects might be identified only after the service has been running a while. See Table 2 for a sample filled-in worksheet for identifying performance variables, measures, and targets. More information about developing performance measures is in [6].
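As a concrete rendering of how such rows might be recorded, the sketch below represents performance-worksheet entries as simple records tying each performance variable to a measure, a baseline, and a target. The variables and numbers are invented for illustration and are not drawn from Table 2.

# A hypothetical rendering of performance-worksheet rows: each row ties a
# performance variable to a measure, a current baseline, and a target.
# The specific variables and numbers are illustrative only.
from dataclasses import dataclass


@dataclass
class PerformanceTarget:
    level: str        # "modest", "moderate", or "elaborate"
    variable: str     # what the organization wants to affect
    measure: str      # how it will be measured
    baseline: float   # value before the Web service exists
    target: float     # value expected after the service is running


rows = [
    PerformanceTarget("modest", "telephone inquiries handled by staff",
                      "calls per week", baseline=400, target=300),
    PerformanceTarget("moderate", "turnaround on publication requests",
                      "days from request to delivery", baseline=10, target=3),
    PerformanceTarget("elaborate", "customer satisfaction",
                      "mean rating on an annual survey (1-5)",
                      baseline=3.2, target=4.0),
]

for r in rows:
    change = r.target - r.baseline
    print(f"[{r.level}] {r.variable}: {r.baseline} -> {r.target} "
          f"({change:+g} {r.measure})")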
Assessing the costs of developing and delivering Web services. The third and final step is to estimate what it will cost to develop and maintain a Web service. This estimation may be difficult for organizations that have not previously embarked on significant technology projects. Because the Web is easy to use, people often underestimate the cost of developing an effective Web service, sometimes by as much as a factor of four [4, 10]. At the same time, it is not uncommon to read published reports of organizations paying millions of dollars to develop their Web services.
A technology-savvy organization can develop an effective Web service with a relatively low technology investment, especially if the project is outsourced. On the other hand, an expensive effort may be needed to create and integrate all the information for a Web service, especially in larger organizations in which a number of information sources may have to be coordinated to deliver an effective service. Advanced service features may require specialized programming or access to the organization's databases, further multiplying the cost of developing and operating the service.
We have found it necessary to identify as many of the costs as possible, even those that cannot be calculated with certainty. To make estimating these costs as straightforward as possible, we developed a cost worksheet to support planning. The cost framework is intended to be comprehensive but not overly burdensome, targeting a level of detail appropriate for smaller organizations.
The costs are broken down into five categories (see Table 3), each specified in terms of start-up costs and annual maintenance costs and discussed in the following sections; a simple roll-up of these figures is sketched after the category descriptions. An organization should estimate each category for its proposed systems. The categories cover not only technical design and system development, but also the work of deciding which users a system would target and what the service would provide. Although such planning costs are easily overlooked, they should be included, since they take resources away from other potential system development activities.
Organizational readiness. Organizations' preparedness for taking advantage of Web technology can vary tremendously. Helping all levels of an organization's staff learn about the technology may be necessary to support meaningful discussion of the merits of a proposed service.
Access for employees and other users. Some employees may be designated to use the Web service, others to develop its content, and still others to provide its technical support. All of them need access to various technologies, as well as to the Web itself.
End-user support. An organization's employees and the system's external users may need training and help-desk support to make effective use of the new Web resources.
Content development and maintenance. Developing a suite of information and services to be provided on the Web requires, at a minimum, converting existing information into a form that can be delivered by Web servers. If the intended application involves two-way communication or advanced applications, the cost of developing the service may be substantial.
Hosting site infrastructure. Once the content is ready to be installed on the organization's Web site, a system with a Web server and space to store the information has to be available for access via the Internet. This storage space and support may be acquired through outsourcing or through the organization's own connection to the Internet.
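The arithmetic behind the cost worksheet is straightforward addition over these five categories. The sketch below is a minimal, hypothetical roll-up for a single service level; the dollar figures and the roll_up helper are invented for illustration, and an organization would substitute its own estimates for each of the modest, moderate, and elaborate levels.

# A minimal roll-up of the cost worksheet: five categories, each with a
# start-up and an annual maintenance estimate. Dollar figures are invented.
COST_CATEGORIES = [
    "organizational readiness",
    "access for employees and other users",
    "end-user support",
    "content development and maintenance",
    "hosting site infrastructure",
]

# estimates for one level of service: {category: (start_up, annual)}
modest_level = {
    "organizational readiness":             (4_000,  1_000),
    "access for employees and other users": (6_000,  2_000),
    "end-user support":                     (2_000,  3_000),
    "content development and maintenance":  (15_000, 8_000),
    "hosting site infrastructure":          (3_000,  2_500),
}


def roll_up(estimates):
    """Return (first-year cost, ongoing annual cost) for one service level."""
    start_up = sum(s for s, _ in estimates.values())
    annual = sum(a for _, a in estimates.values())
    return start_up + annual, annual


for category in COST_CATEGORIES:
    start_up, annual = modest_level[category]
    print(f"  {category}: start-up ${start_up:,}, annual ${annual:,}")

first_year, ongoing = roll_up(modest_level)
print(f"first-year cost: ${first_year:,}, ongoing annual cost: ${ongoing:,}")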
The cost worksheet is useful for planning the evolution of a Web service. Using it, organizations should assess rather explicitly what the start-up costs might be for the three levels of service: modest, moderate, and elaborate. It sometimes makes a great deal of sense to make substantial one-time investments targeting elaborate service objectives from the start. In other situations, first-year costs can be so daunting that a relatively modest investment may be more realistic. The latter view can make sense in light of reports that some investments in Web services may not return a profit for years [1].
It is never too early to begin the analysis, although comprehensive estimates may not be available until some initial prototyping is done. The development team should focus on the most expensive resources. The numbers should be refined as the organization gains a greater understanding of the issues. Though it is often difficult in practice to produce accurate assessments, the payoff in terms of reduced uncertainty is worthwhile [12]. (Detailed instructions for each cell in the worksheet are available from the authors [2].)
Once the cost and performance assessments are made, it's time to decide on an appropriate level of investment. It is sometimes quite difficult to draw an obvious conclusion from all the available information, and standard decision-support approaches can be used [2]. A simple approach is to quantify all benefits in dollar amounts and divide estimated costs by estimated benefits, selecting the level with the lowest ratio, that is, the greatest benefit for the least cost. This approach may be sufficient for organizations whose primary concern is cost. More often, a more complex decision framework is necessary; if the organization has a long list of performance criteria, a multiattribute utility model or a resource allocation method may be useful. (A bibliography of useful resources is available in [2].)
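To make both approaches concrete, the sketch below compares three hypothetical service levels first by a simple cost/benefit ratio and then by a generic weighted-sum multiattribute score. All costs, benefits, criteria, and weights are invented for illustration and are not drawn from the agencies in our project.

# Two illustrative ways to compare service levels, using invented numbers:
# 1) simple ratio: annualized cost divided by dollar-valued annual benefit
# 2) weighted multiattribute score over non-monetary criteria (0-10 scales)

levels = {
    #            (annualized cost, dollar-valued annual benefit)
    "modest":    (20_000,  35_000),
    "moderate":  (60_000,  90_000),
    "elaborate": (150_000, 170_000),
}

for name, (cost, benefit) in levels.items():
    print(f"{name}: cost/benefit = {cost / benefit:.2f}")   # lower is better

# Weighted multiattribute scoring: criteria weights must sum to 1.0.
weights = {"service quality": 0.4, "reach": 0.3, "staff workload relief": 0.3}
scores = {                      # each criterion rated 0-10 by the planning team
    "modest":    {"service quality": 4, "reach": 3, "staff workload relief": 2},
    "moderate":  {"service quality": 6, "reach": 6, "staff workload relief": 5},
    "elaborate": {"service quality": 9, "reach": 8, "staff workload relief": 7},
}

for name, ratings in scores.items():
    utility = sum(weights[c] * ratings[c] for c in weights)
    print(f"{name}: weighted utility = {utility:.1f}")      # higher is better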
The tools described here were developed in a project run by the Center for Technology in Government (CTG) at the University at Albany, State University of New York. Seven public organizations worked in a joint effort to create individual Web services. The organizations were neither very large nor very small; all were relatively new to Web technology and had limited experience developing such services. The services they planned ranged from straightforward Web dissemination of information to online ordering of agency materials directly through the Web. The project participants formed a network organization to create their Web services, with CTG helping them explore the relevant issues [7, 8, 10]. The decisions the agencies ultimately made were supported by earlier versions of our tools.
Each of these organizations has since gone live with its Web service, incorporating features relatively close to those it had planned in the project. When we interviewed the participants a year later to assess the effectiveness of the tools, we were mainly looking to determine whether the tools effectively supported the agencies' decision making, even for agencies with limited previous experience with Web technology.
During the project and in the follow-up interviews, the agencies' representatives expressed discomfort about making the performance and cost predictions the model calls for, citing their inexperience in developing Web services. In spite of this difficulty, they felt the decisions they made were appropriate and were happy with the results.
The tools are actually something of an artifice, intended to get the team talking, clarifying ideas, building project awareness and ownership, pointing out differences in expectations, identifying additional organizational units that should be included in the process, and identifying reasonable expectations for the new service. Although the cognitive gap between objectives and tasks was evident to the agencies' representatives, our one-year follow-up interviews convinced us that the tools had indeed achieved their intended goals.
In analyzing the agencies' estimated and actual development costs, we found that personnel and technical infrastructure costs represented the bulk of their Web-development expenses. The cost of such items as Web development tools and Web servers was usually small compared to the human effort needed to define and develop the content of the service and the base level of technology these organizations had to have. For these agencies, the expected costs of human resources were consistently higher than the costs for technical infrastructure and other technology factors (see Table 4). For a modest service, for example, the human resources costs ranged from 3.9 to 13.7 times the combined costs of technical infrastructure and other technology factors.
Even though the Web, and even the Internet, are relatively new technologies, we recommend that organizations planning to deliver Web services proceed along the same lines as their other technology projects: define the service they want; describe its expected effects; estimate the costs to develop and maintain it; and make a decision. Using our method and its tools, organizations should be able to perform an analysis sufficient for making sound investments that will achieve desired results, even as directions change in light of the Web's own rapid evolution.
1. Bass, B. and Eichler, S. Content Profit Models. People and Technologies Report, Forrester Research, Cambridge, Mass., 1996.
2. Bloniarz, P. and Larsen, K. A Cost/Performance Model for Assessing Web Service Investments. Center for Technology in Government, Albany, N.Y., 1997; see www.ctg.albany.edu/resources/rptwplst.html.
3. Clark, D. Facing losses, some Web publishers fold. Wall St. J. (Europe) (Jan. 14, 1997), A1-A8.
4. International Data Corp. The Marketeer's Internet: Motivation, Cost & Customization. Report, IDC, Framingham, Mass., 1996.
5. Kemerer, C. An empirical validation of software cost estimation models. Commun. ACM 30, 5 (May 1987), 416-429.
6. Kinghorn, J., Morgan, C., et al. Information Management Performance Issues: Developing Performance Measures and Management Controls for Migration Systems, Data Standards, and Process Improvement. Report, National Academy of Public Administration, Washington, D.C., Jan. 1996.
7. Larsen, K. and McInerney, C. Using the Cohort model in development of Web sites and Web policies. In Proceedings of the Association for Information Systems Americas Conference (Indianapolis, Ind., Aug. 15-17), Association for Information Systems, Pittsburgh, 1997, pp. 236-238.
8. Nouwens, J. and Bouwman, H. Living apart together in electronic commerce: The use of information and communication technology to create network organizations. J. Comput.-Med. Commun. 1, 3 (1995), 1-19.
9. O'Reilly & Associates. Conducting Business on the Internet. Report. O'Reilly & Associates, Cambridge, Mass., June 21, 1996.
10. Powell, W. Neither market nor hierarchy: Network forms of organization. In Markets, Hierarchies and Networks, G. Thompson and J. Frances, Eds. Sage Publications, London, England, 1991, pp. 265-276.
11. Scott, H. Out of the Web. Mag. for Mag. Mgmt. 26, 7 (May 1, 1997), 50.
12. Vigder, M. and Kark, A. Software Cost Estimation and Control. National Research Council Canada, Ottawa, Canada, 1994.
This research was conducted as part of the Internet Testbed Project at the Center for Technology in Government.
Table 1. Worksheet for system features and functionality.