Communications of the ACM

The State of User-Centered Design Practice


User-Centered Design (UCD) is a multidisciplinary design approach based on the active involvement of users to improve the understanding of user and task requirements, and the iteration of design and evaluation. It is widely considered the key to product usefulness and usability—an effective approach to overcoming the limitations of traditional system-centered design. Much has been written in the research literature about UCD. As further proof of internationally endorsed best practice, UCD processes are also defined in ISO documents, including ISO 13407 and the associated technical report, ISO TR 18529. Increasingly, UCD has become part of the cultural vernacular of the executives and managers who drive technology development in companies of all sizes.

In the past, many of the UCD methods in the literature were found ineffective or impractical for a variety of reasons [2, 8]. Nielsen [5] argued that many developers did not use usability engineering techniques because they considered them intimidating in their complexity, too time consuming, and too expensive to implement. However, the growing popularity of e-commerce has greatly bolstered the appeal of usability and UCD, as users can take their business elsewhere with just one mouse click. Poorly constructed sites reportedly drive away as many as half of their visitors [1]. UCD is frequently prescribed and adopted as the key to user-friendly Web design [4]. It is therefore critically important to assess the state of the art and to strengthen the practice of UCD.

Several surveys have been conducted recently on UCD practice. For example, Rosenbaum et al. focused on the contribution of organizational approaches and UCD methods to strategic usability [7]. They found that major obstacles to creating greater strategic impact included resource constraints, development and management doubts about the value of UCD or usability engineering, and deficiency in usability knowledge. Hudson and Bevan found that informal and less structured methods, such as informal usability testing, user analysis or profiling, low-fidelity prototyping, and heuristic evaluation, tended to be used much more widely than more formal and highly structured methods [3].

Our survey had several unique features. Its primary focus was on the organizational impact and practice of UCD, including measures of UCD effectiveness, the profile of a typical project, UCD staff and organization, and a representative UCD process, in addition to the most commonly used UCD methods. Results provided useful empirical data to facilitate UCD planning, training, and deployment.




We surveyed experienced practitioners of UCD, those who had at least three years of experience and considered UCD as their primary job. They were identified from two separate pools of professionals, the ACM CHI conference attendees and Usability Professional Association (UPA) members. The two mailing lists together had over 3,000 deliverable addresses with some overlap. A working definition of UCD was given at the beginning of the questionnaire as follows: "UCD is herein considered, in a broad sense, the practice of the following principles: the active involvement of users for a clear understanding of user and task requirements, iterative design and evaluation, and a multi-disciplinary approach. UCD methods are modular or identifiable processes involved in UCD practice. You should NOT think of UCD as merely usability testing or software engineering."

We received 103 completed questionnaires from respondents in the U.S. (60%) and Europe. All reported familiarity with UCD and had participated in an average of five projects involving UCD in the past year, with five also being the most common number of projects. In addition, 84% of them rated their level of UCD expertise highly, giving themselves a six or seven on a seven-point scale. While such a high level of self-rated expertise is desirable, it is difficult to assess accurately, or even to know for certain what it means to hold expertise in a discipline such as UCD.

Respondents were asked to describe a representative project involving UCD in which they had participated in the past year. Most commonly, 10% of the overall project budget was spent on UCD, with an equal number of projects spending more and less than 10%, as indicated by the mode and median (see Table 1). On average, over 19% of the total budget was spent on UCD. However, given the diversity of projects, reflected in the large standard deviation (25%), the average is less meaningful than the mode and the median. The percentage of UCD personnel was near 20% by various measures. The median amount allotted to UCD per project was $40,000, obtained by first calculating the UCD expenditure for each project from the project budget and estimated UCD percentage, and then identifying the median.
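As a minimal sketch of this computation, the per-project expenditure is simply the product of each project's budget and its estimated UCD percentage, with the median then taken across projects. The budgets and percentages below are hypothetical; the survey's raw responses are not published.

from statistics import median

# (total project budget in dollars, estimated fraction spent on UCD)
# Hypothetical figures for illustration only.
projects = [
    (200_000, 0.10),
    (1_000_000, 0.05),
    (400_000, 0.20),
    (150_000, 0.15),
    (800_000, 0.08),
]

# Step 1: compute the UCD expenditure for each project.
ucd_spend = [budget * fraction for budget, fraction in projects]

# Step 2: identify the median across all projects.
print(f"Median UCD expenditure: ${median(ucd_spend):,.0f}")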

However, care must be taken in interpreting the UCD budget numbers and other findings presented here: they are the perceptions of UCD experts, who might not have the hard facts. Indeed, 40% of the respondents did not answer the two questions on UCD budget, either for lack of knowledge or because of confidentiality concerns. Furthermore, some respondents might not have been in the best position to provide such financial information.

Regarding the perceived impact of UCD, 72% of the respondents reported that UCD methods had made a significant impact on product development in their organizations, by indicating five or higher on a seven-point scale. The overwhelming majority said UCD methods had improved the usefulness and usability of products developed in their organizations, 79% and 82% respectively. Clearly, there was a consensus that UCD had made a difference.

However, the degree of UCD adoption was uneven, as reflected in a relatively low mean (4.44 out of 7) and a large standard deviation (1.99). Moreover, 32% of the respondents were not sure if UCD methods had helped save product development costs. Among those with a definitive opinion, more people believed that UCD methods actually saved product development costs than those who thought the opposite (44% versus 24%). A nearly identical pattern holds for product development time.

It is somewhat surprising that many practitioners believed UCD methods did not save time, given the underlying assumption of UCD that, in the long run, it saves development time and money by reducing rework. Perhaps respondents focused only on development time and cost for a given release, without considering the bigger picture that includes service cost and redesign. The concern is that if some UCD experts were skeptical of its value for saving time and cost, development teams and non-believers would be even more so. However, this finding is less surprising in the context of other results presented later, such as the absence of common measures of effectiveness and the narrowly focused, incomplete manner in which UCD was often practiced.

Respondents were asked to describe a few quantitative and qualitative measures of UCD effectiveness in their organizations, such as growth in market share or sales volume, product usability measures, and increased user satisfaction. The responses were idiosyncratic and sparse, as shown in Table 2. The 103 respondents mentioned a total of 191 indicators of UCD effectiveness, with little consensus. Fifteen individuals reported that no effectiveness measure was in place. Results were scattered across 16 different categories. Only seven indicators were reported by more than 10% of the respondents, and aside from external (customer) satisfaction, none was mentioned by more than 20%. The rest were rarely used (indicators mentioned fewer than five times are not shown in Table 2).

In the past, the lack of widespread UCD implementation was thought to be partly due to two common misconceptions: that usability could not be measured, and that development work did not have usability goals [2]. Our results show little improvement in measuring UCD effectiveness, which could seriously hinder the wider adoption of UCD. On the other hand, it is important to note that the advent of the Web as a customer communication and transaction channel has significantly bolstered the case for measuring usability and UCD outcomes [1, 4].

Respondents were asked to identify several of the most commonly used UCD methods in their practice, and to rank the five most important ones "on the basis of their actual impact on product development (including user satisfaction, results in the market, and cost savings)." Thirteen distinct categories emerged, as shown in Table 3. The results show that informal, low-cost methods were more widely used, which is consistent with Hudson's informal survey [3]. However, our findings go further by revealing UCD practitioners' beliefs about the practical importance of various methods.

Several interesting observations can be made from Table 3. For example, five of the UCD methods were considered commonly used, as they were mentioned by at least 28% of the respondents (italicized in Table 3). They are iterative design, usability evaluation, task analysis, informal expert review, and field studies. With the exception of informal expert review, these methods had the largest impact in practice, as reflected in the mean importance score. In other words, informal expert review was widely used (likely because of its low cost), but was not considered to have high impact. It is interesting to note that field studies (including contextual inquiry) and user requirements analysis were considered most important in practice, but were not widely used. It appears that respondents were mindful of a strong cost-benefit trade-off in their evaluation of various UCD methods.
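To make the relationship between Table 3's two quantities concrete, the sketch below derives frequency of mention and a mean importance score from ranked lists. The scoring scheme (rank 1 earns 5 points, rank 5 earns 1) and the sample responses are assumptions for illustration; the article does not specify how importance was scored.

from collections import defaultdict

# Each respondent lists up to five methods, most important first
# (hypothetical responses).
responses = [
    ["field studies", "iterative design", "usability evaluation"],
    ["task analysis", "iterative design", "informal expert review"],
    ["iterative design", "field studies", "task analysis"],
]

mentions = defaultdict(int)
points = defaultdict(list)

for ranked in responses:
    for rank, method in enumerate(ranked, start=1):
        mentions[method] += 1
        points[method].append(6 - rank)  # rank 1 -> 5 points, rank 5 -> 1

# Report methods ordered by how often they were mentioned.
for method in sorted(mentions, key=mentions.get, reverse=True):
    mean_importance = sum(points[method]) / len(points[method])
    print(f"{method}: mentioned {mentions[method]}x, "
          f"mean importance {mean_importance:.2f}")

A method can thus be mentioned often yet carry a low mean importance score, which is exactly the pattern observed for informal expert review.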

Respondents were asked to describe a typical UCD process in free-text form. The responses were compared to a published representative end-to-end UCD process [9]. One surprise was that, given the widespread endorsement among practitioners of applying UCD to the total user experience (everything the user sees or touches), this concept was not referenced even once. There was further evidence that task analysis, iterative prototyping, and heuristic evaluations were used widely. However, the majority of respondents referred exclusively to UCD for the user interface narrowly defined (for example, GUIs).




There were many references to user involvement during discovery, design, or development phases, but only 13% of the projects engaged in a full UCD approach in the sense of user involvement at all three stages of the development cycle. Only 5% referenced a multidisciplinary team approach as defined by the involvement of more than three unique disciplines. This finding contrasts with the 86% of those who responded "Yes" when explicitly asked if they considered their representative project to be multidisciplinary. Probably the most accurate depiction of the nature of these teams comes from an analysis of the job titles that were listed for the team. According to the criterion of more than three unique disciplines, only 21% of the teams were multidisciplinary. This raises the question of what makes a team multidisciplinary.
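The criterion itself is easy to state. Below is a minimal sketch, assuming team composition is given as a list of job titles and that titles can be mapped to disciplines; the mapping shown is hypothetical, as the survey's actual coding scheme is not published.

# Sketch of the "more than three unique disciplines" criterion.
# The title-to-discipline mapping is an assumption for illustration.
TITLE_TO_DISCIPLINE = {
    "visual designer": "design",
    "interaction designer": "design",
    "software engineer": "engineering",
    "usability specialist": "human factors",
    "technical writer": "documentation",
    "product manager": "management",
}

def is_multidisciplinary(job_titles: list[str]) -> bool:
    """A team counts as multidisciplinary if its titles span
    more than three unique disciplines."""
    disciplines = {TITLE_TO_DISCIPLINE.get(t.lower(), "other")
                   for t in job_titles}
    return len(disciplines) > 3

team = ["Visual Designer", "Software Engineer",
        "Usability Specialist", "Product Manager"]
print(is_multidisciplinary(team))  # True: four unique disciplines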

In answers to other questions in the survey, many respondents referred to customer satisfaction as a primary measure they tracked (see Table 2). However, in describing their typical process, there were no references to setting satisfaction targets or comparing user feedback results to them. This suggests that the measurement of customer satisfaction was seen as outside of the UCD process.

A series of hierarchical regression analyses was conducted to examine organizational properties and characteristics of UCD processes as potential factors affecting UCD effectiveness. Involving a multidisciplinary team as an integral part of UCD emerged as a high-impact factor. This factor, combined with having a centralized UCD staff, accounted for 16% of the variance in UCD impact. UCD staff in many organizations was centralized (41% of our sample), whereas only 15% of the organizations had completely decentralized UCD staff. Task analysis with user input was found to be the third most important factor; it is highly correlated with the practice of UCD in all three stages: discovery, design, and development.
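For readers unfamiliar with the technique, hierarchical regression enters predictors in blocks and examines the increment in explained variance (R²) at each step. A minimal sketch using statsmodels follows; the column names and data file are hypothetical, since the survey dataset is not published.

import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical dataset: one row per respondent, with a 7-point
# UCD-impact rating and coded organizational predictors.
df = pd.read_csv("ucd_survey.csv")

# Block 1: multidisciplinary team as an integral part of UCD.
m1 = smf.ols("ucd_impact ~ multidisciplinary_team", data=df).fit()

# Block 2: add centralized UCD staff; the increase in R-squared is the
# additional variance explained by the new predictor.
m2 = smf.ols("ucd_impact ~ multidisciplinary_team + centralized_staff",
             data=df).fit()

print(f"Block 1 R^2: {m1.rsquared:.3f}")
print(f"Block 2 R^2: {m2.rsquared:.3f} "
      f"(increment: {m2.rsquared - m1.rsquared:.3f})")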


Conclusion

A note of caution is warranted when interpreting these findings: they are based on the perceptions of UCD experts rather than on hard facts. This discussion attempts to integrate the detailed findings into a global picture. First of all, UCD expenditure often exceeds 10% of the overall project budget. Previously, apart from Nielsen's finding of 6% [6], there was little evidence of UCD spending, despite the fact that such information could be an important indicator of UCD practice and useful for project planning and management.

Also important was the finding that UCD is generally considered to have improved product usefulness and usability, although the degree of UCD adoption is quite uneven across different organizations. Furthermore, our respondents strongly believed that UCD would likely achieve even wider use and greater impact in the next five years. These findings clearly indicate that UCD has already had a significant impact and is gaining increasing acceptance across the industry.

However, our survey also raised several concerns with the current practice of UCD. There was a lack of commonly used measures of UCD effectiveness. Our respondents were also somewhat ambivalent about whether UCD had produced savings in development time and costs. Furthermore, some common characteristics of an ideal UCD process were not found in practice, namely focusing on the total user experience, end-to-end user involvement in the development process, and tracking customer satisfaction.

Our survey identified a set of most commonly used UCD methods. Cost-benefit trade-offs seemed to play a major role in the adoption of UCD methods. This agrees with similar findings from other recent surveys. For example, field studies were generally ranked high on perceived practical importance but were relatively infrequently used, likely because they are costly, whereas heuristic evaluations were heavily used because they are relatively easy to perform and less costly.

Also worth noting is that a multidisciplinary approach to UCD appeared to be closely related to perceived UCD effectiveness, although practitioners were not always clear about what constituted multidisciplinary involvement. Moreover, having a centralized organization also emerged as a predictor of the organizational impact of UCD. This finding suggests the need for UCD practitioners to have a home base for their professional development, although it could be beneficial to be close to their product teams in order to be effective.

The findings of this study taken together suggest the need for a new perspective on UCD practice. First, a rigorous end-to-end methodology is not being practiced yet. To be effective, such a methodology should be scalable based on project characteristics. Second, it is critical to include measures of progress as a required component of any UCD program. A program that includes these elements has been developed [9], and several case studies have also been completed to illustrate the success of the approach, along with additional research and methodology enhancements [10, 11].

UCD appears to be making an impact across the industry, and is enthusiastically endorsed by practitioners. However, this survey also raised some concerns about the current practice. Nevertheless, in light of the growing trend of e-commerce and higher demand for product usability, it is expected that UCD will continue its growth and acceptance among corporations and the management within them.


References

1. Ellis, P. and Ellis, S. Measuring user experience. New Architect 6, 2 (2001), 29–31.

2. Gould, J.D., Boies, S.J., and Lewis, C. Making usable, useful, productivity-enhancing computer applications. Commun. ACM 34, 1 (Jan. 1991), 75–85.

3. Hudson, W. User-Centered Survey Results, email posting to [email protected], May 3, 2000.

4. Knobel, C. Leveraging usability to maximize your Web site. AICPA Infortech Update 11, 1, (Jan./Feb. 2002), 4–7.

5. Nielsen, J. Using discount usability engineering to penetrate the intimidation barrier. In R.G. Bias and D.J. Mayhew, Eds. Cost-Justifying Usability. Academic Press, 1994.

6. Nielsen, J. Usability Engineering. AP Professional, 1993.

7. Rosenbaum, S., Rohn, J.A., and Humburg, J. A toolkit for strategic usability: Results from workshops, panels, and surveys. In Proceedings of CHI'2000 (Amsterdam, 2000), 337–344.

8. Vredenburg, K. and Butler, M.B. Current practice and future directions in user-centered design. In Proceedings of the Usability Professionals' Association Fifth Annual Conference, 1996.

9. Vredenburg, K., Isensee, S., and Righi, C. User-Centered Design: An Integrated Approach. Prentice Hall, 2001.

10. Vredenburg, K., Ed. Designing the total user experience at IBM. International Journal of Human-Computer Interaction 14 (2002), 275–558.

11. Vredenburg, K. Building ease of use into the IBM user experience. IBM Systems Journal 42, 4 (2003), 517–531.


Authors

Ji-Ye Mao ([email protected]) is a professor at Renmin University of China, Beijing, P.R. China; he was associated with the University of Waterloo, Waterloo, Canada, when the research for this article was conducted.

Karel Vredenburg ([email protected]) is the program director for IBM Corporate User-Centered Design and User Engineering in Toronto, Canada.

Paul W. Smith ([email protected]) is a research associate at the IBM Centre for Advanced Studies, Toronto, Canada.

Tom Carey ([email protected]) is a professor and the associate vice-president for Learning Resources and Innovation at the University of Waterloo, Waterloo, Canada.


Tables

Table 1. Profile of a typical project involving UCD.

Table 2. Top 10 cited measures of UCD effectiveness.

Table 3. Ranking of importance and frequency of mention.



©2005 ACM  0001-0782/05/0300  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.



 
