By Jacob Buur, Kirsten Bagger
Communications of the ACM,
May 1999,
Vol. 42 No. 5, Pages 63-66
10.1145/301353.301417
Usability testing has now become a well-known method in product development for examining new products before they are put in the hands of users. In practice, usability testing is often seen as a way for R&D departments to obtain objective data on how products are used. Because of this conception of objectivity (which stems from the tradition of psychological experiments), it has been a widespread belief that usability testing must be carried out by "neutral" usability specialists, that the user must think aloud in undisturbed solitude in the lab, and that R&D experts should be kept at a distance so as not to intimidate the user and color the results [5].
We have come to see these conceptions of usability testing as limited in scope, however useful they have been for providing data on usability problems and for generating usability awareness in the corporate setting. We have learned that traditional usability testing restricts understanding of problems and impedes productive dialogue between designers and users on use, context, and technology [1].
In this article we outline four examples to show how we have turned the conventional usability-testing format into a dialogue between users and designers.
Usability Testing at Danfoss
Danfoss is Denmark's largest manufacturing company, with 18,000 employees worldwide. We produce components such as valves, sensors, and controllers for refrigeration and heating systems, hydraulic machinery, and the like. Our products are mechanical components and electronic controllers with "solid user interfaces" [2]. The user interfaces are typically composed of a small liquid crystal display and a set of dedicated push buttons. Danfoss products are operated by professionals such as electricians, refrigeration mechanics, and plumbers. In 1990, man-machine interaction (MMI) was made one of Danfoss's six core competencies, and activities were established in 1991 to bring the MMI level in line with those of comparable companies [9]. A year later, we established our first usability lab, which has since expanded to three lab facilities. We have successfully adopted methods from the HCI community (among them, usability testing) to improve the usability of our products, even though our products are not computer applications [4].
Five Steps on Our Path to User Dialogue
Based on four cases, we will explain how our understanding of usability testing has evolved. To simplify matters, we discuss five steps, even though the changes did not occur in this exact sequence.
- Moving the test facilitator into the lab. Traditional usability testing creates an artificial situation, with tension and nervousness on the part of the user. We realized we could ease the atmosphere somewhat by making the test facilitator an active, attentive dialogue partner for the user. This meant moving facilitators into the lab alongside the users, rather than hiding them behind a one-way mirror, communicating only by intercom [8].
- Developing video documentation procedures. In usability testing literature, video is suggested as a tool for observing user activity and for communicating findings to the R&D organization ("highlight tapes") [5]. This was how we employed video in early usability testing. However, we also learned the value of video for convincing management that the company was, in fact, not very good at designing easy-to-use products and that user-centered methods were paramount for success. Presenting the material to uninvolved staff introduced new requirements for video quality: we had to make sure that users were shown close-up in order to allow viewers to identify with them; audio and video quality had to be clear.
- Training R&D staff to act as co-organizers. A widespread myth in usability testing holds that designers cannot be allowed anywhere near the users, for fear of intimidating them with their preoccupation with a "pet" solution. On the contrary, we have found it far more important to convey test findings effectively to the design team than to preserve a notion of objectivity in the test. Our solution to this dilemma has been to train R&D staff to act as test facilitators and observers. It is difficult for designers to learn to "shut up and listen," but this is crucial if a company wants to move toward customer orientation.
We do this by running a training session a few days in advance, often with users from inside the company. At the end of the session, we ask the users to give their opinion about the process: when they became frustrated, and why. Afterward, we use the video recording to recall communication problems between facilitators and users, and we discuss how to improve.
- Turning test sessions into workshops. In sessions with a single user, the user is not likely to volunteer much information beyond answers to the facilitator's questions. This changes when users are invited to work in pairs. The traditional think-aloud format changes into "co-discovery learning," and the facilitator can concentrate on listening and observing.
When we expand further to, say, eight users at a time, the users are sufficiently confident to allow us to bring not just one but several members of the design team into close contact with them. We invite the users to full-day workshops in which they don't simply test products but participate in design discussions with the designers [7]. Our role changes from test facilitator to organizer of a meaningful discussion between users and designers. Rather than evaluating results for the designers, we use the video recordings together with them afterward to recall significant snippets of user dialogue as a basis for discussing improvements to the design.
- Involving users in design. Another popular myth we have done away with is that "users cannot design for you." In line with Ehn [6] and others, we have found that users are a potential source of ideas, and that they readily come forth with opinions on design, provided that we can give them the media through which they can express and explore their ideas.
We increasingly allow users to formulate their own use scenarios, rather than dictating specific test scenarios invented from our limited knowledge of the users' world. To engage users in this process, we ask them to recall specific work situations they have experienced in their daily work lives: the type of equipment they worked with, what they intended, what they did, and so forth. Based on these stories, we build test scenarios in collaboration with the users and try to go through the same actions with the new prototype.
Naturally, this makes heavy demands on the flexibility of the prototypes we employ. We therefore favor the use of low-fidelity paper prototypes, rather than computer simulations of user interfaces, to "buy information from the users" in the early design phases [3].
Today, we often join project teams in the product divisions of Danfoss as user interface designers, employing user participation throughout the process. So, in fact, the facilitator who steps between users and designers has by and large disappeared. Organizing user workshops has become a well-accepted activity, like organizing customer visits or planning design team seminars.
Based upon our experiences at Danfoss, we believe that three important problems can be overcome by turning usability testing into a dialogue with users. First, dialogue facilitates the disclosure of user priorities and practices that may otherwise remain concealed. Second, anchoring the insights gained in the test setting within the R&D departments is easier if the designers themselves engage in dialogue with users. And third, engaging the users in dialogue sessions enables us to move beyond product critique to a more innovative engagement with new design possibilities.
Our next step will be to move the design dialogue into the users' world: the plant or the shop. By doing this, we hope to engage more of the users' tacit knowledge and to make the users more confident in their new role in the design process.
References
1. Binder, T. Designing for workplace learning. AI & Society 9, Springer, 1995, 218–243.
2. Black, A. and Buur, J. GUIs and SUIs: More of the same or something different? Info. Des. J. 8, 9, Elsevier Science, 1996.
3. Buur, J. and Andreasen, M.M. Design models in mechatronic product development. Design Studies 10, 3, 1989.
4. Buur, J. and Nielsen, P. Design for Usability: Adopting HCI methods for the design of mechanical products. Int. Conf. on Engineering Design (Prague), Heurista, 1995.
5. Dumas, J. and Redish, J. A Practical Guide to Usability Testing. Ablex Publishing Corporation, 1993.
6. Ehn, P. Work-oriented Design of Computer Artifacts. Arbetslivcentrum, 1988.
7. Kyng, M. and Greenbaum, J. Design at Work: Cooperative Design of Computer Systems. Lawrence Erlbaum Associates, 1991.
8. Rubin, J. Handbook of Usability Testing. John Wiley & Sons, 1994.
9. Wiklund, M. Usability in Practice. AP Professional, 1994.
Authors
Jacob Buur ([email protected]) is manager of the UCD Group of Danfoss in Denmark.
Kirsten Bagger ([email protected]) is a human interaction designer with the UCD Group of Danfoss in Denmark.
Sidebar: Case Study: Usability Testing by the Book
Refrigeration management software (1993)
Two supermarket managers were each invited to evaluate a prototype of a PC-based software system. The setup was as recommended in U.S. literature [5]: the test scenario was given by us; the users were asked to think aloud; the lab was made to resemble an office; and the test was recorded using a remote-controlled camera and scan-converter. The only deviation from the conventional format was that the test facilitator was inside the lab.
The highlights video from this test was used, along with other videos, at an R&D directors' meeting to gain support for the corporate usability effort.
Sidebar: Can Everyone Become a Test Facilitator?
Radiator thermostat (1995)
Four plumbers each installed a new radiator thermostat prototype. The setup was a simulated wall in a meeting room. The test facilitator was a marketing employee from the design group. A design colleague logged the event. After each test, the designer entered the lab for a discussion of design improvements.
To train the two design team members for the event, we arranged a pilot test with an internal technician participating as user. During the session we logged incidents of inappropriate communication between facilitator and user. Afterwards, we used the video recording as a starting point for discussing improvements to facilitator behavior: How do you make the user comfortable? How do you ask the user to continuously think aloud?
At one point, the test facilitator unknowingly embarrassed one participant because of his own frustration with a design flaw in the product. This incident initiated a fundamental discussion in our usability group: Can we expect every designer, with proper training, to act as a test facilitator? Or does it require special human qualities? What if the design team doesn't include naturally talented facilitators?
Sidebar: Case Study: Observer or Participant?
Frequency inverter (1996)
Four users worked in pairs to evaluate the new user interface for a frequency inverter. The first two were service technicians from machine-making companies; the other two were daily users employed at heating plants. The users established their own task scenario during the test, based on their personal experiences.
In this project, our usability group was responsible for the interface design. One designer operated a paper prototype and acted as the facilitator. A second designer logged the event and had ongoing design discussions with members of the product design team in the adjoining observation room. Toward the end of the session, the second designer would enter the lab to join a design discussion with the users, employing Post-it notes to record ideas.
Since the facilitators were usability staff, no training was arranged. In this session, the entrance of the second designer created some uncertainty on the part of the users. They were suddenly reminded that someone had been watching them on video all the time. And to talk freely with the newcomer seemed a bit awkward: The users didn't know him, though he believed he knew them quite well from watching the video.
Sidebar: To: Jacob Buur and Kirsten Bagger (Danfoss)
From: Mary Czerwinski and Michael Muller (Microsoft)
We very much liked the idea of moving the traditional style of usability engineering away from the sterile laboratory environment and toward treating users as design partners. The idea of framing the usability and design research in the actual work the target end users perform was especially appealing. It appears that the model of treating users as participants in the design process has been very successful for Danfoss. We have also seen the value in this model, and appreciate the richness that this process affords. We would like to suggest that Danfoss consider rounding out its participatory design methods with more formal laboratory methods when warranted, taking advantage of the real-world scenarios their fieldwork has provided. We believe the two methods (and any in between) can be very effective in combination for capturing a broader range of usability issues and design ideas to the greatest benefit of the end user.
Sidebar: Case Study: Preparing Users for a Design Dialogue
Magnetic flowmeter (1996)
Six technicians from different industries (a dairy and a water supply plant, to name two) were invited to an all-day workshop on flowmeter usability. Also participating were six members of the Danfoss development team: three from the usability group and three from the product division. In the beginning, the users worked in pairs in an application lab, mounting and adjusting the existing flowmeter. Here, the designers operated the video cameras and acted as facilitators. Later, the full user group discussed design ideas, based on competitors' products and the design team's mock-ups and sketches.
This workshop finally put an end to usability testing in the traditional single-user format at Danfoss. Rather than thinking of the session as testing, we began to see the hands-on activities as a way of preparing users for an informed design dialogue with the development team.
©1999 ACM 0002-0782/99/0500 $5.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.