Ms. Perez was giving a PowerPoint presentation to potential clients in the hope of landing a big contract. She was presenting a new advertising campaign for a mutual fund company and had spent three months with her team perfecting the proposal. Everything seemed to be going well when suddenly a small window popped up informing her that an error had occurred and asking whether she wished to send an error report. She clicked the send button and the application on her laptop shut down, disrupting the flow of her presentation and making her look unprofessional.
This story illustrates a user's experience with and response to a new method for collecting information on software application errors. To maintain quality and ensure customer satisfaction, software firms spend approximately 50% to 75% of total software development cost on debugging, testing, and verification activities. Despite such efforts, it is not uncommon for a software application to contain errors after the final version is released. To better manage the software development process in the long run, firms are involving users in software improvement initiatives by soliciting error information while the software is in use. The information collected through an error reporting system (ERS) plays an important role in uncovering bugs and prioritizing future development work. Considering that about 20% of bugs cause 80% of the errors,10 gathering information on application errors can substantially improve software firms' productivity and the quality of their products. High-quality software applications benefit individual users and also help improve the image of the software community as a whole. Thus, understanding emerging error reporting systems and why users adopt them is an important issue that requires examination. Such an analysis can help software companies design better ERS and educate users about ERS and its utility.
Traditional error reporting methods follow a passive approach, wherein users are required to contact the software company directly or to transmit error information via a third-party organization (such as CERT/CC or SecurityFocus). CERT/CC acts as a mediator that collects such information and delivers it to the software companies. In general, software firms are given a limited time to fix errors and provide patches to their users before CERT/CC releases the error information to the public domain. Such a situation occasionally pressures software companies to release their patches prematurely.
A more proactive approach involves the use of ERS, a pioneering system that enables users to send error-related information while they are using a software application. ERS, also known as Online Crash Analysis, is a relatively new functionality integrated into many system platforms (such as Windows XP and Vista, Mozilla, and Mac).11 Different platforms use different techniques, which vary in the degree of automation of the ERS. For instance, the ERS used by the GNOME platform asks its users to fully describe the incident. Other systems have a higher degree of automation, where the ERS automatically collects the information and prompts the user to send the error report.11
Although ERS could tempt companies to underinvest in quality control processes, anecdotal evidence suggests that information collected through the ERS has been put to good use. The Windows XP SP1 team, for instance, was able to fix approximately 29% of all Windows XP errors based on information collected through the ERS.12 Further, ERS has been instrumental in fixing problems found in the beta versions of Windows Vista and MS Office 2007.12 The release of Windows Vista and Office 2007 was delayed because Microsoft took the input from beta users seriously.7,12 The growing popularity of ERS is also visible in new computer products such as Dell desktop and laptop computers: Dell Inc. uses a system similar to ERS to collect information about software incompatibilities and hardware problems.
ERS properties in Windows XP and Vista can be configured by accessing System Properties, clicking the Advanced tab (XP) or Advanced system settings (Vista), and then selecting the Error Reporting tab (XP) or the Startup and Recovery settings (Vista). When an error occurs, dwwin.exe (an operating system utility) creates a crash report and then triggers a pop-up window prompting the user to send the error report. The Windows error reporting system is part of the Windows Quality Online Services (Winqual) program, which focuses on hardware and software bearing the "Designed for Windows" and ".NET Connected" logos. It is important to recognize that ERS is not the cause of an application error; it is an outcome of the error and a mechanism that allows the user to notify Microsoft about application faults, kernel faults, or unresponsive applications. Participation in error reporting does not immediately solve the error. The information sent through the ERS varies, depending on the functions embedded in the ERS. However, Microsoft claims that ERS enables it to solve problems more efficiently and effectively in the long run.7,10 While using ERS can greatly benefit the software community as a whole, little has been done to investigate user perceptions of and behavior toward the ERS.
Prior studies on continuous software development have examined the practice of disclosing error information by third-party organizations (for example, CERT/CC, SecurityFocus) and its impact on the speed of patch release by software vendors.5 Some recent studies have examined differences in the speed of patch release across open source vs. closed source vendors.1 A critical issue related to the speed of patch release is the quick accumulation of information about the error or deficiency that can lead to the development of the patch. While users are the main source of this information, it is unclear why they report errors through the ERS. Unlike other avenues for collecting error information, ERS is a proactive system whose appearance is triggered when the software in use experiences a bug or malfunction. Understanding user perceptions of ERS can help designers develop more usable ERS and better promote the ERS in the user community, consequently increasing the likelihood of user participation.
In this study, we categorized factors that influence ERS usage intention into a system-design group and a human-computer-interaction (HCI) group. Organizing the factors into these two categories allows us to examine the differences across the two groups and later provide meaningful guidelines on the effective promotion and design of the ERS (see Figure 1). We elaborate on the selected factors next.
System-design factors capture design decisions that can be controlled by ERS designers. ERS initiates interaction with the user by displaying a pop-up window indicating that a problem has occurred in the software application. Users then have the freedom to choose whether or not to send the error report. While interacting with the ERS, users may expect to be informed about the type of data that will be submitted (data transparency) and how the data will be processed, accessed, and transmitted (process transparency), and may expect feedback on the value of their inputs (feedback transparency). ERS designers control whether to provide such information to the users, and the proactive nature of ERS increases the importance of these transparencies.8,9 An ERS configured to be transparent along these factors is likely to increase the user's sense of control,9 whereas limited or no transparency can undermine users' trust in and response to the ERS. In short, adequate transparency in data, process, and feedback can help promote users' ERS usage intentions.
While design elements are important in explaining how users respond to the ERS, the HCI literature indicates that users' cognitive processes also play a critical role. ERS resembles a solicitation system: the user's decision to respond can be influenced by their reaction to a request for free information, how the request is made, why the request is made, and their assessment of the benefits of responding. We categorized issues in the HCI group into a) user motivation,4 b) user task,2,4 and c) synchronization. User motivation is driven by the likely benefit of using the ERS. The interaction between a user and the ERS is short and apparently does not provide any instant benefit; in the long run, however, the user can expect software applications with fewer bugs, and such an expectation may serve as a more realistic view of the benefits of using ERS. In terms of user tasks, ERS users may question their role and the appropriateness of the approach for requesting information. Value fit ties to an individual's belief that soliciting information on errors is appropriate and that they should comply with the request. Similarly, if users are clear about their role as contributors to the overall application development process, their likelihood of responding to the ERS can be enhanced. Thus, role clarity and value fit are both important considerations. The HCI literature also highlights synchronization between the user's task and the technology as an important driver. Timing and compatibility issues have been examined in studies of advertising pop-up windows.2,3 Like an advertising pop-up window, the ERS window can interrupt the user's ongoing activities and is likely to receive a negative response from users.
The review here brings forward eight factors, categorized into either the system-design group or the HCI group (see Figure 1). We examine the influence of these factors on users' intention to use the ERS. We collected survey responses on ERS perceptions from 317 respondents. Nine samples were later removed from the data set because they reported no previous exposure to the ERS, yielding a final sample of 308. Given the sample, caution should be taken in applying the results to computer users outside the US. The sample includes 180 males (57%) and 137 females (43%) with an average age of 25 years. The respondents use the computer for 29 hours per week on average and have an average of 11 years of computer experience. A majority of respondents (86%) have high-speed Internet access. Sixty-two percent of the respondents report that MS Internet Explorer generated the error reports, followed by Microsoft Office applications (10% of the respondents). The rest of the sample (28%) report experiencing the error report window while using other applications (such as media players, Netscape, and Windows).
Descriptive statistics reveal that users in general rated system-design factors lower than factors in the HCI group (Table 1). Users seem to be clear about their role in interacting with the ERS but find the ERS process ambiguous. They also reported that ERS does not provide sufficient feedback. Further, users expect to get an improved software application in the future and perceive the ERS solicitation as appropriate, but they also state that the ERS disrupts their work. Table 1 summarizes the impact (total effect)a of system-design and HCI factors on users' ERS usage intentions.b Of the proposed eight factors, seven show a significant total effect on users' ERS usage intentions.
The analysis of total effect indicates that feedback transparency is the only factor that does not influence ERS usage intention. According to the respondents, ERS does not provide sufficient feedback (low mean value), and consequently they may not consider getting feedback an important factor in deciding to use the ERS. This result may be attributed to the proactive nature of the ERS: users generally do not initiate the communication process and may not consider feedback a realistic expectation. Total effect also provides information about the relative importance of factors in driving users' ERS usage intentions. The results show that users consider process transparency more important than data transparency. Existing ERS enables users to view the data, but in its current form this information is difficult to understand; data needs to be presented in a fashion that is comprehensible to users. Further, users' lack of control over what information is transmitted is another contingency that influences the importance of data transparency.
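The total-effect measure used above (see footnote a) can be sketched in a few lines of code. This is a minimal illustration of how a total effect is assembled in a path model, a sum of the direct path and the products of coefficients along each mediated path; the variable names and coefficient values are hypothetical, not the study's estimates.

```python
# Sketch: computing a "total effect" in a path model.
# total effect = direct effect + sum of indirect effects,
# where each indirect effect is the product of the path
# coefficients along one mediated route.

def path_product(path):
    """Multiply the coefficients along a single mediated path."""
    result = 1.0
    for coefficient in path:
        result *= coefficient
    return result

def total_effect(direct, indirect_paths):
    """direct: the direct path coefficient X -> Y.
    indirect_paths: one list of coefficients per mediated route,
    e.g. [X -> M, M -> Y]."""
    return direct + sum(path_product(p) for p in indirect_paths)

# Hypothetical example: process transparency -> usage intention,
# partly mediated by a perceived-control construct.
direct = 0.25                      # direct path coefficient (made up)
indirect_paths = [[0.5, 0.5]]      # transparency -> control -> intention
print(total_effect(direct, indirect_paths))  # 0.5
```

A factor with a weak direct path can still carry a large total effect through its mediators, which is why the study ranks factors by total rather than direct effect.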
Assessment of factors in the HCI category shows that users' perception of the appropriateness of collecting information through the ERS (value compatibility) is the most important consideration, followed by users' perceived clarity of their role in interacting with the ERS. Users' perception of work interruption ranks third in importance and is the only factor that negatively impacts ERS usage intention. Thus, users perceive ERS as a disruptive system that interrupts them, and such a characteristic has a detrimental effect on their intention to respond to the ERS. Other factors, such as work compatibility and the expectation of getting a better product in the future, were found less important. Overall, factors in the HCI group play a relatively more important role in influencing users' intention than factors in the system-design group. So, tweaking the design elements of the ERS can help, but promoting ERS as an appropriate system and educating users on their role in interacting with it constitutes a more promising approach.
Table 2 shows a cross-tab analysis of users' frequency of sending an error report and the frequency of their exposure to the ERS. A majority of the respondents (53.6%) stated that they encounter the error report window occasionally, 21.1% receive the window very frequently, and 7.6% receive it all the time. Respondents were almost uniformly divided in terms of the frequency with which they send an error report. Assessment of users' exposure and response rates indicates that users respond to the ERS more often if the ERS window appears more frequently. For instance, 19% of the users who rarely receive the error report said that they send the error report either very frequently or all the time. This percentage increases to 40% for users who receive the ERS occasionally, and continues to increase to 64% and 80% for users who receive the ERS very frequently and all the time, respectively. Overall, the analysis shows that users who receive the error report more frequently are more likely to respond to the request. It can be argued that, over time, users develop automatic response patterns, or reflexive behavior, in their interaction with the ERS: once they make a decision, they adhere to that course of action. In clicking the send button, these users may not reflectively assess the issues in the HCI and system-design domains but rather interact with the ERS out of habit. Further examination of reflexive behavior in the context of ERS offers an interesting avenue for future research.
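The cross-tab analysis above can be sketched as follows. The code builds a contingency table from (exposure, send-frequency) pairs and computes, per exposure group, the share of users who send reports either "very frequently" or "all the time"; the sample data is hypothetical and deliberately tiny, not the study's 308-respondent data set.

```python
# Sketch of the Table 2 style cross-tab: exposure frequency vs.
# reporting frequency. All counts below are made up for illustration.
from collections import defaultdict

RESPONSE_LEVELS = ["rarely", "occasionally", "very frequently", "all the time"]

def cross_tab(responses):
    """responses: iterable of (exposure, send_frequency) pairs.
    Returns a dict keyed by (exposure, send_frequency) -> count."""
    table = defaultdict(int)
    for exposure, send in responses:
        table[(exposure, send)] += 1
    return table

def pct_frequent_senders(table, exposure):
    """Share (in %) of one exposure group that sends reports
    'very frequently' or 'all the time'."""
    total = sum(table[(exposure, r)] for r in RESPONSE_LEVELS)
    frequent = (table[(exposure, "very frequently")]
                + table[(exposure, "all the time")])
    return 100.0 * frequent / total if total else 0.0

# Hypothetical pattern: heavier exposure -> higher response rate.
data = ([("rarely", "rarely")] * 8 + [("rarely", "very frequently")] * 2
        + [("occasionally", "rarely")] * 6 + [("occasionally", "all the time")] * 4)
table = cross_tab(data)
print(pct_frequent_senders(table, "rarely"))        # 20.0
print(pct_frequent_senders(table, "occasionally"))  # 40.0
```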
Qualitative responses to an open-ended question included in the survey show that a majority of the respondents are more likely to use ERS if their applications continue to experience problems. This result is consistent with our previous discussion of the relationship between frequency of ERS exposure and users' response rate. Assurance from the software manufacturer that the problems will be resolved was the second most reported issue. So, users' expectation of feedback in general (feedback transparency) may not be important, but they are interested in knowing how their inputs help improve the software application (a specific type of feedback). The third most cited issue was time constraint. While participating in the ERS may take only a few seconds, users may factor the overall process of responding to the ERS, shutting down the application, and restarting it into their perception of the time commitment involved in reporting an error.
The study offers interesting insights that can guide the promotion and reconfiguration of the ERS. First and foremost, we believe that ERS needs to be better promoted as a system that helps improve the quality of software applications. If users perceive it as a system that benefits the overall user community, they will be more inclined to use it. Emphasizing the user's role in software improvement through a message on the ERS pop-up window can also improve response rates. Further, providing concise information about the data collected by the ERS and how it is processed can alleviate user concerns and improve participation.
Considering that users highlighted work interruption and time commitment as important issues, we believe that decoupling the error report from the overall process of application shutdown and restart is critical. Giving users the choice, at installation time, to send error reports automatically, or to aggregate errors and report them all at once at their convenience, are interesting options that designers can explore. Recent practice with similar technologies also suggests that ERS designers should consider relocating the ERS window on the user's screen. Dell Inc., for example, employs a similar technology whose window appears in the bottom right corner of the user's screen. Further, ERS designers should consider ways to minimize redundant error reports. Recurrence of errors can undermine users' trust in the utility of the ERS. ERS should be configured to recognize errors and limit the generation of an error report if the user has already responded to the same error in the same application. We summarize our recommendations and guidelines for ERS users, ERS designers, and software firms in Table 3.
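The de-duplication idea suggested above can be sketched as a simple filter that suppresses the report prompt when the user has already reported the same error in the same application. This is a minimal sketch under our own assumptions, not how any shipping ERS works: the signature fields (application, module, error code) are illustrative, and a real system would likely use a richer signature (version, fault offset, and so on).

```python
# Sketch: suppress duplicate error-report prompts. The signature
# fields chosen here are hypothetical; a production ERS would use
# a richer crash signature.

class ErrorReportFilter:
    def __init__(self):
        self.reported = set()  # signatures the user has already sent

    def signature(self, app, module, error_code):
        # Illustrative signature; real systems add version, offset, etc.
        return (app, module, error_code)

    def should_prompt(self, app, module, error_code):
        """Prompt the user only for errors not yet reported."""
        return self.signature(app, module, error_code) not in self.reported

    def mark_reported(self, app, module, error_code):
        """Record that the user sent a report for this error."""
        self.reported.add(self.signature(app, module, error_code))

f = ErrorReportFilter()
print(f.should_prompt("browser.exe", "render.dll", 0xC0000005))  # True
f.mark_reported("browser.exe", "render.dll", 0xC0000005)
print(f.should_prompt("browser.exe", "render.dll", 0xC0000005))  # False
print(f.should_prompt("office.exe", "render.dll", 0xC0000005))   # True
```

Keeping the filter keyed per application means a recurring crash in one program stops prompting, while the same fault code in a different program still surfaces, which matches the recommendation that redundancy be limited to the same error on the same application.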
Our study offers a preliminary examination of ERS from a user's perspective, and there are many other aspects of ERS that call for further research. For instance, prior ERS usage experience, users' disposition toward the software firm, whether and how the system-design factors affect the HCI factors, and the significance of the software application to users' job performance are promising topics. Formally examining the influence of the qualitative factors is another interesting issue. Examining the variation in users' error reporting behavior, and its drivers, across open source vs. closed source software applications also requires attention. Error reporting behavior may also be affected by users' cultural backgrounds; for example, users from different countries may have different perceptions of the ERS and different reporting behaviors. Finally, it will be interesting to assess the conditions under which cognitive processes, as opposed to reflexive responses, drive users' interaction with the ERS. We hope that the avenues for future research outlined in the study provide an impetus for further research on the ERS.
It is important to note that the recommendations from this study are based on data collected from users of current ERS technologies. Future studies can pursue alternative perspectives. For example, understanding the impact of ERS on quality control processes within software firms is an interesting area: ERS is a beneficial technology that can help improve software quality, but its implementation could also encourage companies to compromise on the comprehensiveness of software testing. Further, as with any other information system, understanding development issues and the designer's view is important. Thus, future work on the costs and benefits associated with ERS configuration and promotion is encouraged. Research on these topics, along with the results outlined in this study, can provide a more complete picture and more comprehensive guidelines for prioritizing promotion and reconfiguration initiatives related to the ERS. Like other information systems, the success of ERS will be defined by its user acceptance and by how software firms use, or abuse, it to their short-term and long-term advantage.
1. Arora, A., Krishnan, R., Telang, R., and Yang, Y. An empirical analysis of software vendors' patching behavior: Impact of vulnerability disclosure. In Proceedings of the 27th International Conference on Information Systems (Milwaukee, WI, 2006), 307-322.
2. Cho, C.H. and Cheon, J.J. Why do people avoid advertising on the Internet? J. of Advertising 33, 4, (2004), 89-97.
3. Edwards, S.M., Lee, J.H., and Li, H. Forced exposure and psychological reactance: Antecedents and consequences of the perceived intrusiveness of pop-up ads. J. of Advertising 31, 3, (Fall 2002), 83-95.
4. Karat, J., Karat, C.M., and Ukelson, J. Affordances, motivations, and the design of user interfaces. Comm. ACM 43, 8, (2000), 49-51.
5. Kannan, K. and Telang, R. Market for software vulnerabilities? Think again. Management Science 51, 5, (2005), 726-740.
6. Lohr, S. Microsoft software updates go on sale to business. The New York Times, (Dec. 1, 2006).
7. Microsoft Corp.; http://msdn.microsoft.com/library/default.asp?url=/library/en-us/debug/base/windows_error_reporting.asp, (2006).
8. Nambisan, S. Designing virtual customer environments for new product development: Toward a theory. Academy of Management Review 27, 3, (2002), 392-413.
9. Porra, J. Colonial systems. Information Systems Research 10, 1, (1999), 38-69.
10. Sayer, P. Microsoft still bugged by software problems: Office XP's error reporting technology is helping the software giant clean its code, but more work is needed; http://www.pcworld.com/news/article/0,aid,105599,00.asp (2002).
11. Wikipedia. Crash Reporter; http://en.wikipedia.org/wiki/Crash_reporter, (2006).
12. Wikipedia. Windows Error Reporting; http://en.wikipedia.org/wiki/Windows_Error_Reporting, (2007).
a. Total effect (the sum of direct and indirect effects) is a measure of the overall influence of one variable on another. A higher absolute value of total effect indicates a stronger influence.
b. Structural equation modeling (SEM) was used to compute and test the total effects of users' perceptions on their intention to send the error report.
DOI: http://doi.acm.org/10.1145/1562164.1562194
Table 1. Relative Importance of Each Contributing Factor on User's Intention to Send an Error Report
Table 2. Cross-tab Analysis of Users' Responses to the ERS
Table 3. Guidelines to Promote User's Participation in the ERS
©2009 ACM 0001-0782/09/0900 $10.00
Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
The Digital Library is published by the Association for Computing Machinery. Copyright © 2009 ACM, Inc.