At the International Federation for Information Processing conference last year in Philadelphia, presenters generally agreed that qualitative approaches to information systems research, such as grounded theory, ethnography, and case study, are finally gaining acceptance. At the conference, Lynne Markus of Claremont Graduate University, who has advocated qualitative research methods for years, declared, "We have won the war, let us celebrate." She did not mean that quantitative research, in the form of, say, mathematical modeling, statistical analysis, and laboratory experiments, is an enemy now defeated or that it is bad research; she meant that qualitative approaches are now accepted as equal in value to quantitative approaches when used appropriately.
Whether or not an approach is appropriate depends on the research topic and the research questions being addressed. A particular strength of qualitative methods is their value in explaining what goes on in organizations.
Here, we want to celebrate and recommend action research, because this particular qualitative research method is unique in the way it links research and practice, so that research informs practice and practice informs research synergistically.
Action research combines theory and practice (and researchers and practitioners) through change and reflection in an immediate problematic situation within a mutually acceptable ethical framework. Action research is an iterative process involving researchers and practitioners acting together on a particular cycle of activities, including problem diagnosis, action intervention, and reflective learning.
We use information systems as the exemplar of how to benefit from action research methods, though software engineering and systems science, among others, could serve equally well, since their application domains also include real organizations. For developing information systems, action research has already made five key contributions:

- Multiview, a contingent framework for information systems development [2];
- The soft systems methodology [5];
- The sociotechnical approach of the Tavistock School;
- The Scandinavian approach to participation in systems design [4];
- ETHICS, with its emphasis on job satisfaction [7].
These efforts all yield observable effects on practice. For example, action research encourages researchers to experiment through intervention and to reflect on the effects of their intervention and the implications of their theories.
Conventional systems analysis approaches, such as structured analysis and data analysis, emphasize the "hard" aspects of the problem domain, that is, the certain and the precise. A hard approach is prescriptive and might be applied fairly consistently across organizations and within them. Yet Peter Checkland, a professor at Lancaster University whose work has influenced Multiview's authors, argues that systems analysts need to apply their craft to problems that are not well defined [5]. Moreover, researchers need to understand the ill-structured, fuzzy world of complex organizations. People are what make organizations so complex and so different from one another, and people are far different in nature from data and processes. People have different and conflicting objectives, perceptions, and attitudes. People change over time. And systems analysts have to address these fundamental human aspects of organizations. Failure to do so may explain some of the dissatisfaction with conventional information systems development methodologies; they do not address real organizations.
We might view this dilemma through the analogy of two problems. The first is a punctured tire. We know how to deal with a punctured tire, as there is a standard repair process and therefore a clear solution. The second concerns world poverty. The solution is not clear; any approach to addressing the problem is complex, and winning the agreement of all interested parties is quite difficult.
Businesses are nearer the world-poverty problem than they are to the punctured-tire problem. Therefore, an applicable methodology for designing systems cannot be developed in a professor's office without trying it out in many real-world situations. The professor may have read a lot about the subject; observed a lot of systems development in organizations, building up a series of case studies; and even devised a theory for systems development, but this approach is not enough. In action research, the researcher wants to try out a theory with practitioners in real situations, gain feedback from this experience, modify the theory as a result of this feedback, and try it again. Each iteration of the action research process adds to the theory (in this case, a framework for information systems development), so it is more likely to be appropriate for a variety of situations.
This cycle of action research is exactly what has happened in the development of Multiview. The Multiview framework has been applied in a number of situations (see [2] for six examples). None describes Multiview working perfectly in an organization according to prescription, but all have delivered lessons furthering Multiview development. Multiview includes tools and techniques blended into a common approach, each used on a contingency basis, that is, as appropriate for each problem situation. Even after some 15 years of refinement, Multiview's authors still view it as a framework, not a step-by-step methodology, and its use as an "exploration of information systems development," not a prescriptive approach. Indeed, a new vision statement about Multiview was published in [1] earlier this year. Each new application leads researchers and practitioners to adapt rather than adopt the framework for new situations, but each application yields more lessons.
By emphasizing collaboration between researchers and practitioners, action research would seem to represent an ideal research method for information systems. Information systems is an applied discipline, and its research is often justified in terms of its implications for practice. Action research can address complex real-life problems and the immediate concerns of practitioners. Yet, paradoxically, the academic community has almost totally ignored it. In a 1991 survey of research theories and methods used in information systems, covering 19 journals, only one of the 155 articles examined addressed action research [9].
Another survey, in 1997, covered 29 articles on action research spanning the 25 years from 1971 to 1995 [6]. The articles were found through a key-word search of 20 leading journals in business, education, engineering, health, and public service. However, the survey found only one article on action research in the four mainstream systems journals it included. It also noted a deeper complexity, namely that there are different types of action research, categorizing them into four types:
This categorization adds not only further complexity but perhaps also confusion, which may itself work against the wider adoption of action research.
Nevertheless, the situation regarding the results of action research may be less bleak than these surveys suggest. For example, a look at Communications of the ACM finds numerous articles discussing the lessons learned from particular projects, variously described as case studies, systems design, software engineering projects, and more. This stream of work might loosely be classified as being of an "action research" type, even though the term "action research" is never used in the articles. The value of such articles might be enhanced if the cycle of action research were adhered to and described explicitly in the context of the particular projects being discussed.
Another factor militating against the use of action research is that much of it is published in books rather than in articles; action researchers have large and complicated stories to tell. It is notable that the references for the examples of action research programs mentioned earlier (Multiview, the soft systems methodology, the Tavistock School, the Scandinavian approach to participation, and ETHICS) were all published in books and are all European. Action research will not be recognized unless the approach is made explicit in the research literature. We hope we are starting that process. The Web site on qualitative research [8] is an online resource that should help readers find the literature and spread the word about qualitative research approaches. We hope, too, that its content on action research will grow to be as extensive as that on, say, case study research.
Researchers should be explicit about their approach, clarifying their research aim, theory, and method at the outset and all the way through its application, as well as at the time of its publication [10]. The importance of being explicit about the research method is as true for action research as it is for any other research approach. If researchers are not explicit in following the tenets of action research when working in real-life situations, their work might be better described as consulting. Alternatively, interviewing and observing people in these situations without the insight associated with intervention is also not action research. It might be described instead as case study research. Such research frequently reports what practitioners say they do. In action research, the emphasis is more on what practitioners actually do.
Action researchers should explain their approach and its application, bearing in mind that the research will be evaluated in part by its ability to explain practice; proper documentation of the research process is therefore important. The action researcher may experiment with improving such documentation through diaries and concept maps, while giving full consideration to the audience being addressed, whether academics or practitioners. Explicit criteria should be defined before performing the research in order to judge its outcome later, as should ways to manage alterations to these criteria as part of the process of problem diagnosis, action intervention, and reflective learning. Otherwise, what is being described might be action (but not research) or research (but not action research).
Another potential problem is that researchers and practitioners working together in this way need to share a mutually acceptable ethical framework, a requirement that is part of our definition of action research. Successful action research is unlikely where there is conflict between researchers and practitioners or among the practitioners themselves. Problems may well arise, for example, if the research could lead to people being fired; such an outcome may conflict with the researchers' principles yet be acceptable to the practitioners (or vice versa).
Although there are examples of action research articles [3], there is still a lack of detailed guidelines to help novice researchers and practitioners understand and engage in action research studies in terms of design, process, presentation, and criteria for evaluation. Furthermore, there is a need for an action research monograph, similar to [11] on case study methodology, to serve as a comprehensive framework and guide for the larger community. The framework proposed in [6] consists of four dimensions:
This framework is a foundation on which the pedagogy of action research in systems development can be refined and debated, perhaps helping establish a unifying approach to the field. However, it still has to be supplemented with a comprehensive set of criteria by which action research might be conceived, designed, conducted, presented, and evaluated.
1. Avison, D., Wood-Harper, A., Vidgen, R., and Wood, J. A further exploration into information systems development: The evolution of Multiview2. Information Technology & People 11, 2 (1998), 124-139.
2. Avison, D., and Wood-Harper, A. Multiview: An Exploration in Information Systems Development. McGraw-Hill, Maidenhead, U.K., 1990.
3. Baskerville, R., and Wood-Harper, A. A critical perspective on action research as a method for information systems research. J. Inf. Tech. 11, 4 (1996), 235-246.
4. Bjerknes, G., Ehn, P., and Kyng, M., Eds. Computers and Democracy. Avebury, Aldershot, U.K., 1987.
5. Checkland, P. Systems Thinking, Systems Practice. Wiley, Chichester, U.K., 1981.
6. Lau, F. A review of action research in information systems studies. In Information Systems and Qualitative Research, A. Lee, J. Liebenau, and J. DeGross, Eds. Chapman & Hall, London, U.K., 1997, pp. 31-68.
7. Mumford, E. Job satisfaction: A method of analysis. In Designing Organizations for Satisfaction and Efficiency, K. Legge and E. Mumford, Eds. Gower Press, Teakfield, U.K., 1978, pp. 18-35.
8. Myers, M. Qualitative research in information systems. ISWorldNet Web site (1997); see www.auckland.ac.nz/msis/isworld/.
9. Orlikowski, W., and Baroudi, J. Studying information technology in organizations: Research approaches and assumptions. Information Systems Research 2, 1 (1991), 1-28.
10. Robey, D. Research commentary: Diversity in information systems research: Threat, promise, and responsibility. Information Systems Research 7, 4 (1996), 400-408.
11. Yin, R.K. Case Study Research: Design and Methods. 2nd Ed. Sage Publications, London, U.K., 1994.