Alan Turing and John von Neumann pioneered the study of complex systems. In analyzing feedback processes, they were interested in how complex interacting systems can respond to new information. Among other things, they found that some systems with many interactions among highly differentiated parts can produce surprisingly simple, predictable behavior (such as a programmable mechanical routine or process), while others generate behavior that may be impossible to predict, even though these systems feature simple laws and few actors or agents (such as an evolving living organism).
Simon [6] defines a complex system as one comprising a large number of parts that have many interactions. Thompson [8] describes a complex organization as a set of interdependent parts, which together make up a whole that is interdependent with some larger environment. From these observations, we perceive that a complex system (organization) exchanges resources with its environment and consists of interconnected components that work together. A complex system exhibits three major characteristics: a large number of interacting parts, interactive complexity, and self-organization.
A complex system encompasses a large number of interacting parts, and its structure and behavior are difficult to understand and predict. The sheer number of components makes it hard to comprehend the structure that binds them, and the nontrivial, complicated interactions among those components create uncertainty in predicting the system's emergent behavior. An interactively complex system can exhibit rapid, unpredictable change with no apparent pattern; its behavior looks complex and random. The behavior of such a chaotic system is unpredictable because it is shaped by uncertainties that lie beyond long-term forecasting, and it therefore defies the classical management approach of orderly planning and control. Just as a complex biological organism changes, when conditions are right, through an autocatalytic process, a complex system (organization) tends to self-organize through a similar autocatalytic process, leading to autonomous, organic (self-steering) organizational units based on the insights and competence of the actors, as well as on synergy, flexibility, and teamwork.
In reviewing complex systems, Waldrop [9] observes, "There's no point in imagining that the agents in the system can ever optimize their fitness . . . The space of possibilities is too vast; they have no practical way of finding the optimum." Perrow [5] points out that systems that are interactively complex (varied components that tend to interact in nonlinear ways) and tightly coupled (lacking spatial, temporal, or other patterns of buffering among the components) will fail. Figure 1 indicates that when these systems also employ dangerous technologies, the failures tend to be catastrophic, even though they occur infrequently. Accordingly, Perrow suggests that no matter how hard we might try, certain kinds of systems, namely those with many nonlinear interactions (interactive complexity) that are also tightly coupled, are bound to fail eventually.
In this article, we apply and illustrate complex adaptive system (CAS) theory as an innovative conceptualization for understanding the evolution of health care and services delivery systems. In this sense, we touch on both the core fundamentals of the human care process and some of the interacting components of the overall health system, including interactions among multiple stakeholders such as patients, physicians, vendors, third-party insurers, suppliers, and other agents (including the government and nonprofit health agencies), as well as various forms of single- and multi-provider health organizations such as hospitals, pharmacies, clinics, health maintenance organizations, preferred provider organizations, and laboratories. Our analysis focuses on how these individual parts connect and interact to achieve an outcome, followed by a discussion of key factors and challenges to be addressed in designing future-oriented health care and services delivery systems.
A CAS is a collection of individual, semiautonomous agents that act in ways that are not always predictable and whose actions seek to maximize some measure of goodness, or fitness, by evolving over time. Agents scan their environment and develop a schema for action. According to Gell-Mann [3], an agent's schema defines how that given agent interacts with other agents surrounding it, as well as how information and resources flow externally. Agents can exchange information and/or resources, in both linear and nonlinear ways.
An agent's schema can change through mutation and/or through combination with other schema, and such changes can occur with or without explicit purpose. Two main outcomes are possible: the agent becomes more robust and can handle greater variation, or it becomes more reliable and performs with greater predictability. Because agents are allowed to respond to stimuli in many different and fundamentally unpredictable ways, emergent, sometimes surprising behavior, both good and bad, may result, and it may occur at various levels within the system. For example, an evolving relationship of trust between a patient and a physician is emergent behavior at the microsystem level, while the recent flu-vaccine shortage in the U.S. is an example of emergence affecting the macrosystem level of care.
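To make these abstractions concrete, the following minimal Python sketch models agents whose schema is a small vector of numeric rules that can mutate or combine with another agent's schema; the Agent class, the fitness measure, and the update rule are illustrative assumptions for exposition, not constructs taken from the cited literature.

```python
# A minimal, hypothetical sketch of schema change in a CAS agent.
# The schema representation, fitness function, and update rule are
# illustrative assumptions, not part of any cited model.
import random

class Agent:
    def __init__(self, schema):
        # A schema is represented as a list of numeric "rules" governing
        # how the agent responds to its environment.
        self.schema = schema

    def fitness(self, environment):
        # Illustrative fitness: how closely the schema matches current
        # environmental conditions (higher is better).
        return -sum(abs(s - e) for s, e in zip(self.schema, environment))

    def mutate(self, rate=0.1):
        # Schema change through mutation: small random adjustments.
        return Agent([s + random.gauss(0, rate) for s in self.schema])

    def combine(self, other):
        # Schema change through combination with another agent's schema.
        return Agent([random.choice(pair) for pair in zip(self.schema, other.schema)])

environment = [0.2, 0.8, 0.5]
a, b = Agent([0.5, 0.5, 0.5]), Agent([0.1, 0.9, 0.3])
for step in range(100):
    candidate = a.mutate() if random.random() < 0.5 else a.combine(b)
    # A change is retained only if it improves fitness; no central
    # controller dictates the path the agent's schema takes.
    if candidate.fitness(environment) > a.fitness(environment):
        a = candidate
```

Even in this toy setting, the trajectory of the surviving schema is path dependent: it emerges from local, unplanned changes rather than from any centrally designed optimum.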
For many people, one of the most intriguing ideas fundamental to CASs is that these systems evolve to a level of organization through self-adaptation rather than central control. Self-organization is the key idea in complexity science: there is no central controller for the stock market, the Internet, or the food supply of New York City, and likewise there is no central controller for the health system as a whole. What is most important, however, is to understand how CAS theory applies to the everyday activities of health care systems, that is, to the fundamental core of health care that differentiates it from all other lines of industry.
Health care is a human-based system, and preferences often change in human-oriented systems. Focusing on the most basic aspect of the health care system, we may infer that the sheer amount of knowledge and skill required of multiple stakeholders to keep the human body in good health is, in and of itself, a source of complexity. For instance, consider the number of medical specialists (neurologists, orthopedic surgeons, internal medicine specialists, psychiatrists, and sports medicine specialists) and the knowledge and skills of other alternative health care professionals (chiropractors, physiotherapists, massage therapists, acupuncturists, naturopaths, and other integrative medical practitioners) that may be needed to return someone who has sustained multiple soft-tissue injuries in an automobile accident to their original state of health. It is the gap between the intricacies of the human body (in this case, the nervous and muscular systems) and the available knowledge and skills of multiple providers that generates uncertainty and complexity in the rehabilitation care process. This complex care process, in turn, makes it difficult, if not impossible, to plan and standardize health care intervention processes. Naturally, not all health care treatment processes, however easy or complex, fit our CAS theoretical model. A total knee replacement, for example, may be viewed as a highly standardized process that, although quite complex, does not require the flexible strategy and self-organizing ability of our CAS approach.
In contrast to the mechanical processes of a programmable machine system, the processes within health care organizations, such as hospitals, are invariably complex. Within the Henry Ford Health System in Detroit, for example, a general physician may be able to order any of thousands of medications, hundreds of clinical laboratory tests and radiological procedures, and a number of other tests and procedures. Along with changing patient condition and co-morbidity, the sequencing and timing of all these events will ultimately determine the relative utility of a selected approach to patient treatment. The variability within a pathway or guideline is further compounded by the diversity of diseases and problems: there are more than 1,000 diseases, each of which, in theory, could have a different pathway or guideline. Compared with any manufacturing process, for example, the assembly process in the automotive industry, this variability, or opportunity for variability, in health care systems is unparalleled. In the words of one of the reviewers of this article, "No carmaker produces 1,000 different models of cars or provides for each model 2,500 different types of paint, 300 different arrangements of wheels, or 1,100 different locations for the driver's seat."
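The scale of this variability can be suggested with a rough, purely arithmetic sketch; the pool sizes and pathway length below are hypothetical round numbers loosely echoing the figures quoted above, not data from the Henry Ford Health System.

```python
# Back-of-the-envelope count of ordered care pathways, using assumed
# round numbers (not actual formulary or guideline data).
import math

orderable_items = 1000 + 300   # hypothetical pool: medications plus tests/procedures
pathway_length = 5             # a deliberately short, simplified care sequence

# Number of distinct ordered sequences of 5 interventions drawn from the pool.
print(math.perm(orderable_items, pathway_length))   # roughly 3.7e15
```

Even a five-step sequence drawn from such a pool admits quadrillions of orderings, which is one way to read Waldrop's point that the space of possibilities is too vast for any agent to optimize over.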
In complex systems such as health care and services delivery systems, human beings do not act as expected or as predicted by rational expectations theory. Although an intensive care unit is a very small component of the overall health care system, it is extremely complex, involving a team that may include several surgeons and assistants, anesthesiologists, nursing and support staff, and multiple mechanical and electronic devices. Each system element interacts repeatedly with many of the other elements, and every interaction is influenced by the clinical scenario and each individual's role, training, and personality. The immense complexity of these interactions makes it impossible to accurately predict all aspects of the system's behavior, particularly in unusual or stressful situations. Even years of medical experience are insufficient to predict outcomes in complex systems. Although a cure may be fairly certain, it is often less clear whether patients are actually better or truly healed as a result of their treatment. Under conditions of high task complexity, health care providers are likely to apply heuristic strategies to simplify the decision problem. As decisions become more complex and the available information leads to potentially ambiguous and contradictory interpretations, health care teams need to reorganize their structure and adjust their strategies with confidence. For example, in an emergency biological outbreak such as anthrax or SARS, there may be a large number of possible interpretations of health care problems, coupled with a lack of clearly defined standards for choice of action. Constraining decision problems under such circumstances may be extremely difficult, and the related health care systems would be expected to adjust themselves as case scenarios develop.
Health care and services delivery systems consist of multiple providers, and in most cases, electronic and mechanical components and coordinated networks. Teams of providers, managers, and devices attend even to simple outpatient care. Critically ill inpatients are cared for in extraordinarily complex systems with a myriad of human and nonhuman elements. Accordingly, a number of key elements relating to the nature of health care and services delivery systems match the characteristics of CASs:
Health care and services delivery systems are typically large and complex, with many interacting components, including patients, doctors, nurses, medical suppliers, health insurance providers, and health care administrators. These systems are often complicated by factors such as individual differences among stakeholders, organizational beliefs and culture, differences in the occurrence of epidemic diseases, and even economics, more specifically, the availability of resources and the wealth of the affected community, region, or country shaping the particular system. Moreover, vaccine shortages, epidemic outbreaks, staff shortages, canceled operations, ever-increasing paperwork, and sudden regulatory changes are all possible events that add yet another dimension to the complexity of health care.
Even so, CAS theoretical work is of central importance to understanding the changing behavioral characteristics of such complex systems as health care and services delivery systems. It may be possible, for example, to incorporate a controlled form of preference change into an agent-based simulation model to investigate the characteristics and possible behavioral outcomes of these complex systems (see Cohen [2] and Axelrod [1]). In a CAS, preference change is influenced by experience, but it is not necessarily conscious. For this reason, a CAS is best understood not as the sum of its parts but as a large evolving entity, in which the starting conditions, as well as the evolution of the entity's elements, are important. Health care and services delivery systems are an area filled with the potential for subconscious preference changes, whether the preferences are those of a patient regarding a physician, a patient regarding an insurance provider, or a physician regarding a health institution.
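As a hedged illustration of what such a simulation might look like, the sketch below lets hypothetical patient agents drift their provider preferences in response to experienced satisfaction; the providers, satisfaction scores, and drift rate are invented for exposition and are not drawn from Cohen [2] or Axelrod [1].

```python
# A hedged sketch of controlled, experience-driven preference change in an
# agent-based model. All names and parameters here are illustrative assumptions.
import random

providers = ["physician_A", "physician_B", "clinic_C"]

class Patient:
    def __init__(self):
        # Preferences start uniform and evolve, largely subconsciously, with experience.
        self.preference = {p: 1.0 for p in providers}

    def choose(self):
        # Probabilistic choice weighted by current preference.
        total = sum(self.preference.values())
        r = random.uniform(0, total)
        for p, w in self.preference.items():
            r -= w
            if r <= 0:
                return p
        return providers[-1]

    def update(self, provider, satisfaction, drift=0.2):
        # Experience nudges the weight up or down; the drift rate controls
        # how strongly a single encounter changes the preference.
        self.preference[provider] = max(0.1, self.preference[provider] + drift * (satisfaction - 0.5))

patients = [Patient() for _ in range(100)]
for week in range(52):
    for patient in patients:
        p = patient.choose()
        patient.update(p, satisfaction=random.random())

# System-level preference patterns emerge from many local, unconscious updates.
totals = {p: round(sum(pt.preference[p] for pt in patients), 1) for p in providers}
print(totals)
```

Because each run starts from the same conditions but accumulates different experiences, the aggregate preference pattern differs from run to run, which is exactly the sense in which the system is more than the sum of its parts.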
Chaos implies unpredictability; however, chaos equations do not reveal randomness but instead yield complex patterns (see Figure 2). Chaos theory explains the behavior of a system that can be described by nonlinear equations in which the output of one calculation is taken as the input of the next. After multiple iterations the calculation takes on the characteristics of nonlinearity and becomes unpredictable in its specifics, even though the starting conditions are fully determined. The behavior within the system is paradoxical in that it defies specific long-term prediction while demonstrating consistent long-term patterns of organization. Because a chaotic system operates in an unstable combination of randomness and order, it continuously changes and evolves; as such, it appears to be an appropriate model for today's complex health care systems.
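The logistic map is a standard, minimal example of such an iterated nonlinear equation; the Python sketch below is offered only to illustrate the behavior described here and is not intended to reproduce Figure 2 or to model any health care process.

```python
# Iterating the logistic map x_{n+1} = r * x_n * (1 - x_n): a deterministic
# rule that nonetheless produces an irregular, pattern-rich trajectory.
def logistic_trajectory(x0, r=3.9, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

# Fully determined starting conditions, yet the sequence never settles into
# a simple cycle; it stays bounded in (0, 1) while defying point prediction.
print([round(x, 3) for x in logistic_trajectory(0.4)[:12]])
```

The output is bounded and exhibits recurring structure (the long-term pattern) while individual values quickly become impossible to anticipate (the unpredictability), mirroring the paradox described above.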
A complex system can be conceptualized along two continuums: one spanning three stages or degrees of chaos, that is, static, edge of chaos, and chaos; and the other spanning the three levels of analysis discussed in the previous section, that is, human, organization, and system. When a nonlinear feedback system is driven away from the peaceful state of stable equilibrium (the static stage) toward the hectic equilibrium of instability (the chaos stage), it passes through a phase of bounded instability (the edge of chaos) in which it displays highly complex behavior. The table here illustrates scenarios in which the three levels of analysis, human, organization, and system, are crossed with internal versus external causes of health services chaos, providing a better understanding of the chaos theory-based perspective on health care and services delivery systems.
Static Stage: The science of complexity finds punctuated evolution to be a characteristic of many complex systems, meaning the static stage usually lasts for long periods and is then interrupted by short periods of very rapid change. In this stage, the system experiences a period of uniformity (order). However, if the system fails to recognize or respond to changes in society, it can also expect to suffer. For instance, during the static stage, parents of small children are more likely to neglect recommended immunizations, which may lead to the next epidemic outbreak.
Based on this premise, all organizations can be considered paradoxes, pulled between the stability of certainty and the uncertainty arising from a drive for innovation, excitement, and independence. If the organization concedes to the pull of stability, it is much less adaptable and is less likely to maximize its potential. Conversely, if the organization moves toward an unstable state, it risks becoming disjointed and the prospects of disintegration increase.
The Edge of Chaos: Systems with high degrees of order and stability will generate nothing new, nor will completely chaotic systems. Many complex systems tend to evolve toward the edge of chaos, the region between chaotic behavior and stability. The edge of chaos has been described as a constantly shifting battle zone between stasis and chaos, and it is the one place where a complex system can be spontaneous, adaptive, and alive.
Systems on the edge of chaos are pushed randomly toward the static or chaotic stage by fluctuations inside the systems or in their environments. Negative feedback in these systems returns them to the edge of chaos after such random fluctuations. A clear explanation for the invisible hand that guides evolving systems to the edge of chaos is still being developed, but looking at the process backward, one can see that only systems exhibiting this tendency persist as evolving systems: systems that are too stable die as the environment evolves, while systems that are too chaotic self-destruct.
The trick to a complex system's survival is to hover between change that is too rapid and directionless and change that is too slow, where unfriendly environmental forces might overtake it. Its best chance occurs at the edge of chaos, that is, the point of maximum fitness and adaptability and, potentially, the system's worst scenario. For instance, vaccine shortages, sudden changes in health policy, downsizing, and natural disasters may push a system into a chaotic situation. But if the system can approach the edge of chaos, in other words, if it can let its autocatalytic forces generate new adaptive skills (such as technological innovations for vaccination and more effective health procedures), it may survive rather than degenerate into complete randomness and decay.
Chaos Stage: Complex systems are potentially unstable. They may evolve so rapidly and in such diverse ways that they self-destruct; in other words, self-organization may lead to self-destruction. An organism's self-organization may produce poor adaptation skills, and there is no Darwinian guarantee written into the process. Systems that change in this way enter a stage of chaos. Too far into chaos, their wild movements and gyrations appear completely out of control (or, as organizational theorists would say, they become unmanageable). Their ability to find a niche in the fitness landscape disappears in a flurry of uncontrollable, dizzying oscillations. They have gone over the edge of chaos into the chaos stage.
In the chaos stage, health care catastrophes occur, such as SARS and refugee-camp epidemics. The system exhibits rapid, unpredictable change with no apparent pattern, and chaotic behavior looks complex and random. Although the behavior of a chaotic system is unpredictable (such as the spread of SARS in Singapore, China, and Toronto, or the recent widespread avian flu experienced in Vancouver), there are a limited number of behaviors that a chaotic system can exhibit. Not long ago, public health was the science of controlling infectious diseases by identifying the "cause" (an alien organism) and taking steps to remove or contain it. Today's epidemics have fuzzier boundaries ("syndrome X"), as they are the end results of the interplay of genetic predisposition, environmental context, and lifestyle. The experience of escalating complexity on a practical and personal level can often lead to frustration and disillusionment.
It is human nature to attempt to control system complexity and uncertainty. Nonetheless, we must ask why we have been so determined, for so long, to control uncertainty. One plausible reason is emotion: people feel secure when they can control situations and predict the future, whereas the unknown creates discomfort. The same applies to organizations; uncertainty causes discomfort, insecurity, and feelings of powerlessness. For society at large, control often means security and power, and one who cannot control a situation is viewed as unworthy of respect. Social scientists have offered such explanations over the years for why people resist change. This is unfortunate, because it is on the boundaries of chaos that most creativity tends to occur; where there is no control, only self-governing parameters or values can be established.
As mentioned previously, turbulent environments are a rule in the real world, not an exception. Thus, the best way to deal with turbulence is not by attempting to control chaotic behavior, but by developing an understanding of its characteristics that allows the possibility of gradually redirecting its natural flow. Complex health care and services delivery systems cannot be entirely controlled, but can potentially be guided by understanding the following principles.
Health care systems are unpredictable, but a small initial perturbation can lead to significant changes within the overall system: These systems are too complex to model deterministically, and they may exhibit unintended or unexpected consequences following changes. Therefore, it is wise to make changes incrementally, particularly to starting conditions, to test them on a small scale when possible, and to be alert for unanticipated outcomes during periods of change. Unexpected behaviors may occur at various levels within the system; the evolving relationship of trust between a patient and physician, noted earlier, is emergent behavior at the "microsystem" level.
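A minimal sketch, again using the illustrative logistic map from the previous section rather than any model of a real health care process, shows how a perturbation of one part in a million in the starting condition yields an entirely different state after a few dozen iterations.

```python
# Sensitive dependence on initial conditions, illustrated with the same
# assumed logistic map as before (a stand-in, not a health care model).
def state_after(x0, r=3.9, steps=40):
    x = x0
    for _ in range(steps):
        x = r * x * (1 - x)
    return x

baseline  = state_after(0.400000)
perturbed = state_after(0.400001)   # perturbation of one part in a million
# After 40 iterations the two trajectories bear no useful resemblance,
# which is why piloting changes on a small scale and watching for
# unanticipated outcomes is prudent.
print(round(baseline, 4), round(perturbed, 4))
```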
Feedback loops improve performance: To be effective, feedback should be direct, rapid, specific, and constructive. Relaying feedback through a third party, on a written form or after a long delay, is unlikely to be effective. Feedback loops should operate at all times, for good as well as inadequate performances. In one study, for example, trainers teaching communicative gestures to severely impaired individuals discovered that direct, specific, and rapid feedback resulted in increased training accuracy.
Standardization with flexibility maintains care quality in the static stage: Standardization should be encouraged for repetitive processes but avoided for rare ones. It should also be instituted intelligently, and input from frontline workers is critical. For example, standardizing the diagnostic workup for a disorder seen only once a year in a given practice would not only be inefficient but could also lead to inappropriate care.
In high-performing systems, frontline personnel typically have great freedom of action when dealing with unexpected situations [4]. Since human physiology, illness, and treatment are infinitely complex, physicians must always act with a high degree of autonomy. Standardized diagnostic or treatment protocols should never limit physicians confronting unusual or unexpected situations. It should be emphasized, however, that deviation from standardized care should result in better care, not substandard care.
Shortened response time with backup redundancy improves patient care effectiveness at the edge of chaos: Errors will be reduced and efficiency will rise if charts are available more readily, dictations are completed sooner, and phone messages are delivered more promptly. Shortened cycle time at the edge of chaos in a system's processes will improve the efficiency and effectiveness of the system's self-organization.
Moreover, since human errors are inevitable, the failure rate can only be reduced toward zero by some form of redundancy. A small failure rate may be preferable to the administrative burden associated with reducing the failure rate to zero. Therefore, most high-performing systems have a culture that treats individual error as normal and expected. These systems are designed to absorb errors, and/or mitigate their consequences.
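A back-of-the-envelope sketch makes the point about redundancy concrete: if each of k safeguards independently misses an error with probability p, the residual failure rate is p^k, which approaches but never reaches zero; both the independence assumption and the numbers are illustrative only.

```python
# Residual failure rate under k independent layers of redundancy.
# Independence rarely holds exactly in practice; this is an illustration.
def residual_failure_rate(p, k):
    return p ** k

for k in range(1, 5):
    print(k, residual_failure_rate(0.05, k))
# 1 layer: 5%, 2 layers: 0.25%, 3 layers: 0.0125%, 4 layers: 0.000625%.
# Each added layer buys less; chasing a zero rate can cost more in
# administrative burden than the small residual risk it removes.
```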
Intelligent and effective leadership is essential in the chaos stage: If feedback loops are absent, someone must take charge of putting them in place and determining whether improvement follows. If system members have divergent aims, someone must take responsibility for identifying common goals and building consensus through compromise among dissenters.
Intelligent and effective leadership in health care and services delivery systems may exhibit hierarchical features, but it is quite different from what is traditionally considered hierarchical leadership. High-performing systems typically respect their personnel: they respect junior as well as senior team members, solicit their input, and ensure their contributions are valued. In this stage, intelligent and effective leaders lead by example, actively solicit input from frontline workers and supervisors, and create a culture in which excellence and continuous learning and improvement become the norm rather than something forced upon the participants. A fail-safe infrastructure exists that allows organizational members to tap into the organization's central "intelligence."
A complex system is composed of a network of highly interactive and interdependent elements and agents. The stage of the system's complexity, however, depends on the number of constituent elements or agents, the number of interactions among them, and the complexity of those interactions. Complex systems usually have a large number of interacting parts whose behavior is not predictable over long time intervals, yet they retain information and patterns from their past. Components of the system spontaneously organize themselves into patterns, and the system adapts to its environment. Health care and services delivery systems are complex systems with many independent agents, each interacting with the others and occasionally inducing changes in some, creating CASs whose emergent properties and outcomes are potentially unpredictable and cannot be easily managed.
Health care and services delivery systems characterized as CASs display apparently complex behaviors that emerge as a result of often nonlinear and unpredictable interactions among a large number of component systems at both the micro (human) and macro (organization and system) levels. More importantly, these complex, nonlinear, interactive systems have the ability to adapt to a rapidly changing environment. An interesting example of a CAS in today's evolving health care and services delivery systems is the e-public health information system discussed in Tan [7]. These systems use intelligent, geographical information systems-related technologies both to act as an epidemiologic surveillance system and to help prepare the public for unexpected and uncertain public hazards such as the unpredictable aftermath of biological warfare. Other examples of CASs within today's evolving health care and services delivery systems are e-health systems such as e-home care systems, telemedicine systems, and e-disease management systems. These systems are evolving to meet the needs of the rapidly changing e-health care marketplace and to adapt to the new patient-provider environments of currently evolving health care and services delivery systems, as described in [7].
Chaos theory can apply to the future of evolving health care and services delivery systems in several ways. A fundamental idea is that these systems need to be able to adapt to new environments, striking a balance between so much order and stability that they become obsolete and such rapid, diverse evolution that they become self-destructive. In this article, we have explored ways to apply CAS theory to understand, develop, manage, and better design future health care systems. From the perspective of chaos theory, health care systems must abandon any aspiration of controlling their long-term future. While it is not possible to plan, forecast, or control the evolution of health care systems as complex adaptive systems, it is possible to monitor small changes, even introducing some initial perturbation if necessary, to study and observe how these changes, or the related chain of events, play out in the short term. It is therefore quite possible to research, manage, and even plan the short-term development of such systems. In the longer term, however, health care systems should seek to identify similarities or irregularities in the patterns of change and determine acceptable courses of action.
The key challenge in integrating diversified technologies to improve health care is to understand the workings of a complex system and to create a spontaneously adaptive and resilient platform, a coordinated network of providers geared toward reducing waste, lowering costs, improving quality of life, eliminating unnecessary waiting and unforgiving medical errors, and allowing greater interdisciplinary collaboration in treating patients burdened with a variety of chronic and acute diseases across settings such as urban hospitals, rural clinics, and even remote or home care.
1. Axelrod, R.M. The Complexity of Cooperation. Princeton University Press, NJ, 1997.
2. Cohen, M.D. Conflict and complexity: Goal diversity and organizational search effectiveness. Amer. Political Sci. Rev. 78 (1984), 435–451.
3. Gell-Mann, M. The Quark and the Jaguar. Freeman and Co., NY, 1994.
4. Kushniruk, A.W. Analysis of complex decision-making processes in health care: Cognitive approaches to health informatics. Journal of Biomedical Informatics 34 (2001), 365–376.
5. Perrow, C. Normal Accidents: Living with High Risk Technologies, 2nd ed. Princeton University Press, NJ, 1999.
6. Simon, H.A. The Sciences of the Artificial, 3rd ed. MIT Press, Cambridge, MA, 1996.
7. Tan, J., Ed. E-Health Care Information Systems: An Introduction for Students and Professionals. Jossey-Bass, San Francisco, CA, 2005.
8. Thompson, J.D. Organizations in Action. McGraw-Hill, NY, 1967.
9. Waldrop, M. Complexity: The Emerging Science at the Edge of Order and Chaos. Simon and Schuster, NY, 1992.