
Communications of the ACM

The Profession of IT

Flatlined


Software engineers have long been divided by disagreements over the best approach to designing and developing software, especially for safety-critical applications. Is this a software crisis or simply a normal evolution of a complex technology? Should design be human-centered or technology-centered? Should design processes be agile or plan-directed? Should software engineering be part of computer science or its own discipline? Should software research seek to advance technology or to fulfill utilitarian needs? Will the solution to the software problem come from a technological silver bullet or from careful cultivation of professional software developers?

These controversies all involve thinking with dualisms, opposites at the two ends of a linear scale. Dualistic thinking creates false dichotomies that often hinder our ability to find solutions to problems. As long as we cling to a single dimension, we will be like creatures in Flatland who see only their own space and do not realize that reality has other dimensions.

The rhetorical form of dualisms is this: A dualism posits opposing positions, or poles, P and Q, and the issue is called "P vs. Q." A set of hybrids, representing mixtures of P and Q, can be visualized as points on a straight line connecting the poles. The protagonists hope to compromise at a point on this line. (Often, however, they do not.)

One of the most powerful moves we can make in our thinking is to frame the problem in the right way—ask the right question. The problem's definition implies a space of possibilities that bounds answers and solutions. Thinking with linear scales between dual poles often significantly degrades our space of possible solutions. In addition to the software dualisms just noted, we face dualisms on other issues that concern us as professionals. Prime examples are:

  • Research: Basic vs. applied
  • Design: Technology-centered vs. human-centered
  • Education: Technical skills (hard) vs. value skills (soft)
  • Professional: Technical worker vs. customer-service worker

A simple change of thinking can often reveal significant new possibilities. The change is to frame the problem in a two-dimensional space in which the dualism's former poles are different axes. The following narrative continues and extends a discussion that began in the "Inside Risks" column [2].


Research

For over half a century we have classified research on a scale from basic to applied. Basic research is a quest for fundamental understanding without regard to potential utility. Applied research is technology development that solves near-term problems. These two models have different diffusion times from research result to practice—often 20–50 years for basic research and 2–5 years for applied research. Because the return on investment of basic research is so far in the future, the federal government is the main sponsor; university faculties are the main investigators.


In 1997, Donald Stokes put the research issue into a new light [7]. He traced the conceptual problem back to Vannevar Bush, who in 1945 coined the term "basic research," characterized it as the pacemaker of technological progress, and claimed that in mixed settings applied research would eventually drive out basic research. Bush thus put the goals of understanding and use into opposition, a belief that is at odds with the actual experience of science. Stokes proposes we examine research in two dimensions, not one. The two dimensions are:

  • Quest for fundamental understanding; and
  • Inspired by considerations of use.

As shown in Figure 1, he names the (high, high) quadrant Pasteur's, the (high, low) quadrant Bohr's, and the (low, high) quadrant Edison's. These designations reflect the well-known research styles and public statements of these three great men. They also represent important kinds of research that are all essential to the advancement of science and technology. Stokes did not name the (low, low) quadrant, which some will recognize as the home of junk science and of investigations lacking a search for innovation or new knowledge.

The terms "basic" and "applied" are artificial distinctions. Those who favor applied research call for greater emphasis on the Pasteur and Edison quadrants; those who favor basic research, on the Bohr and Pasteur quadrants. Pasteur's quadrant thus sits on both sides of the dichotomy. Most basic-versus-applied protagonists will, if shown Figure 1, agree that these three quadrants correspond to vital sectors of research, each with its own set of concerns and customers willing to pay for investigations.

The emphasis put on a quadrant depends on what comes out of the research and the investment going in. The Edison and Pasteur quadrants are oriented toward value creation for customers who exist now and can be identified. In the Bohr quadrant, potential customers and the value of the investment are less clear because of the long timelines. The Japanese have long "mined" Bohr-quadrant research by patenting possible inventions based on the research and holding the patents until Edison- or Pasteur-quadrant opportunities materialize.

An important style of computing research is experimental computer science. Its investigators are concerned with building and testing experimental systems and hypotheses. They use experiments either to evolve the best design for a system or to test hypotheses in other domains using the system as an apparatus. In 1994, the National Research Council expressed its concern over the difficulties in promotion faced by experimental computer scientists [6], a problem that persists today. The real difficulty is that experimental research fits the Edison and Pasteur quadrants better than the Bohr quadrant, and the Bohr quadrant is heavily favored in academic circles. Most academics prefer the federal government as their customer rather than business people or consumers.


Software Development

For over a generation we have classified software development on a scale from technology-centered to human-centered. Technology-centered development focuses on advancing software technology with new functions, algorithms, protocols, and development processes. Human-centered development focuses on making software useful and dependable to those paying for or using it.

The belief behind this dichotomy is that designers distribute their attention within a zero-sum game. The more attention on the technology itself, the less on users, and vice versa. Michael Dertouzos devoted his final book to debunking this belief [4]. He documented 15 chronic design flaws in software and said they will be eliminated only when we learn to design software that serves people and does not debase or subvert them. He called for his fellow academics to teach human-centered design and not to scorn software developers who seek to satisfy their customers. Some critics, their minds stuck in the habit of linear thinking, incorrectly concluded that he therefore also supported reducing attention to the engineering of software technology.

Barry Boehm recently sought to calm the controversy raging between proponents of "agile" software development and proponents of "planned" development [1]. Agile methods, such as Extreme Programming (XP), cater to programmers who interact closely with their customers in order to avoid wasteful misunderstandings and to adapt quickly to changing requirements. Traditional planned methods, such as Milestone Programming (MP), cater to programmers who manage software projects with careful documentation of all requirements and design decisions, and with carefully planned tests, in order to provide a high level of assurance of the safety and dependability of their systems. Boehm lists four dualisms that divide the two camps:

  • Individuals and interactions versus processes and tools;
  • Working software versus comprehensive documentation;
  • Customer collaboration versus contract negotiation; and
  • Responding to change versus following a plan.

He argues that these are false dichotomies. He demonstrates there is a sector of software development in which XP is the more appropriate, a different sector in which MP is the more appropriate, and a third sector in which both are useful together. Boehm aims to escape from the dualisms without explicitly saying so.

Here is a two-dimensional interpretation of Boehm's escape plan. Consider two dimensions of software development:

  • Technical excellence, which focuses on technology that exemplifies the best in its functional approach and engineering design process; and
  • Customer satisfaction, which focuses on satisfying real customers.

As shown in Figure 2, these two dimensions define four quadrants that each represent important existing groups of software developers:

  • (high, high)—People who know the technology well and focus on deploying it to satisfy real (paying) customers. Examples: agile programmers, rapid prototypers, requirements analysts, and technical project managers.
  • (low, high)—People who focus primarily on helping real customers, but who rely on technical experts for deep knowledge of the technology. Examples: help-desk clerks, customer support agents, and sales people.
  • (high, low)—People who develop software that meets high technical standards following best engineering practice and working from requirements developed by others. Examples: milestone programmers and most software engineers.
  • (low, low)—People who hack code without concern for customers or for learning the best of the technology. Examples: code hacks, drudge programmers, and technical gofers.

The two-dimensional space accomplishes Boehm's goal of showing the relative positions of the agile programmers and the milestone programmers without favoring one over the other. What is more interesting, however, is that the two-dimensional space shows places for other important members of the software community, including requirements analysts, project managers, customer support agents, and coding hacks.

An important challenge is to free software engineers from their de facto restriction to the quadrant of high technical excellence and low customer satisfaction. This is really the challenge of the leader-professional, considered in Figure 3.


Work Force

The distinction between blue-collar worker (labor) and white-collar worker (management) dates back to the 19th century. Through the 20th century, the number of workers who were hired for their muscle power steadily declined in relation to the number hired for their brain power. By 1970, the white-collar group constituted three-fourths of the work force. Peter Drucker referred to them as knowledge workers [5] and Alvin Toffler speculated about the information society they were building [8]. Today, knowledge workers are people with considerable theoretical knowledge and learning: doctors, lawyers, teachers, accountants, economists, engineers, and computer scientists.

Some of the old distinction between the white- and the blue-collar worker survives today under the dichotomy education vs. training. Almost all knowledge workers have a college education based on a curriculum emphasizing theoretical knowledge and principles. Colleges leave it to employers and private services to teach graduates the hands-on value skills they need to "apply their knowledge" (practice their professions).

Drucker says the most striking growth will be in knowledge technologists [5]: computer technicians, software developers, clinical lab analysts, manufacturing technologists, and paralegals. These people work with their hands as much as, or more than, with their brains. Their manual work rests on a substantial amount of theoretical knowledge that can be acquired only through formal education, not through an apprenticeship. Although they are not paid better than most skilled workers, they see themselves as professionals. Drucker says that knowledge technologists are likely to become the dominant social (and perhaps political) force over the next decades.


While the education-vs.-training distinction was useful to define the mission of a university as the source of knowledge workers, it now stands in the way of a serious discussion about the role of universities in educating knowledge technologists. Another popular linear distinction, principles vs. practice, also suffers the same shortcoming: knowledge technologists must be skilled at both.


The Value Dimension

These examples of moving from one-dimensional to two-dimensional interpretations bear a striking similarity. One of the dimensions measures technology, the other value. Value is not an absolute that can be defined as precisely as technology can; it can be assessed only in the context of actual transactions between the practitioner and customers, users, or clients. Value is therefore a necessary dimension in its own right, not a subset of the technological one. A previous column defined the value skills involved and made the case that the 21st-century professional needs strength in both technical skills and value skills [3]. Figure 3 depicts that claim: the leader-professional corresponds to the quadrant with high technical and high value skills; the technician corresponds to the (high, low) quadrant; and the technical assistant (help desk, sales) corresponds to the (low, high) quadrant.

The conclusion is that the value dimension is essential for successful research, software development, and professional mastery. This dimension is not in opposition to those of fundamental knowledge, technical excellence, or knowledge basis of work. It complements and enriches those other dimensions. More importantly, it reveals new, major challenges for researchers, software developers, and educators.

The value dimension is even important for those engaged in basic research. Remember that research for fundamental knowledge and technological advancement must have funders, therefore customers. Anyone who has competed for a research grant knows this. We can generate more funding for research if we can better connect our projects to the value our research will produce for the customer (funder).

Readers are cautioned that the two-dimensional examples offered here are only interpretations. Many other two-dimensional pictures can be explored; further interpretations might result from additional dimensions. The two-dimensional interpretations reveal the limitations of linear thinking and show us new actions. They do more than provide a new way to look at what is already there; they show the central importance of customer value and clarify the problems that result when projects become disconnected from customers. The technical and value skills sets are different; they rely on different systems of education and training; they are both important to professional success. It is time to bring the value dimension out of the shadows and make it operational. We cannot afford to be trapped in a one-dimensional world.


References

1. Boehm, B. Get ready for agile methods, with care. IEEE Computer (Jan. 2002), 64–69.

2. Denning, P. and Horning, J. Risks of linear thinking. Commun. ACM 45, 3 (Mar. 2002).

3. Denning, P. and Dunham, R. The core of a third-wave professional. Commun. ACM 44, 11 (Nov. 2001), 21–25.

4. Dertouzos, M. The Unfinished Revolution. HarperCollins, 2001.

5. Drucker, P. The next society. The Economist (Nov. 3, 2001).

6. National Research Council. Academic Careers for Experimental Computer Scientists. National Academy Press, 1994.

7. Stokes, D. Pasteur's Quadrant: Basic Science and Technological Innovation. Brookings Institution, 1997.

8. Toffler, A. The Third Wave. Bantam Books, 1980.


Author

Peter Denning (cs.gmu.edu/faculty/denning) is past president of ACM and chair of the ACM Education Board.


Figures

Figure 1. Research quadrants.

Figure 2. Software industry quadrants.

Figure 3. Professional quadrants (CEO = chief executive officer; CST = customer-savvy technician).



©2002 ACM  0002-0782/02/0600  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
