
Communications of the ACM

Forum

In their column "An Introduction to Software Stability" ("Thinking Objectively," Sept. 2001), Fayad and Altman briefly suggest that more complex (and more granular) models describe problems more accurately, and that the model (and the consequent software) therefore becomes more durable and stable. There is also a distinction between modeling the problem and modeling the solution; in the first example given, it is the solution being modeled, not the problem, and they claim it is the wrong solution.

An interesting point the authors make (one made by many others in the past) is that you want to "identify aspects of the environment in which the software will operate that will not change and cater to those areas." Fairly basic stuff but something often ignored.

They start by talking about reengineering efforts in which complete rewrites of the system are performed to support changes. It seems to me, though I need to read between the lines to do so (always a dangerous thing), that the environments they mention tend toward serial development and not iterative and incremental development. This is just a guess, but if it's true, that's one indicator they're in a less-than-agile situation.

I don't think they sufficiently made the case that greater detail in the analysis model leads to greater stability. Understanding the problem space is always a good thing, don't get me wrong, but a detailed analysis model doesn't necessarily get me to a good, stable design. A better approach would be to model just enough to gain an understanding of the problem domain, develop a design based on that understanding (perhaps as a model, as tests, or as source code), then build the software based on that design. I would want to write quality code that's clean and understandable and follows agreed-on conventions. Now when the problem space changes, and it always does, I have a clean base from which to work. If my design is good, the changes should be easy to accomplish; if my design wasn't so good, then I need to rework (refactor) it to support the new requirements.

Is this more agile or less agile? If we equate ease of change with agility, it is perhaps less agile. If, however, the agility we want is in solving problems quickly, then having stable solutions to draw from may improve our agility, because there is less to change to meet new requirements, and our solutions turn out to have been robust to the domain's semantics all along.

It seems to me it's an issue of value for your investment, as Fayad and Altman imply. Will greater investment in detailed analysis result in a better design requiring less investment to change in the future? It's difficult to say, because I don't know what the future changes will be. I do know writing good, clean code that fulfills current requirements is a good thing; I suspect that should be my fundamental goal. Will taking this approach to modeling help me achieve that goal in the best manner possible? I don't know, though I have no doubt it will work for some people but not for all.

Scott W. Ambler
Ontario, Canada

Exposing Agency Injustices

In her column "Suit Yourself" ("Staying Connected," Sept. 2001), Meg McGinity raises some serious questions. I was most interested in the remarks from the FDA and FCC. The excuse that "their staffs are too small to have any real effect on what a powerhouse [mobile phone] industry is doing" is ridiculous. What are they there for? In fact, this is the same excuse used by all so-called government watchdog agencies. Once in a while, one of them generates a lot of publicity by going after a lone violator, just to maintain the impression that it is doing its job. For serious violations, they always claim their "hands are tied" for one reason or another. But they do little or nothing to get them untied.

The underfunding of most, if not all, federal watchdog agencies is no mistake. It has been going on for years; everyone in Washington is aware of it, and none of the politicians cares a whit about meaningful change. As a whole, the U.S. government has a long history of allowing business and other interests to determine policy.

According to a front-page story in USA Today (Sept. 6, 2000), during the late 1940s and 1950s, the government secretly allowed thousands of workers at nuclear weapons sites throughout the country to be exposed to radiation levels known to be harmful and above those legally allowed. Nevertheless, essentially nothing was done to stop or prevent this. A common excuse was that the plants were private.

During the Vietnam War, thousands of soldiers and civilians alike were exposed to Agent Orange; later, when they tried to get the government to take responsibility for the aftereffects, the response was stonewalling and denials. The same thing was repeated following the Gulf War in 1991.

Eventually, some politicians will expose these injustices to further their public images as well as claim that the practice has stopped. But the practice won't stop and all the so-called investigations and name-calling will serve only as cover to allow these things to continue.

John Jaros
Quakertown, PA

Targeting Software Project Goals

I really enjoyed the "Business of Software" column ("Zeppelins and Jet Planes: A Metaphor for Modern Software Projects," Oct. 2001). The analogy of intercepting supersonic planes with "smart" missiles was particularly apt. I've been telling people that while we can achieve goals, we really can't predict how we'll get to them. I heartily agree that giving up control, at least in the traditional sense, is both necessary and extremely difficult. We have to manage project goals—business goals—rather than trying to manage to scope, schedule, and cost predictions.

What is really ironic is that by giving up control in the traditional sense, companies actually gain control in the sense of achieving their ultimate goals. But giving up control to get control is just too big a stretch for managers raised on PMI-style project management.

Jim Highsmith
Arlington, MA

I have used three techniques to reduce project cycle time: perform adequate needs analysis and design to reduce rework; reduce project scope to deliver less functionality to fewer users; and break projects into independent modules so developers can work in parallel without interference. A fourth method—lavishing resources on an army of coders—is applicable to a single, large software company with wide revenue streams.

Armour's column, like much recent opinion, asserts without proof that we "cannot estimate projects" and "cannot lock down requirements." If we say from the start these things are impossible, we excuse ourselves from substantial mental effort. We simply point our trendy, modern project at the sky and blast off in a blaze of coding. We may even feel mentally superior to the antiquated, linear way of doing things experienced practitioners suggest. Unanswered is the question: do such projects hit the target any more frequently?

Determining user needs is difficult. It takes skill, insight, and analysis. It also exposes our own early design ideas to rigorous testing that often finds them disconcertingly faulty. It requires communication with users, which many of us find less comfortable than coding. How much easier our lives are if we assert we cannot gain from this effort.

Scheduling is also difficult. It involves predicting the future, then betting our reputation on the prediction. We must then compromise with management, whose bias is always that projects need to take less time than predicted. How much less stressful it is to make no commitment at all.

Rather than asserting impossibility, we might acquire and use improved tools and methods for needs analysis and scheduling. We could then point our project directly at a clearly visible target from the moment of launch. This approach is conservative and laborious, not trendy and postmodern. But my personal experience is that it is predictably successful when honestly attempted.

Kurt Guntheroth
Seattle, WA

Taking a Stand on Copyright

The fact that ACM has filed a declaration in the case of Felten v. RIAA is remarkable ("The ACM Declaration in Felten v. RIAA," Oct. 2001) and indicates the importance of the matter, as ACM almost never takes a public position on a controversial issue. I urge all members to read Barbara Simons' "Viewpoint" and then make a big fuss about the subject with everybody. Although this is now a federal court case, the situation is actually political, so it would benefit us all to communicate with any political figures we can reach—members of Congress, lobbyists, party leaders, political contributors, and voters.

Eric A. Weiss
Kailua, HI

I am appalled by the treatment of Felten et al. Although I was aware of the attention Felten received, the importance of the issue did not fully register with me until I read Simons' "Viewpoint." I laud ACM for the declaration and Communications for publishing its details and background.

The Sept. 11 terrorist attacks have caused many of us to look carefully at what we value most about our way of life. Seeing U.S. companies use such strong-arm tactics in a (successful) attempt to suppress the expression of knowledge disturbs me deeply. As a scientist I worry about the chilling effect such fear of persecution could have on the amount and kind of publishing done in our community.

I wonder how we got to the point where business can suppress freedom of speech.

Steven Pothier
Tucson, AZ

Please address all Forum correspondence to the Editor, Communications, 1515 Broadway, New York, NY 10036; email: [email protected].


©2001 ACM  0002-0782/01/1200  $5.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.
