Part 1 of this article (to be found here; please read it first) made fun of authors who claim that software engineering is a total failure — and, like everyone else, benefit from powerful software at every step of their daily lives.
Catastrophism in talking about software has a long history. Software engineering started around 1966 [1] with the recognition of a "software crisis". For many years it was customary to start any article on any software topic with a lament about the terrible situation of the field, leaving in your reader's mind the implicit suggestion that the solution to the "crisis" lay in your article's little language or tool.
After the field had matured, this lugubrious style went out of fashion. In fact, it is hard to sustain: in a world where every device we use, every move we make and every service we receive is powered by software, it seems a bit silly to claim that in software development everyone is wrong and everything is broken.
The apocalyptic mode has, however, made a comeback of late in the agile literature, which is fond in particular of citing the so-called "Chaos" reports. These reports, emanating from Standish, a consulting firm, purport to show that a large percentage of projects either do not produce anything or do not meet their objectives. It was fashionable to cite Standish (I even included a citation in a 2003 article), until the methodology and the results were thoroughly debunked starting in 2006 [2, 3, 4]. The Chaos findings are not replicated by other studies, and the data is not available to the public. Capers Jones, for one, publishes his sources and has much more credible results.
Yet the Chaos results continue to be reverently cited as justification for agile processes, including, at length, in the most recent book by the creators of Scrum [5].
Not long ago, I raised the issue with a well-known software engineering author who was using the Standish findings in a talk. Wasn't he aware of the shakiness of these results? His answer was that we don't have anything better. It did not sound like the kind of justification we should use in a mature discipline. Either the results are sound, or we should not rely on them.
Software engineering is hard enough and faces enough obstacles, so obvious to everyone in the industry and to every user of software products, that we do not need to conjure up imaginary scandals and paint a picture of general desolation and universal (except for us, that is) incompetence. Take Schwaber and Sutherland, in their introductory chapter:
"You have been ill served by the software industry for 40 years—not purposely, but inextricably. We want to restore the partnership."
No less!
Pretending that the whole field is a disaster and everyone else is wrong may be a good way to attract attention (for a while), but it is infantile as well as dishonest. Such gross exaggerations discredit their authors, and beyond them, the ideas they promote, good ones included.
As software engineers, we can in fact feel some pride when we look at the world around us and see how much our profession has contributed to it.
Yes, challenges and unsolved problems face us at every corner of software engineering. Yes, we are still at the beginning, and on many topics we do not even know how to proceed. Yes, there are lots of things to criticize in current practices (and I am not the least vocal of the critics). But we need a sense of measure. Software theories, methods, tools and languages have made tremendous progress over the last five decades; neither the magnitude of the remaining problems nor the urge to sell one's products and services justifies slandering the rest of the discipline.
[1] No, not in 1968 with the NATO conference, as everyone seems to believe. That canard was refuted long ago. See my short article about this issue here.
[2] Robert L. Glass: The Standish report: does it really describe a software crisis?, in Communications of the ACM, vol. 49, no. 8, pages 15-16, August 2006; see here.
[3] J. Laurens Eveleens and Chris Verhoef: The Rise and Fall of the Chaos Report Figures, in IEEE Software, vol. 27, no. 1, Jan-Feb 2010, pages 30-36; see here.
[4] S. Aidane, The "Chaos Report" Myth Busters, 26 March 2010, see here.
[5] Ken Schwaber and Jeff Sutherland: Software in 30 Days: How Agile Managers Beat the Odds, Delight Their Customers, And Leave Competitors In the Dust, Wiley, 2012.
Here's a reference to more respectable work on software project failure rates:
K. El Emam and A. G. Koru: A Replicated Survey of IT Software Project Failures, in IEEE Software, vol. 25, no. 5, pages 84-90, September-October 2008; doi: 10.1109/MS.2008.107.