
Communications of the ACM

Letters to the Editor

Roots of Publication Delay



Moshe Y. Vardi wrote in his Editor's Letter "Revisiting the Publication Culture in Computing Research" (Mar. 2010) that computer science is a field in which conferences are the primary venue for reporting research. Over the years, the premier conferences have evolved to the point where their acceptance rates are lower than those of the corresponding journals and their citation rates are higher. That is why other fields on university tenure committees accept the computer science argument that conference publications deserve the most weight.

When I first got involved in the ACM publication establishment (late 1960s), there was a terrible problem with backlogs in the two ACM research journals of the day: Communications and the Journal of the ACM. Editorial time—submission to final decision—was running 12–18 months, and publication time—decision to publication—took at least as long. Many complained loudly that 24–36 months for the complete process was bad for such a fast-changing field. However, ACM was almost broke and could not afford more pages, and there was no online venue. Researchers began to look to the SIG conferences as a way to get their material into print sooner, much sooner. By the 1980s, some of these conferences were petitioning the Publication Board to designate them as "refereed" instead of "reviewed."

Studies by Bob Ashenhurst and John Rice documented the editorial delays. Even when editors were fastidious about pushing the review cycle along, authors often took a long time to submit their revisions. Individual reviews took a long time as well, but the authors themselves contributed much of the delay. The studies also found (with much less supporting data) that authors try different journals and conferences until they find a publisher, and that patient, persistent authors eventually publish over 90% of their papers. Rice calculated that an average paper needs four referees, so authors "owe" their colleagues four reviews for every published paper.

A productive author publishing three journal papers a year would thus owe the field 12 reviews a year. Most researchers complain if they have to do half that number. Rice's conclusion, still valid today, was that as long as researchers do not want to do all those (timely) reviews, the editorial phase will not get significantly shorter; the delay is simply the cost of an all-volunteer system.
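Rice's arithmetic can be restated as a one-line formula (a restatement for clarity, not a formula taken from the studies themselves):

$$\text{reviews owed per year} = \underbrace{3}_{\text{papers/year}} \times \underbrace{4}_{\text{referees/paper}} = 12$$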

In 1983, at the start of the Communications revitalization plan, we investigated how Science magazine gets its submissions reviewed so much more quickly. We visited the editors and learned they phoned reviewers selected from a database to ask for two-week turnaround. After locating three agreeable reviewers, the editors would FedEx them the manuscript.

We wanted to do likewise, but the ACM executive committee said it was far too expensive. At the time, ACM did not have enough money to cover even the editors the Council had approved and could not mount a quick review process in Communications. Today's online tools make it much easier and less expensive to find reviewers, but the bottleneck remains the willingness of reviewers to review quickly and of authors to revise quickly.

Peter J. Denning,
ACM Past President, 1980–1982,
Monterey, CA

A fundamental difference between the journal and conference publication systems, as discussed by Moshe Y. Vardi (Mar. 2010), is that conference leadership (chairs and program committees) changes much more frequently than is practical for journals. It is thus difficult (though perhaps not impossible) for a conference to consistently reflect the personal biases and hidden agendas of the same people over a long period of time. The same cannot be said of journals, which have much slower turnover of editors. Such stability allows journals to be, at least in principle, more coherent and less prone to fads, though perhaps also less responsive to what the community views as important.

Angelos D. Keromytis,
New York


Multiple Departments to Cover the Whole Computational Discipline

Bjarne Stroustrup's Viewpoint "What Should We Teach New Software Developers? Why?" (Jan. 2010) was excellent in its call for curriculum reform but used the wrong model: the assumption that a single department can fulfill the needs of a mature computing discipline. Other disciplines recognize the need for a multi-departmental model, with separate but interrelated departments supporting both theory and industrial applications.

Chemistry, physics, and biology departments expand the boundaries of knowledge and create new tools that are then applied by chemical engineering, civil engineering, and clinical medicine departments in industrial settings. Computational (not computer) science departments must be centers of innovation, advancing the general principles of the discipline, while software engineering, IS, and IT departments build on the computational principles from the CS departments to prepare graduates needed by industry.

Undergraduates in chemistry and chemical engineering all take the same general-chemistry courses and labs as freshmen, learning general principles with the help of test tubes and Bunsen burners. Chemistry students then work at the laboratory scale required for research, while chemical-engineering students acquire the skills and knowledge of "bucket" chemistry required by industry.

Professionalism is an important goal, associated, as Stroustrup said, with the application side of the discipline, not the theoretical side. Licensing is for engineers, physicians, and pharmacists practicing in the public domain, not for those working on theory. Coming to a consensus on the appropriate curriculum for different departments would make it easier to develop professional licensure.

A single department can no longer be all things to all stakeholders. Needed instead is an ecosystem of interrelated departments supporting the full range of theory and applications of the computational discipline.

Mark Segall,
Denver, CO


Event Processing for All

In its description of streaming queries, Julian Hyde's article "Data in Flight" (Jan. 2010) included a paragraph on the relationship between streaming queries and complex event processing, saying "CEP has been used within the industry as a blanket term to describe the entire field of streaming query systems. This is regrettable because it has resulted in a religious war between SQL-based and non-SQL-based vendors and, in overly focusing on financial services applications, has caused other application areas to be neglected."

Here, I'd like to offer some perspective on these three assertions:

On the relationship between event processing and streaming SQL. Event processing is a broad term, like data management and signal processing, covering computation performed on events. A number of related programming styles are employed in the research community, as well as in commercial products, for implementing event-processing applications: stream-oriented languages (both based on SQL extensions [2] and not [1]), scripting languages, and rule-based languages. The Event Processing Technical Society Language Analysis Workgroup gave a tutorial on these languages (http://www.slideshare.net/opher.etzion/debs2009-event-processing-languages-tutorial) at the 2009 ACM International Conference on Distributed Event-Based Systems (http://www.debs.org/2009).

On vendor competition. I have seen no religious wars over event-processing languages (unlike the classic one between Lisp and Prolog). Vendors compete but generally acknowledge the variety of ways to approach event-processing applications. They also collaborate through the Event Processing Technical Society (http://www.ep-ts.com/), a consortium including most vendors in the area, as well as the field's leading academics, to investigate common functions and steps toward a common modeling language.

On the range of applications. Some have asserted the field is overly focused on financial services to the exclusion of all other industries. This might be true for some of the smaller vendors, but the field's major vendors report they develop applications across a range of industries, including health care, retail, defense, online games, chemical and petroleum, social computing, airline management, and car-fleet management. While the financial services industry was an early adopter, the claim of an overly narrow focus in the current state of the practice is simply not correct.

Opher Etzion,
Haifa, Israel


Author's Response:

Etzion's point that the Event Processing Technical Society offers a range of approaches to event-processing problems is well taken. However, in framing the problem as "event processing," the Society, as well as the complex event processing vendors, fails to address the needs of the data warehousing/business intelligence community.

Consider how a business intelligence (BI) practitioner might see the world: Business events are embodied as rows in a fact table; events are captured in log files or through database triggers and are transformed by an extract-transform-load (ETL) tool or in the target database. SQL is the lingua franca. This perspective is different from the event-processing view, an important distinction because many large, complex problems are expressed in BI terms and because many people have BI skills.

BI technologies alone do not adequately address the needs of this community. Streaming query systems (such as SQLstream) allow them to solve important new problems.
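As a rough sketch of the kind of continuous query such a system supports (the Orders stream and column names are hypothetical; the SELECT STREAM and ROWTIME windowing style follows SQLstream's dialect and the streaming SQL proposal of reference 2):

    -- Hypothetical: hourly revenue per product over an unbounded Orders stream.
    -- Each input row is a business event that, in a conventional BI system,
    -- would land in a fact table; here each hourly aggregate is emitted
    -- continuously as its window closes.
    SELECT STREAM
      FLOOR(o.ROWTIME TO HOUR) AS order_hour,  -- one-hour tumbling window
      o.product_id,
      SUM(o.amount) AS hourly_revenue
    FROM Orders AS o
    GROUP BY FLOOR(o.ROWTIME TO HOUR), o.product_id;

Read by a BI practitioner, this is an ordinary aggregate over a fact table; the STREAM keyword turns it into a standing query that runs over rows as they arrive.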

Julian Hyde,
San Francisco

Credit Line

The Communications cover illustration of Amir Pnueli (Jan. 2010) was inspired by a photograph taken by Dan Porges/Photo Shwartz.


References

1. Gedik, B., Andrade, H., Wu, K.-L., Yu, P.S., and Doo, M. SPADE: The System S declarative stream processing engine. In Proceedings of the ACM SIGMOD International Conference on Management of Data (Vancouver, B.C., Canada, June 9–12, 2008). ACM Press, New York, 2008, 1123–1134.

2. Jain, N., Mishra, S., Srinivasan, A., Gehrke, J., Widom, J., Balakrishnan, H., Çetintemel, U., Cherniack, M., Tibbetts, R., and Zdonik, S.B. Towards a streaming SQL standard. In Proceedings of the 34th International Conference on Very Large Data Bases (Auckland, New Zealand, Aug. 23–28, 2008), 1379–1390.


Footnotes

Communications welcomes your opinion. To submit a Letter to the Editor, please limit your comments to 500 words or less and send to [email protected].

DOI: http://doi.acm.org/10.1145/1735223.1735226


©2010 ACM  0001-0782/10/0500  $10.00

Permission to make digital or hard copies of all or part of this work for personal or classroom use is granted without fee provided that copies are not made or distributed for profit or commercial advantage and that copies bear this notice and the full citation on the first page. To copy otherwise, to republish, to post on servers or to redistribute to lists, requires prior specific permission and/or a fee.

The Digital Library is published by the Association for Computing Machinery. Copyright © 2010 ACM, Inc.