The government shutdown is over and the country is no longer in a state of limbo, but the future of supercomputing in the United States remains up in the air.
Federal funding for research at large is at a 10-year low, says Dan Reed, vice president for Research and Economic Development at the University of Iowa, so the shutdown was "a temporary blip. The bigger question is what will happen to research funding in the long term as part of these battles."
In a blog post on the future of exascale computing, Reed, who is also the university’s Computational Science and Bioinformatics chair and a professor of Computer Science and Electrical and Computer Engineering, discussed how the global race to build ever-faster supercomputers is "fueled by a combination of scientific and engineering needs to simulate phenomena with greater resolution and fidelity, continued advances in semiconductor capabilities, and economic and political competition." That competition pits the United States against China, Japan, and the European Union for bragging rights to the first system capable of sustaining 1,000 petaflops (one exaflop), or one quintillion floating-point operations per second, he notes.
Complicating the effort, high-performance computing is caught in the funding crossfire like other government programs, but unlike large-scale instruments such as telescopes, which have operational lifetimes of 10 to 20 years, a supercomputer can be obsolete within five years, Reed says.
Horst Simon believes this is the crux of the problem as well. Supercomputers take about three years to procure, since each one is a "customized process," says Simon, deputy laboratory director of Lawrence Berkeley National Laboratory; it then takes a couple of years to put the system in place and make it usable, and after a few years of use, it becomes obsolete. "That means today, in fall 2013, I have a fairly good [picture] of the next big systems that will be acquired in the U.S., and the shocking news is very few of these very top systems will be installed in the next two to three years."
Looking at the broader ecosystem of computing technology, Reed says the U.S. historically has relied on technological advances to underpin military superiority, security, and the economy. Continued reductions in funding put the future of supercomputing, and exascale computing in particular, in jeopardy, he says.
Simon paints an equally gloomy picture of the immediate outlook for funding. "Since probably 2010, we’ve been in a very difficult position, and it has only gotten worse with fiscal years 2012 and 2013," with 2014 not looking any better, says Simon.
Back in 2006, Simon and his colleagues conducted a series of town hall meetings to discuss exascale computing, which they believed to be the next great challenge, and the scientific applications that would benefit from exascale computing resources. The hope was that an exascale computing program would move forward by 2010; that did not happen because of the economic recession.
Today, says Simon, "We still don’t have a national exascale program formulated … we’ve been in a three-year hiatus of standing still."
Like Simon and Reed, Jay Boisseau, director of the Texas Advanced Computing Center (TACC) at the University of Texas at Austin, downplays the shutdown, which he says had no significant impact on TACC. "Compared to the investments needed for exascale computing by the end of the decade, it’s insignificant."
For centers like TACC that receive National Science Foundation (NSF) grants, "it’s business as usual," Boisseau says. "We are trying to achieve higher and higher performance." TACC’s most recent NSF grant, earmarked for research on a large-scale petascale system, was approved just before the shutdown, he says. The shutdown would have affected the future of exascale computing research only if it had lasted indefinitely; in that event, he says, it could have "result[ed] in an increasingly conservative perspective on long-term research and development."
As a scientist, Boisseau says he hopes there will be no appreciable change in the government’s long-term R&D investment strategy, since funding is crucial to staying competitive. "The ability to fund long-term research is, of course, instrumental in major changes and discoveries that can occur."
Building an exascale computer is not, in and of itself, the highest priority, he says; the goal is making the discoveries that could result from having that computing capability. The priority, according to Boisseau, is to invest in both the system and the people who build it.
"I hope the U.S. gets there first [to an exascale computer], but the important thing is investing in the entire science enterprise," he says. "It’s possible another country could get there first, but that does not mean they will have a greater impact on innovation discovery and science overall, if the focus is on deploying that system."
Boisseau says this is why he worries about flat budgets and cuts; investments in building an exascale computer could still be made under flat budgets, but cuts could hurt currently well-funded R&D teams. "I hope we don’t end up with this shutdown producing some kind of sea change that makes us significantly more conservative in spending on long-term research and development." Any federal budget cuts, he says, could undermine long-term U.S. competitiveness.
The dilemma for computing is that its systems are applicable to almost every industry in a way many scientific instruments are not, yet politically, while computing is important, "it’s everyone’s second choice" when it comes to funding, says Reed. "If you are a physicist, your first choice is investment in physics instruments. If you are a biomedical researcher, your first choice would be funding in biomedical instruments … but both of those cannot do research without advanced computing."
The U.S. Department of Energy has to make tradeoffs, since many priorities, all worthy from a scientific perspective, are competing for a limited pool of funding. "There are too many needs and not enough dollars. That is the ‘rock and a hard place’ high-performance computing finds itself in," says Reed.
Even with the shutdown over, the bigger issue remains whether the nation has the will to invest in the research and enabling technologies that have historically contributed to economic growth, he says. "We live in an information economy," and the ability to obtain new knowledge is what drives long-term economic impact.
Says Simon, "I hope that saner minds will understand the importance of investing in basic science in order for national competitiveness to be possible.’’
Esther Shein is a freelance technology and business writer based in the Boston area.