The exascale initiative has an ambitious goal: to develop supercomputers a hundred times more powerful than today's systems.
That's the kind of speed that can help scientists make serious breakthroughs in solar and sustainable energy technology, weather forecasting, batteries, and more.
Last year, President Obama announced a unified National Strategic Computing Initiative to support U.S. leadership in high-performance computing; one key objective is to pave the road toward an exascale computing system.
The U.S. Department of Energy has been charged with carrying out that objective through an initiative called the Exascale Computing Project.
Argonne National Laboratory Distinguished Fellow Paul Messina has been tapped to lead the project, heading a team with representation from the six major participating DOE national laboratories: Argonne, Los Alamos, Lawrence Berkeley, Lawrence Livermore, Oak Ridge, and Sandia. The project program office is located at Oak Ridge.
Messina, who has made fundamental contributions to modern scientific computing and networking, previously served for eight years as Director of Science for the Argonne Leadership Computing Facility, a DOE Office of Science User Facility. He will now help usher in a new generation of supercomputers with the capabilities to change our everyday lives.
Exascale-level computing could have an impact on almost everything, Messina says. It can help increase the efficiency of wind farms by determining the best locations and arrangements of turbines, as well as optimizing the design of the turbines themselves. It can also help severe weather forecasters make their models more accurate and could boost research in solar energy, nuclear energy, biofuels, and combustion, among many other fields.
"For example, it's clear from some of our pilot projects that exascale computing power could help us make real progress on batteries," Messina says.
Brute computing force is not sufficient, however. "We also need mathematical models that better represent phenomena and algorithms that can efficiently implement those models on the new computer architectures," Messina says.
Given those advances, researchers will be able to sort through the massive number of chemical combinations and reactions to identify good candidates for new batteries.
"Computing can help us optimize. For example, let's say that we know we want a manganese cathode with this electrolyte; with these new supercomputers, we can more easily find the optimal chemical compositions and proportions for each," he says.
Exascale computing will help researchers get a handle on what's happening inside systems where the chemistry and physics are extremely complex. To stick with the battery example: the behavior of liquids and components within a working battery is intricate and constantly changing as the battery ages.
"We use approximations in many of our calculations to make the computational load lighter," Messina says, "but what if we could afford to use the more accurate — but more computationally expensive — methods?"
In addition, Messina says that one of the project's goals is to boost U.S. industry, so the Exascale Computing Project will be working with companies to make sure the project is in step with their goals and needs.
The project will focus its efforts on four areas, Messina says: applications, the supporting software stack, hardware technology, and the facilities to house exascale systems.
The applications software to tackle these larger computing challenges will often evolve from current codes, but will need substantial work, Messina says.
First, simulating more challenging problems will require some brand-new methods and algorithms. Second, the architectures of these new computers will differ from those available today, so existing codes will have to be modified to run effectively on them. This is a daunting task for many of the teams that use scientific supercomputers today.
"These are huge, complex applications, often with literally millions of lines of code," Messina says. "Maybe they took the team 500 person-years to write, and now you need to modify it to take advantage of new architectures, or even translate it into a different programming language."
The project will support teams that can provide the people-power to tackle a number of applications of interest, he says. For example, data-intensive calculations are expected to be increasingly important and will require new software and hardware features.
The goal is to have "mission-critical" applications ready when the first exascale systems are deployed, Messina says.
The teams will also identify both what new supporting software is needed, and ways that the hardware design could be improved to work with that software before the computers themselves are ever built. This "co-design" element is central for reaching the full potential of exascale, he says.
"The software ecosystem will need to evolve both to support new functionality demanded by applications and to use new hardware features efficiently," Messina says.
The project will enhance the software stack that DOE Office of Science and NNSA applications rely on and evolve it for exascale, as well as conduct R&D on tools and methods to boost productivity and portability between systems.
For example, many tasks are the same from scientific application to application and are embodied as elements of software libraries. Teams writing new code use the libraries for efficiency — "so you don't have to be an expert in every single thing," Messina says.
"Thus, improving libraries that do numerical tasks, visualization, and data analytics, as well as programming languages, for example, would benefit many different users," he says.
Teams working on these components will work closely with the applications taskforce, he says. "We'll need good communication between these teams so everyone knows what's needed and how to use the tools provided."
In addition, as researchers are able to get more and more data from experiments, they'll need software infrastructure to more effectively deal with that data.
While the computers themselves are massive, they aren't a big part of the commercial market.
"Scientific computers are a niche market, so we make our own specs to get the best results for computational science applications," Messina says. "That's what we do with most of our scientific supercomputers, including here at Argonne when we collaborated with IBM and Lawrence Livermore National Laboratory on the design of Mira, and we believe it really paid off."
This segment will work with computer vendors and hardware technology providers to accelerate the development of particular features for scientific and engineering applications — not just those of interest to the DOE, but also priorities for other federal agencies, academia, and industry, Messina says.
Supercomputers need very special accommodations — they can't sit just anywhere. They need a good deal of electricity and cooling infrastructure; they take up a fair amount of square footage, and all of their flooring needs to be reinforced. This effort will work to develop sites for computers with this kind of footprint.
The Exascale Computing Project is a complex project with many stakeholders and moving parts, Messina says. "The challenge will be to effectively coordinate activities in many different sites in a relatively short time frame — but the rewards are clear."
The project will be jointly funded by the U.S. Department of Energy's Office of Science and the National Nuclear Security Administration's Office of Defense Programs.