
Advancing the Nuclear Enterprise Through Better Computing



Scientists in the Nuclear Science and Technology Division of the U.S. Department of Energy's Oak Ridge National Laboratory (ORNL) are merging decades of nuclear energy and safety expertise with high-performance computing to address a range of challenges in nuclear energy and security.

John Wagner, Technical Integration Manager for Nuclear Modeling within ORNL's Nuclear Science and Technology Division (NSTD), says one goal of his organization is to integrate its existing nuclear energy and national security modeling and simulation capabilities, and the associated expertise, with high-performance computing to solve problems that were previously impractical, or simply unthinkable, given the computing power required to address them.

In the area of nuclear energy, the Nuclear Modeling staff specializes in developing and applying computational methods and software for simulating radiation transport in order to support the design and safety of nuclear facilities, improve reactor core designs and nuclear fuel performance, and ensure the safety of nuclear materials such as spent nuclear fuel. The group is internationally known for developing and maintaining SCALE, a comprehensive nuclear analysis software package originally developed for the Nuclear Regulatory Commission, with signature capabilities in criticality safety, reactor physics, and radiation shielding. In recent years, ORNL has placed an emphasis on transforming these capabilities through high-performance computing and the development of novel computational methods.

"Traditionally, reactor models for radiation dose assessments have considered just the reactor core, or a small part of the core," Wagner says. "However, we're now simulating entire nuclear facilities, such as a nuclear power reactor facility with its auxiliary buildings and the ITER fusion reactor, with much greater accuracy than any other organization that we’re aware of." More accurate models enable nuclear plants to be designed with more accurate safety margins and shielding requirements, which helps to improve safety and reduce costs. The technology that makes this sort of leading-edge simulation possible is a combination of ORNL's Jaguar, the world's fastest supercomputer; advanced transport methods; and a next-generation software package called Denovo.

"At first we tried adapting older software to the task, but we abandoned that idea pretty quickly," says NSTD scientist and Denovo creator Tom Evans. As a result of that decision, Evans started from scratch to develop new software that is far more efficient at creating computer models on state-of-the-art supercomputers. Evans observes that, in some ways, Denovo is a synthesis of the last decade of research in the field. "Software for modeling radiation transport has been around for a long time," he says, "but it hadn't been adapted to build on developments that have revolutionized computational science. There's no special transformational technology in this software; but it's designed specifically to take advantage of the massive computational and memory capabilities of the world's fastest computers."

Denovo, which appropriately enough means "from new" or "from scratch," was recently awarded 8 million processor hours on the Jaguar supercomputer by the DOE Office of Science's Innovative and Novel Computational Impact on Theory and Experiment (INCITE) program to develop a uniquely detailed simulation of the power distribution inside a nuclear reactor core. The simulation will be used to help design the next generation of reactors, expediting experiments that can take years to conduct and helping to ensure that reactor designs are as efficient as possible.

Wagner notes that Denovo provides a fundamental capability for radiation transport modeling that continues to be expanded and applied to numerous ORNL projects. However, he is also quick to point out that these computer simulations will not completely eliminate the need for experimental or measurement data to confirm or "validate" the software. Instead, the new generation of nuclear modeling will increase confidence in the results using a more limited set of physical data. "We want to develop a predictive capability that has increased accuracy, reliability and flexibility," he says, "that can be used to improve our knowledge and understanding and increase our confidence in the decisions we make about design, safety, and performance of nuclear facilities. That's the goal."


 
