

Numerical Simulations Shed New Light on Early Universe


[Image: Big Bang illustration]

Los Alamos scientists' BURST computer code predicts with precision the amounts of light nuclei synthesized in the Big Bang.

Credit: Los Alamos National Laboratory

Innovative multidisciplinary research in nuclear and particle physics and cosmology has led to the development of a new, more accurate computer code to study the early universe. The code simulates conditions during the first few minutes of cosmological evolution to model the role of neutrinos, nuclei, and other particles in shaping the early universe.

Anticipating precision cosmological data from the next generation of "Extremely Large" telescopes, the BURST code, developed by scientists at Los Alamos National Laboratory in collaboration with colleagues at the University of California San Diego, "promises to open up new avenues for investigating existing puzzles of cosmology," says Los Alamos physicist Mark Paris of the Nuclear and Particle, Astrophysics and Cosmology group. "These include the nature and origin of visible matter and the properties of the more mysterious 'dark matter' and 'dark radiation.'"

"The BURST computer code allows physicists to exploit the early universe as a laboratory to study the effect of fundamental particles present in the early universe," Paris says. "Our new work in neutrino cosmology allows the study of the microscopic, quantum nature of fundamental particles — the basic, subatomic building blocks of nature — by simulating the universe at its largest, cosmological scale."

"The frontiers of fundamental physics have traditionally been studied with particle colliders, such as the Large Hadron Collider at CERN, by smashing together subatomic particles at great energies," says UCSD physicist George Fuller, who collaborated with Paris and other staff scientists at Los Alamos to develop the novel theoretical model. BURST brings a new dimension to such simulations. "Our 'self-consistent' approach, achieved for the first time by simultaneously describing all the particles involved, increases the precision of our calculations. This allows us to investigate exotic fundamental particles that are currently the subject of intense theoretical speculation."

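To make the "self-consistent" idea concrete, here is a toy Python sketch, an illustration of the numerical point only and not the BURST code itself. It evolves two interacting energy reservoirs, loosely standing in for the plasma and the neutrino sea, in both styles: as a single coupled system, and sequentially with each reservoir advanced while the other is held fixed. The coupled integration conserves the total energy; the sequential, operator-split one lets it drift.

    # Toy comparison: fully coupled vs. operator-split evolution of two
    # interacting energy reservoirs. Illustration only, not BURST.
    import numpy as np
    from scipy.integrate import solve_ivp

    K = 2.0            # toy energy-exchange rate
    E0 = [3.0, 1.0]    # initial energies of the two reservoirs

    def coupled_rhs(t, y):
        e1, e2 = y
        flow = K * (e1 - e2)       # energy flows from hot to cold
        return [-flow, flow]

    # Fully coupled: both reservoirs advanced together in one ODE system.
    sol = solve_ivp(coupled_rhs, (0.0, 1.0), E0, rtol=1e-10, atol=1e-12)
    c1, c2 = sol.y[:, -1]

    # Operator-split: advance reservoir 1 with reservoir 2 frozen, then the
    # reverse, over coarse steps; the exchanged energy no longer balances.
    def split_step(e1, e2, dt):
        e1_new = e2 + (e1 - e2) * np.exp(-K * dt)          # e2 held fixed
        e2_new = e1_new + (e2 - e1_new) * np.exp(-K * dt)  # e1 held fixed
        return e1_new, e2_new

    s1, s2 = E0
    for _ in range(10):
        s1, s2 = split_step(s1, s2, 0.1)

    print(f"coupled total energy: {c1 + c2:.10f}")  # stays at 4.0
    print(f"split total energy:   {s1 + s2:.10f}")  # drifts below 4.0
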
The theoretical work is described in "Neutrino Energy Transport in Weak Decoupling and Big Bang Nucleosynthesis," published in the journal Physical Review D.

The research is driven by several mission goals of Los Alamos' Nuclear and Particle Futures research pillar in basic and applied nuclear science. According to Paris, "The early universe is becoming such a tightly constrained environment with increasingly good measurements that we can test our descriptions of microscopic quantum physics, such as nuclear cross sections, to high accuracy. These cross sections are important for Los Alamos' nuclear data needs that feed into applications in nuclear energy, safety, and security."

A few seconds after the Big Bang, the universe was composed of a thick, 10-billion-degree "cosmic soup" of subatomic particles. As the hot universe expanded, these particles' mutual interactions caused the universe to behave as a cooling thermonuclear reactor. This reactor produced the light nuclei, such as hydrogen, helium, and lithium, that are found in the universe today. The amounts of the light nuclei created depend on what other particles — such as neutrinos and perhaps their exotic cousins, "sterile" neutrinos — comprise the "soup" and how they interact with each other.
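
That reasoning can be followed with a standard back-of-the-envelope calculation. The Python sketch below, a minimal illustration built from textbook approximations rather than from the BURST code, integrates the neutron fraction through weak freeze-out in the cooling, radiation-dominated universe; the resulting neutron-to-proton ratio of roughly one to seven fixes a helium-4 mass fraction near 0.25.

    # Sketch of the "cooling thermonuclear reactor": evolve the neutron
    # fraction X_n through weak freeze-out, then free neutron decay, and
    # estimate the helium-4 mass fraction. Textbook (Bernstein-style)
    # rate approximations; ignores e+e- annihilation heating.
    import numpy as np
    from scipy.integrate import solve_ivp

    DELTA_M = 1.293    # neutron-proton mass difference, MeV
    TAU_N = 879.4      # free neutron lifetime, s

    def lambda_np(T):
        """Approximate n -> p weak conversion rate in 1/s, T in MeV."""
        x = DELTA_M / T
        return (255.0 / TAU_N) * (12.0 + 6.0 * x + x * x) / x**5

    def T_of_t(t):
        """Time-temperature relation t ~ 0.74 (T/MeV)^-2 s for g* ~ 10.75."""
        return (0.74 / t) ** 0.5

    def rhs(t, y):
        T = T_of_t(t)
        lam_np = lambda_np(T)
        lam_pn = lam_np * np.exp(-DELTA_M / T)   # detailed balance
        return [-lam_np * y[0] + lam_pn * (1.0 - y[0])]

    # Start in weak equilibrium at T = 3 MeV; run to the onset of
    # nucleosynthesis at t ~ 180 s.
    T0 = 3.0
    t0 = 0.74 / T0**2
    Xn0 = 1.0 / (1.0 + np.exp(DELTA_M / T0))
    sol = solve_ivp(rhs, (t0, 180.0), [Xn0], rtol=1e-8, atol=1e-10)

    Xn = sol.y[0, -1] * np.exp(-180.0 / TAU_N)   # free neutron decay
    print(f"n/p at t = 180 s: {Xn / (1 - Xn):.3f}   (roughly 1/7)")
    print(f"helium-4 mass fraction ~ {2 * Xn:.3f}  (roughly 0.25)")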

"Neutrinos are very interesting — they're the second most abundant particle in the universe after photons, yet we still have much to learn about them," says Evan Grohs, who earned his Ph.D. through UCSD for the work while working on the project in the Center for Space and Earth Sciences at Los Alamos. "By comparing our calculations with cosmological observables, such as the deuterium abundance, we can use our BURST computer code to test theories regarding neutrinos, along with other — even less understood — particles. It can be difficult to test these theories in terrestrial labs, so our work provides a window into an otherwise inaccessible area of physics."

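As a minimal illustration of the comparison Grohs describes, the snippet below tests a predicted deuterium-to-hydrogen ratio against a measured one in units of their combined uncertainty. Both numbers are representative of values quoted in the big bang nucleosynthesis literature and stand in only to show the form of the test; they are not BURST results.

    # Compare a predicted primordial D/H ratio with an observed one.
    # Both values are illustrative placeholders, not BURST output.
    pred_DH, pred_err = 2.45e-5, 0.05e-5   # illustrative BBN prediction
    obs_DH, obs_err = 2.53e-5, 0.03e-5     # illustrative measurement

    # Discrepancy in units of the combined standard deviation.
    sigma = abs(pred_DH - obs_DH) / (pred_err**2 + obs_err**2) ** 0.5
    print(f"prediction and observation differ by {sigma:.1f} sigma")
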
This research has become possible only recently with the advent of astronomers' precision measurements of the amounts of nuclei present in the early universe. These measurements were made with "Very Large" telescopes, which are about 10 meters wide. The next generation of "Extremely Large" telescopes, 30 meters across, is currently under construction.

"With coming improvements in cosmological observations, we expect our BURST computer code to be useful for many years to come," says Paris. Improvements in BURST are planned that will exploit the precision cosmological observations to reveal even more exotic physics such as the nature of dark matter and dark radiation. A complete understanding of dark matter, which comprises about a quarter of the mass in the universe, is currently lacking, Paris says.

Ongoing support for the project is provided by the U.S. National Science Foundation at UCSD and the Laboratory Directed Research and Development program through the Center for Space and Earth Sciences at Los Alamos. Supercomputing resources on the TriLab Linux Capacity Cluster systems at Los Alamos National Laboratory were provided through a Los Alamos Institutional Computing grant.

The Physical Review D article is authored by E. Grohs, G.M. Fuller, C.T. Kishimoto, M.W. Paris, and A. Vlasenko.


 
