Oak Ridge National Laboratory (ORNL) supercomputers running models to assess the ramifications of climate change and tactics for mitigating it are rapidly generating enormous volumes of widely varied data.
ORNL's Galen Shipman says climate researchers have significantly boosted the temporal and spatial resolution of climate models, as well as their physical and biogeochemical complexity, and both advances add to the volume of data the models produce. Shipman notes that analyzing the climate models' data sets with traditional analysis tools often takes weeks or months, a challenge the Department of Energy's (DOE's) Office of Biological and Environmental Research (BER) is addressing through multiple projects that have yielded parallel analysis and visualization tools.
Shipman also says substantial efforts have been made to deliver the infrastructure needed to support geographically distributed data, especially between DOE supercomputing centers. DOE BER continues to invest in the software technologies needed to maintain a distributed archive, built through the Earth System Grid Federation project, that stores multiple petabytes of climate data worldwide. Shipman says data movement is the biggest challenge for most current visualization workloads, and he cites in situ analysis, in which visualization and analysis are embedded within the running simulation, as a promising approach; the sketch below illustrates the idea.
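To make the in situ idea concrete, here is a minimal sketch of the pattern in Python. It is purely illustrative and not ORNL's code: the names step_simulation and analyze_in_situ, the toy grid, and the summary statistics are all hypothetical. The point is that the analysis runs inside the simulation loop, so only compact summaries, rather than full snapshots, ever need to move off the compute node.

```python
import numpy as np

GRID = (360, 180)   # toy latitude/longitude grid (hypothetical)
N_STEPS = 100       # number of simulation time steps

def step_simulation(field, rng):
    """Advance the toy 'climate' field by one time step (stand-in physics)."""
    return field + 0.01 * rng.standard_normal(field.shape)

def analyze_in_situ(field):
    """Reduce a full snapshot to a few summary statistics in place."""
    return {"mean": float(field.mean()),
            "max": float(field.max()),
            "min": float(field.min())}

rng = np.random.default_rng(0)
field = rng.standard_normal(GRID)
summaries = []

for step in range(N_STEPS):
    field = step_simulation(field, rng)
    # Analysis is embedded in the simulation loop: the full field is
    # never written to disk or transferred for post-hoc processing.
    summaries.append(analyze_in_situ(field))

print(summaries[-1])
```

In a post-processing workflow, every snapshot would be written out and moved to an analysis cluster; in this in situ arrangement only the small summaries dictionary survives each step, which is why the approach sidesteps the data-movement bottleneck Shipman describes.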
From HPC Wire
Abstracts Copyright © 2012 Information Inc., Bethesda, Maryland, USA