Lawrence Livermore National Laboratory (LLNL) recently made Catalyst, a first-of-a-kind supercomputer, available to industry collaborators to test big data technologies, architectures, and applications.
"We have modified the Cray CS300 architecture in ways that make Catalyst an outstanding [high-performance computing] platform for data-intensive computing," says LLNL's Robin Goldstone.
Catalyst is equipped with 128 GB of dynamic random access memory (DRAM) per node, 800 GB of non-volatile memory (NVRAM) per compute node, 3.2 TB of NVRAM per Lustre router node, and improved cluster networking with dual rail Quad Data Rate Intel TrueScale fabrics. The system's expanded DRAM and fast, persistent NVRAM are designed to handle a wide range of big data problems, including bioinformatics, business analytics, machine learning, and natural language processing. For example, LLNL researcher Jonathan Allen is using Catalyst to develop new methods to rapidly detect and characterize pathogenic organisms such as viruses, bacteria, or fungi in a biological sample.
"Catalyst allows us to explore entirely new deep-learning architectures that could have a huge impact on video analytics as well as broader application to big data analytics," says LLNL researcher Doug Poland.
From Lawrence Livermore National Laboratory
Abstracts Copyright © 2014 Information Inc., Bethesda, Maryland, USA