
Communications of the ACM

ACM TechNews

The Computing Power Behind the Large Hadron Collider


A person stands in front of the ATLAS detector, one of six detectors attached to the Large Hadron Collider.

The Large Hadron Collider can generate as much as 6 gigabytes of data per second during an experiment, amounting to 30 petabytes of data annually.

Credit: Maximilien Brice/CERN

Bob Jones, a project manager at CERN, which operates the Large Hadron Collider (LHC), recently spoke at ADMA's Advancing Analytics conference in Sydney, Australia, about the massive computing infrastructure that supports the LHC.

The LHC accelerates particle beams into each other at 99.99 percent of the speed of light and studies the resulting collisions with detectors that act as high-tech digital cameras, taking 40 million photographs per second. Automatic filtering reduces this torrent to about 6 gigabytes written to permanent storage every second during an experiment, which adds up to 30 petabytes of data annually.
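A quick back-of-the-envelope check shows how those two figures relate (a sketch in Python; the derived beam-time estimate is an inference from the article's numbers, not a figure CERN quotes):

    # Sanity-check the article's figures: 6 GB/s sustained during data-taking
    # versus 30 PB per year of stored data (decimal units: 1 PB = 1e6 GB).

    write_rate_gb_per_s = 6          # GB/s written to permanent storage
    annual_volume_pb = 30            # PB stored per year

    # Seconds of active data-taking implied by the two figures together.
    seconds_of_beam = annual_volume_pb * 1_000_000 / write_rate_gb_per_s
    print(f"{seconds_of_beam:,.0f} s ≈ {seconds_of_beam / 86_400:.0f} days/year")
    # -> 5,000,000 s ≈ 58 days: the LHC records collisions for only part of
    #    the year; running at 6 GB/s nonstop would instead produce ~190 PB.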

To handle this massive volume of data, CERN relies on a worldwide network of about a dozen major data centers, along with an array of smaller ones, 170 data centers in total. CERN's two major data centers, in Geneva and Budapest, are connected by what CERN describes as the world's fastest network, supporting speeds of up to 100 gigabits per second. "What this means is we can operate that infrastructure as a single cloud deployment; it's now operated as a single OpenStack," Jones notes.
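A rough sanity check shows that a link of this capacity comfortably carries the storage stream described above (a sketch; the only assumption beyond the article's figures is the standard 8-bits-per-byte conversion):

    # Can a 100-gigabit-per-second link carry the 6 GB/s storage stream?

    link_capacity_gbit_s = 100
    link_capacity_gbyte_s = link_capacity_gbit_s / 8   # 12.5 GB/s raw
    write_rate_gb_per_s = 6

    utilization = write_rate_gb_per_s / link_capacity_gbyte_s
    print(f"link utilization: {utilization:.0%}")      # -> 48%
    # About half the raw capacity, leaving headroom for replication and
    # simulation traffic between the Geneva and Budapest sites.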

The LHC's massive computing infrastructure also is active when the collider is idle, running simulations to check the quality of the collected data. Most of the data eventually is stored on magnetic tape cartridges, which must be rewritten every two years because the tape medium degrades over time.
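That two-year refresh cycle implies a steadily growing migration workload. A simple model of it (a sketch; the constant 30 PB/year growth and the half-the-archive-per-year rewrite rate are simplifying assumptions, not CERN figures):

    # Rough model of the tape-rewrite workload implied by a two-year cycle.

    annual_growth_pb = 30        # new data archived per year (from the article)
    refresh_cycle_years = 2      # tapes rewritten every two years

    for year in range(1, 6):
        archive_pb = annual_growth_pb * year
        # On average, half the archive falls due for rewriting each year.
        rewrite_pb = archive_pb / refresh_cycle_years
        print(f"year {year}: archive ≈ {archive_pb} PB, rewrite ≈ {rewrite_pb:.0f} PB")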

From CIO Australia

Abstracts Copyright © 2015 Information Inc., Bethesda, Maryland, USA


 
