CERN researchers generate a petabyte of data every second as they work to discover the origins of the universe by smashing particles together at close to the speed of light. However, the researchers, led by Francois Briard, store only about 25 petabytes per year, because they apply filters that keep just the results they are interested in.
"To analyze this amount of data you need the equivalent of 100,000 of the world's fastest PC processors," says CERN's Jean-Michel Jouanigot. "CERN provides around 20 percent of this capability in our data centers, but it's not enough to handle this data."
The researchers worked with the European Commission to develop the Grid, which provides access to computing resources from around the world. CERN draws on data center capacity from 11 different providers on the Grid, including companies in the United States, Canada, Italy, France, and Britain. The data comes from the four detectors on the Large Hadron Collider that monitor the particle collisions, which transmit data to the CERN computer center at 320 Mbps, 100 Mbps, 220 Mbps, and 500 Mbps, respectively.
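A minimal back-of-the-envelope sketch of the figures quoted above, for readers who want to see how they relate: it uses only the numbers in the article (1 petabyte per second of raw data, 25 petabytes stored per year, and the four detector link rates); the filtering ratio is derived here, not stated in the article.

```python
# Back-of-the-envelope check of the figures quoted in the article.
# All input numbers are the article's own; the reduction factor is derived.

SECONDS_PER_YEAR = 365 * 24 * 3600

raw_rate_pb_per_s = 1.0      # ~1 petabyte of raw collision data per second
stored_pb_per_year = 25.0    # ~25 petabytes actually kept per year after filtering

raw_pb_per_year = raw_rate_pb_per_s * SECONDS_PER_YEAR
reduction_factor = raw_pb_per_year / stored_pb_per_year

# Detector-to-data-center link rates as quoted (Mbps), summed for the aggregate.
link_rates_mbps = [320, 100, 220, 500]
aggregate_mbps = sum(link_rates_mbps)

print(f"Raw data per year if nothing were filtered: {raw_pb_per_year:,.0f} PB")
print(f"Implied reduction factor from filtering:    ~{reduction_factor:,.0f}x")
print(f"Aggregate detector link rate to CERN:       {aggregate_mbps} Mbps")
```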
From V3.co.uk
Abstracts Copyright © 2011 Information Inc., Bethesda, Maryland, USA