Researchers at IBM's laboratories in Zurich have developed a new algorithm that can sort, correlate, and analyze millions of random data sets in minutes. Without the algorithm, the same analysis would have taken supercomputers days to complete, says IBM researcher Costas Bekas.
Bekas says the algorithm could be used to analyze data measuring electricity usage and air or water pollution levels, as well as to break down data from global financial markets. It combines data-calibration models with statistical analysis to assess measurement models and the relationships between data sets.
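The abstract does not describe the method itself, so the sketch below is an illustration only: one generic way to correlate a large batch of data sets in a single step, using a plain Pearson correlation matrix. The array sizes and the choice of technique here are assumptions for demonstration, not IBM's actual algorithm.

    import numpy as np

    # Hypothetical setup: 1,000 data sets of 500 samples each (random
    # stand-ins for the kinds of series the article mentions).
    rng = np.random.default_rng(0)
    n_sets, n_samples = 1000, 500
    data = rng.standard_normal((n_sets, n_samples))

    # np.corrcoef treats each row as one data set and returns the
    # (n_sets x n_sets) matrix of pairwise Pearson correlations.
    corr = np.corrcoef(data)

    # Flag the most strongly related pair (excluding self-correlation).
    mask = ~np.eye(n_sets, dtype=bool)
    strongest = np.unravel_index(np.argmax(np.abs(corr * mask)), corr.shape)
    print(f"Most correlated pair: {strongest}, r = {corr[strongest]:.3f}")

At the scales the article describes, a dense correlation matrix like this would be far too large to form explicitly; the point of the IBM work, per the abstract, is precisely to make such relationship analysis tractable on millions of data sets.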
Bekas says the algorithm, which can analyze nine terabytes of data in less than 20 minutes, makes data analysis more cost- and energy-efficient because it reduces the load on supercomputers.
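For scale, the quoted figure implies a sustained rate of roughly 7.5 GB/s, as this back-of-envelope check shows (using decimal terabytes):

    # Quick sanity check of "nine terabytes in less than 20 minutes".
    terabytes, minutes = 9, 20
    rate_gb_per_s = terabytes * 1000 / (minutes * 60)
    print(f"Implied throughput: at least {rate_gb_per_s:.1f} GB/s")  # ~7.5 GB/s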
From IDG News Service
Abstracts Copyright © 2010 Information Inc., Bethesda, Maryland, USA