
Communications of the ACM

ACM TechNews

New Blueprint for Converging HPC, Big Data


[Image: binary code. Credit: Shutterstock]

A group of high-performance computing (HPC) experts has released the Big Data and Extreme-Scale Computing Pathways to Convergence Report, a blueprint for aligning computational infrastructures to support both HPC and big data.

The experts say big data is widening two paradigm splits: one between traditional HPC and high-end data analysis, and another between the "stateless networks" and "stateful services" delivered by end systems.

The report says addressing these splits requires new standards governing interoperability between data and compute, based on a new, common, and open Distributed Services Platform (DSP).

The report's recommendations for decentralized edge and peripheral ecosystems include:

- creating a new hourglass DSP framework;
- targeting workflow patterns for better data logistics;
- designing cloud stream processing capabilities for HPC;
- advocating a scalable approach to Content Delivery/Distribution Networks; and
- developing software libraries for common intermediate processing tasks.

Among the report's actionable conclusions for centralized facilities is the need to adjust HPC architectures to accommodate machine-learning-driven scientific workloads.

From HPCwire

 

Abstracts Copyright © 2018 Information Inc., Bethesda, Maryland, USA

