The National Energy Research Scientific Computing Center's John Shalf describes parallel programming languages as tools designed to program systems with multiple processors and thus multiple concurrent instruction threads. He projects that all future computer speed upgrades will be derived from parallelism, as chips' clock frequencies are no longer increasing.
Shalf says that parallel programs can be written in a sequential programming language, and notes that some of the most commonly used parallel programming strategies exploit the syntax of existing sequential languages. He is concerned "that serial languages do not provide the necessary semantic guarantees about locality of effect that is necessary for efficient parallelism. Ornamenting the language to insert the semantics of such guarantees ... is arduous, prone to error, and quite frankly not very intuitive."
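The article contains no code, but a minimal sketch of the kind of strategy Shalf is describing is a plain sequential C loop "ornamented" with an OpenMP pragma; the example below is a hypothetical illustration, not drawn from Shalf's work, and the burden of guaranteeing that loop iterations are independent still falls on the programmer rather than the language.

```c
#include <stdio.h>

#define N 1000000

int main(void) {
    static double a[N], b[N], c[N];

    /* Initialize inputs sequentially. */
    for (int i = 0; i < N; i++) {
        a[i] = i * 0.5;
        b[i] = i * 2.0;
    }

    /* The loop body is ordinary sequential C; the pragma adds parallel
       semantics on top of the existing syntax. The compiler is told, not
       shown, that iterations do not interfere with each other -- the kind
       of unchecked guarantee Shalf calls arduous and error-prone. */
    #pragma omp parallel for
    for (int i = 0; i < N; i++) {
        c[i] = a[i] + b[i];
    }

    printf("c[N-1] = %f\n", c[N - 1]);
    return 0;
}
```

Compiled without OpenMP support the pragma is simply ignored and the loop runs serially, which is precisely why this style piggybacks so easily on existing sequential languages.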
Shalf expects a resurgence of implicit parallelism and of constructs drawn from functional languages, and says the most important development ahead is the migration of parallelism from an academic concern to a mainstream challenge. "This means it is even more imperative that we train future computer scientists to solve problems using parallelism from the get-go," he says.
From International Science Grid This Week