A Library-Based Approach to Task Parallelism in a Data-Parallel Language

Ian Foster*, David R. Kohr, Rakesh Krishnaiyer, Alok Choudhary

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

25 Scopus citations


Pure data-parallel languages such as High Performance Fortran version 1 (HPF) do not allow efficient expression of mixed task/data-parallel computations or the coupling of separately compiled data-parallel modules. In this paper, we show how these common parallel program structures can be represented, with only minor extensions to the HPF model, by using a coordination library based on the Message Passing Interface (MPI). This library allows data-parallel tasks to exchange distributed data structures using calls to simple communication functions. We present microbenchmark results that characterize the performance of this library and that quantify the impact of optimizations that allow reuse of communication schedules in common situations. In addition, results from two-dimensional FFT, convolution, and multiblock programs demonstrate that the HPF/MPI library can provide performance superior to that of pure HPF. We conclude that this synergistic combination of two parallel programming standards represents a useful approach to task parallelism in a data-parallel framework, increasing the range of problems addressable in HPF without requiring complex compiler technology.
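The schedule-reuse optimization mentioned in the abstract can be sketched abstractly. The following Python fragment is illustrative only and is not the HPF/MPI library's actual interface: it shows, for a one-dimensional block-distributed array, how a redistribution schedule (the set of messages needed to move data from one block distribution to another) might be computed once and then cached, so that repeated transfers between the same pair of distributions skip the schedule computation. All function names here are hypothetical.

```python
from functools import lru_cache

def block_intervals(n, p):
    """Half-open index intervals [lo, hi) assigned to each of p tasks
    when a length-n array is block-distributed over them."""
    size, rem = divmod(n, p)
    intervals, lo = [], 0
    for rank in range(p):
        hi = lo + size + (1 if rank < rem else 0)
        intervals.append((lo, hi))
        lo = hi
    return intervals

@lru_cache(maxsize=None)
def redistribution_schedule(n, p_src, p_dst):
    """Messages (src_rank, dst_rank, lo, hi) required to redistribute a
    length-n block-distributed array from p_src sender tasks to p_dst
    receiver tasks.  Memoizing the result models schedule reuse: a second
    transfer with the same distributions returns the cached schedule."""
    messages = []
    for s, (slo, shi) in enumerate(block_intervals(n, p_src)):
        for d, (dlo, dhi) in enumerate(block_intervals(n, p_dst)):
            # Each sender transfers to each receiver the intersection of
            # their index ranges, if it is nonempty.
            lo, hi = max(slo, dlo), min(shi, dhi)
            if lo < hi:
                messages.append((s, d, lo, hi))
    return messages
```

For example, `redistribution_schedule(10, 2, 3)` yields four messages covering all ten elements exactly once; calling it again with the same arguments returns the cached object rather than recomputing the intersections, which is the essence of the optimization measured in the microbenchmarks.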

Original language: English (US)
Pages (from-to): 148-158
Number of pages: 11
Journal: Journal of Parallel and Distributed Computing
Issue number: 2
State: Published - Sep 15 1997

ASJC Scopus subject areas

  • Software
  • Theoretical Computer Science
  • Hardware and Architecture
  • Computer Networks and Communications
  • Artificial Intelligence

