Communication and memory requirements as the basis for mapping task and data parallel programs

Jaspal Subhlok*, David R. O'Hallaron, Thomas Gross, Peter A. Dinda, Jon Webb

*Corresponding author for this work

Research output: Contribution to journal › Conference article › peer-review


Abstract

For a wide variety of applications, both task and data parallelism must be exploited to achieve the best possible performance on a multicomputer. Recent research has underlined the importance of exploiting task and data parallelism in a single compiler framework, and such a compiler can map a single source program onto a parallel machine in many different ways. The tradeoffs between task and data parallelism are complex and depend on the characteristics of the program to be executed, most significantly its memory and communication requirements, and on the performance parameters of the target parallel machine. In this paper, we present a framework to isolate and examine the specific characteristics of programs that determine performance under different mappings. Our focus is on applications that process a stream of input and whose computation structure is fairly static and predictable. We describe three such applications that were developed with our compiler: fast Fourier transforms, narrowband tracking radar, and multibaseline stereo. We examine the tradeoffs between various mappings for these applications and show how the framework is used to obtain efficient mappings.
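The abstract's central tradeoff can be made concrete with a small analytical model. The following is a minimal sketch, not taken from the paper: it assumes a hypothetical pipeline of stages, each described by a compute time and a toy communication-overhead function, and compares the steady-state interval between finished inputs under a pure data-parallel mapping (all processors cooperate on each stage in turn) against a task-parallel pipeline mapping (processors are partitioned among stages). All names and parameters are illustrative; a fuller model along the paper's lines would also check per-node memory requirements against machine capacity.

```python
# Minimal sketch (not from the paper): compare two mappings of a k-stage
# stream-processing pipeline onto p processors using a toy cost model.
# Stage compute times and the communication-overhead function are
# hypothetical placeholders, not measured values.

def data_parallel_interval(stage_times, p, comm):
    """All p processors run each stage in turn on one input.
    The interval between finished inputs is the sum of the parallelized
    stage times plus a per-stage redistribution overhead comm(p)."""
    return sum(t / p + comm(p) for t in stage_times)

def task_parallel_interval(stage_times, p, comm):
    """Stages form a pipeline; the p processors are split evenly among
    stages. The steady-state interval between finished inputs is set by
    the slowest stage (the bottleneck), including the overhead of
    forwarding results to the next stage's processor group."""
    group = max(p // len(stage_times), 1)
    return max(t / group + comm(group) for t in stage_times)

if __name__ == "__main__":
    stages = [8.0, 8.0, 8.0]        # hypothetical per-stage compute times
    comm = lambda n: 0.5 + 0.1 * n  # toy overhead: latency plus growth with n
    for p in (3, 6, 12, 24):
        dp = data_parallel_interval(stages, p, comm)
        tp = task_parallel_interval(stages, p, comm)
        print(f"p={p:2d}  data-parallel {dp:6.2f}  task-parallel {tp:6.2f}")
```

Under these toy parameters the pipelined mapping sustains a shorter interval between outputs as p grows, because each input pays the redistribution overhead only within a smaller processor group; with different compute-to-communication ratios or memory limits the balance shifts, which is exactly the sensitivity the paper's framework is designed to capture.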

Original language: English (US)
Pages (from-to): 330-339
Number of pages: 10
Journal: Proceedings of the ACM/IEEE Supercomputing Conference
State: Published - 1994
Event: Proceedings of the 1994 Supercomputing Conference - Washington, DC, USA
Duration: Nov 14, 1994 to Nov 18, 1994

ASJC Scopus subject areas

  • Electrical and Electronic Engineering
