A Parallel Complexity Model for Functional Languages

Guy Blelloch and John Greiner
CMU-CS-94-196

100k compressed postscript

Abstract: A complexity model based on the lambda-calculus with an appropriate operational semantics is presented and related to various parallel machine models, including the PRAM and hypercube models. The model is used to study parallel algorithms in the context of "sequential" functional languages, and to relate these results to algorithms designed directly for parallel machine models. For example, the paper shows that equally good upper bounds can be achieved for merging two sorted sequences in the pure lambda-calculus (with some arithmetic constants) as in the EREW PRAM, when both are mapped onto a more realistic machine such as a hypercube or butterfly network. In particular, for n keys and p processors, both yield an O(n/p + log^2 p) time algorithm. These results argue that it is possible to get good parallelism in functional languages without adding explicitly parallel constructs. In fact, the lack of random access seems to be a bigger problem than the lack of parallelism.
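The merging result above comes from a divide-and-conquer recursion whose two subproblems are independent. The following Haskell sketch (not taken from the report) illustrates that recursion pattern only: `par` and `pseq` from GHC's base library merely hint that the two halves may be evaluated in parallel, and the list operations (`span`, `splitAt`, `!!`) stand in for the logarithmic-time splits on balanced sequence structures that the stated O(n/p + log^2 p) bound assumes.

import GHC.Conc (par, pseq)
import Data.List (insert)

-- Divide-and-conquer merge of two sorted lists: split xs at its midpoint,
-- split ys around that pivot, and merge the two halves independently.
pmerge :: Ord a => [a] -> [a] -> [a]
pmerge []  ys = ys
pmerge xs  [] = xs
pmerge [x] ys = insert x ys                  -- base case: insert a single key
pmerge xs  ys =
  let mid      = length xs `div` 2
      pivot    = xs !! mid
      (xl, xr) = splitAt mid xs              -- xr begins with the pivot
      (yl, yr) = span (< pivot) ys           -- yl < pivot <= yr
      left     = pmerge xl yl
      right    = pmerge xr yr
  in left `par` (right `pseq` (left ++ right))

main :: IO ()
main = print (pmerge [1,3,5,7,9] [2,4,6,8,10 :: Int])

The two recursive calls touch disjoint parts of the input, which is what lets a purely functional program expose this parallelism without any explicitly parallel constructs.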

@techreport{complexity,
	author = "Guy~E. Blelloch and John Greiner",
	title = "A Parallel Complexity Model for Functional Languages",
	institution = "School of Computer Science, Carnegie Mellon University",
	number = "CMU-CS-94-196",
	month = oct,
	year = 1994 }