Functional languages are referentially transparent and have no implicit
state. This property has been exploited in various ways to obtain
parallelism; the languages below do so by performing computations in
parallel before their results are known to be required.
- Multilisp. Lisp with futures. A future is a promise to provide the
value of a computation if needed; pointers to the future may be freely
manipulated, and the run-time system decides when futures should
create new processes (a futures sketch in Haskell follows this list).
An introduction to Multilisp [Hal85], and developments in Multilisp,
particularly the difficulties of implementing both futures and
explicit continuations (call/cc) [Hal90].
- Qlisp. Provides qlet and qlambda constructs for spawning parallel
processes; the user supplies predicates that determine whether these
should create new processes (a qlet-style sketch also follows this
list) [GG88].
- Mul-T. Similar to Multilisp, but code is compiled rather than
interpreted; has improved task management/creation [KHM89].
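
As a very rough illustration (not Multilisp itself), a future can be
sketched in Haskell as a write-once cell filled by a forked thread;
the name `future` and the example computation are made up for this
note, and genuinely parallel execution needs GHC's -threaded runtime
(+RTS -N):

    import Control.Concurrent (forkIO)
    import Control.Concurrent.MVar (MVar, newEmptyMVar, putMVar, readMVar)

    -- A future as a write-once cell: the forked thread fills it in when
    -- the computation finishes; touching the future blocks only if the
    -- value is not ready yet.
    future :: IO a -> IO (MVar a)
    future act = do
      cell <- newEmptyMVar
      _ <- forkIO $ do
        x <- act
        putMVar cell $! x   -- force to WHNF so the work happens in this thread
      return cell

    main :: IO ()
    main = do
      f <- future (return (sum [1 .. 1000000 :: Integer]))  -- start speculatively
      -- ... other work could go here ...
      v <- readMVar f   -- "touch" the future; blocks until the value arrives
      print v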
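
Qlisp's predicate-controlled spawning can likewise be approximated;
the sketch below uses GHC's `par`/`pseq` sparks rather than Qlisp
processes, and `qlet2` is a hypothetical helper, not Qlisp syntax:

    import GHC.Conc (par, pseq)

    -- qlet-like helper: evaluate the two bindings in parallel only when
    -- the caller's predicate says the work is worth a spark; otherwise
    -- stay sequential.
    qlet2 :: Bool -> a -> b -> (a -> b -> c) -> c
    qlet2 parallelise x y body
      | parallelise = x `par` (y `pseq` body x y)
      | otherwise   = body x y

    fib :: Integer -> Integer
    fib n
      | n < 2     = n
      | otherwise = qlet2 (n > 20) (fib (n - 1)) (fib (n - 2)) (+)

    main :: IO ()
    main = print (fib 30)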
A functional program can be represented as a graph of
combinators [Tur79]. Reduction of this graph has proven to lead to
efficient schemes for sequential evaluation of lazy functional
programs. Proponents claim that graph reduction is inherently
parallel, but there has been little in the way of efficient parallel
implementations (a toy combinator reducer is sketched after the
references below).
- MIMD graph reduction [Pey89].
- Graphinators and SIMD graph reduction [HM88].
- Everything you wanted to know about lazy functional
languages but were afraid to ask [Pey87].
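
As a toy illustration of the combinator reduction [Tur79] describes,
here is a sequential S/K/I rewriter in Haskell; it copies subterms
where a real graph reducer would share them, and `step`/`normalise`
are just illustrative names:

    -- Turner-style combinator terms; reduction rewrites the
    -- leftmost-outermost redex, the sequential core of a graph-reduction
    -- machine.  A real implementation shares subgraphs (S duplicates
    -- pointers, not subterms); this toy version copies instead.
    data Term = S | K | I | App Term Term
      deriving Show

    -- One leftmost-outermost reduction step, if any redex exists.
    step :: Term -> Maybe Term
    step (App I x)                 = Just x
    step (App (App K x) _)         = Just x
    step (App (App (App S f) g) x) = Just (App (App f x) (App g x))
    step (App f x) =
      case step f of
        Just f' -> Just (App f' x)
        Nothing -> App f <$> step x
    step _ = Nothing

    -- Reduce to normal form by repeated stepping.
    normalise :: Term -> Term
    normalise t = maybe t normalise (step t)

    main :: IO ()
    main = print (normalise (App (App (App S K) K) I))  -- S K K I reduces to I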
Dataflow languages are single-assignment languages, the idea being to
execute them in a data-driven manner on (usually) special-purpose
hardware.
- Motivation for the dataflow approach [Arv89].
- Id. A dataflow language based on speculative computation, with
I-structures and M-structures [Nik90] (a rough I-structure sketch
follows this list).
- Sisal. Discusses the language, the compiler, the runtime system, and
performance numbers [FCO90].
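
As a loose approximation (not Id's actual runtime), an I-structure can
be mimicked in Haskell by a write-once cell whose readers block until
the single write arrives; `ICell` and its operations are invented
names for this sketch:

    import Control.Concurrent (forkIO, threadDelay)
    import Control.Concurrent.MVar (MVar, newEmptyMVar, readMVar, tryPutMVar)

    -- A rough I-structure cell: reads block until the single write
    -- arrives, and a second write violates the single-assignment
    -- discipline.
    newtype ICell a = ICell (MVar a)

    newICell :: IO (ICell a)
    newICell = ICell <$> newEmptyMVar

    writeICell :: ICell a -> a -> IO ()
    writeICell (ICell m) x = do
      ok <- tryPutMVar m x
      if ok then return () else ioError (userError "I-structure written twice")

    readICell :: ICell a -> IO a
    readICell (ICell m) = readMVar m   -- blocks until the producer has written

    main :: IO ()
    main = do
      cell <- newICell
      -- Producer writes once; the consumer is data-driven and runs as
      -- soon as the value becomes available.
      _ <- forkIO (threadDelay 10000 >> writeICell cell (42 :: Int))
      readICell cell >>= print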
Guy.Blelloch@cs.cmu.edu, July 1994