COMPUTATION STRUCTURES GROUP

Arvind, Charles W. and Jennifer C. Johnson Professor of Electrical Engineering and Computer Science
Jack Dennis, Professor of Electrical Engineering and Computer Science, Emeritus

The Computation Structures group (CSG) researches high-speed, general-purpose parallel computing. One of our goals is to guide the architecture of parallel computers by a language/programming model that is general enough to support a large class of applications. Our research also emphasizes the effective management of machine resources on large multiprocessors.

One example is the recently completed Monsoon project, conducted jointly with Motorola. Monsoon marries a dataflow processor with Id, a declarative programming language designed and implemented at LCS. At Id's core is a functional programming language with non-strict semantics, higher-order functions, a polymorphic type system, and powerful constructors for building lists and arrays. Id also provides a novel way of dealing with state in the form of I-structures and M-structures.
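The following sketch illustrates the flavor of these features. It is written in Haskell rather than Id (Haskell is the base of the pH work described later and shares Id's functional core, though its syntax differs), and the names naturals, pairwise, and squares are invented for the example rather than taken from any Id program.

import Data.Array (Array, listArray, elems)

-- Non-strict semantics: only the demanded prefix of this infinite list is ever built.
naturals :: [Integer]
naturals = [0 ..]

-- A higher-order, polymorphic function: f and the element types are arbitrary.
pairwise :: (a -> a -> b) -> [a] -> [b]
pairwise f xs = zipWith f xs (tail xs)

-- An array built by comprehension, in the spirit of Id's array constructors.
squares :: Int -> Array Int Int
squares n = listArray (0, n) [ i * i | i <- [0 .. n] ]

main :: IO ()
main = do
  print (take 5 naturals)                 -- [0,1,2,3,4]
  print (take 4 (pairwise (+) naturals))  -- [1,3,5,7]
  print (elems (squares 5))               -- [0,1,4,9,16,25]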

Greg Papadopoulos, Associate Professor of Electrical Engineering and Computer Science
Jack Dennis
Andy Boughton, Research Associate
Arvind
Yuli Zhou, Research Associate

The Monsoon processor architecture uses dataflow principles to address the basic issues of memory access latency and the synchronization of concurrent threads of control. This is done efficiently using multithreading, split-phase operations, and hardware support for synchronization. Multithreading allows the processor to execute instructions of different threads while one thread is suspended by a remote memory access. Special memory operations are implemented to support synchronization between producers and consumers of data structures.
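Monsoon provides these producer/consumer operations in hardware, which the text does not spell out in code. As a rough software analogue only, the following Haskell sketch uses an MVar as a write-once slot: a consumer that reads before the producer has written simply blocks until the value arrives, with no explicit polling or locking in the program text.

import Control.Concurrent (forkIO, threadDelay)
import Control.Concurrent.MVar (newEmptyMVar, putMVar, readMVar)

main :: IO ()
main = do
  -- An empty MVar stands in for an unwritten data-structure element.
  slot <- newEmptyMVar

  -- Consumer: issues its read at once; it blocks until the slot is filled.
  _ <- forkIO $ do
    v <- readMVar slot
    putStrLn ("consumer received: " ++ show (v :: Int))

  -- Producer: a deliberate delay stands in for a long-latency computation
  -- or remote memory access, after which the slot is written exactly once.
  threadDelay 100000
  putMVar slot 42

  -- Give the consumer thread time to print before the program exits.
  threadDelay 100000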

In contrast to Id's declarative programming style, most parallel programming uses conventional imperative languages extended with constructs for parallel execution or with a library of parallel programming operations. Since the programmer must partition the program into parallel segments, optimize the partition, and insert synchronization to ensure determinacy, parallel programming in an imperative language is much more complex than programming a single-processor computer. In a declarative language, however, parallelism is specified implicitly, and the programmer is freed from these concerns, even when using existing algorithms.
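A small, purely functional example (not drawn from the source; the name treeSum is illustrative) shows why: the two recursive calls below share no state, so a compiler for an implicitly parallel language is free to evaluate them concurrently, and determinacy is guaranteed without any programmer-inserted synchronization.

-- Divide-and-conquer sum; the two recursive calls are independent.
treeSum :: [Int] -> Int
treeSum []  = 0
treeSum [x] = x
treeSum xs  = treeSum left + treeSum right
  where (left, right) = splitAt (length xs `div` 2) xs

main :: IO ()
main = print (treeSum [1 .. 1000])   -- 500500, whatever the evaluation order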

An international group of computer scientists interested in declarative programming has recently joined forces to define a new common functional programming language, named Haskell. CSG is now melding Id's significant features into an extended "parallel Haskell," known as pH, to disseminate implicit parallel programming more effectively.

Much of our work focuses on developing compilation techniques for implicitly parallel programs expressed in languages such as pH. The goal is to give the programmer freedom in devising abstractions for structuring applications without slowing their execution. Id's mathematically clean semantics makes it possible to perform significant program transformations over large sections of code.
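One standard transformation of this kind, given here as an illustration rather than as a transformation claimed by the source (the names scale, clip, pipelineA, and pipelineB are invented for the example), is fusing two list traversals into one. The rewrite is valid only because the functions involved are side-effect free, which a declarative language guarantees.

scale, clip :: Int -> Int
scale x = 2 * x
clip  x = min x 100

-- Before: two passes over the list, with an intermediate list in between.
pipelineA :: [Int] -> [Int]
pipelineA xs = map clip (map scale xs)

-- After: one pass and no intermediate list; behavior is unchanged because
-- scale and clip have no side effects.
pipelineB :: [Int] -> [Int]
pipelineB = map (clip . scale)

main :: IO ()
main = print (pipelineA [1 .. 60] == pipelineB [1 .. 60])   -- True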

We are collaborating with Motorola on the *T (Start) project, which strives to match a programming model similar to that of the Monsoon multiprocessor to a parallel machine built from processors that have more in common with conventional (RISC or superscalar) architectures. The project is designed to exploit the best combination of von Neumann and dataflow principles.