Complexity theory, the study of the fundamental laws and limitations that govern computations, is not only an inherently rich mathematical subject but one whose conclusions and methodology are increasingly relevant to practical computational problems. Professor Valiant's research is focused on two application areas, machine learning and parallel computation, as well as the core area.
A machine with a learning capability can augment its knowledge by interacting with its environment; no programmer need understand the machine's current, possibly unmanageably complicated, state of knowledge. The goals of Professor Valiant's current research are to derive models that capture the phenomenon of learning, particularly the learning of concepts from examples, and to use these models to identify efficient learning algorithms as well as the ultimate limits of computational learning. In the second application area, parallel computation, the central question is how efficient and easy to program general-purpose machines can be made. Here Professor Valiant and his colleagues have obtained encouraging results on the possibility of constructing general-purpose multiprocessor computers capable of executing parallel algorithms in close to logically optimal time. They are also searching for efficient or optimal parallel algorithms for important computational problems.
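The learning-of-concepts-from-examples setting mentioned above can be illustrated with a small sketch. This is not taken from the text: it is a standard illustration, in the spirit of Valiant's model, of learning a monotone conjunction of Boolean variables from positive examples; the target concept and the data below are hypothetical.

```python
# Sketch (illustrative, not the text's construction): learn a monotone
# conjunction over n Boolean variables from positive examples alone.

def learn_conjunction(positive_examples, n):
    """Start with the conjunction of all n variables and drop any
    variable that is 0 in some positive example; what remains is the
    most specific conjunction consistent with the examples seen."""
    hypothesis = set(range(n))
    for example in positive_examples:
        hypothesis -= {i for i in hypothesis if example[i] == 0}
    return hypothesis

def predict(hypothesis, x):
    # The hypothesis labels x positive iff every retained variable is 1.
    return all(x[i] == 1 for i in hypothesis)

# Hypothetical target concept: x0 AND x2, over n = 4 variables.
positives = [(1, 1, 1, 0), (1, 0, 1, 1), (1, 1, 1, 1)]
h = learn_conjunction(positives, 4)
print(sorted(h))                  # [0, 2] -- variables the learner kept
print(predict(h, (1, 0, 1, 0)))   # True
print(predict(h, (0, 1, 1, 1)))   # False: x0 = 0
```

Each positive example can only remove variables, so the hypothesis never misclassifies a positive example it has seen, and with enough random examples it is, with high probability, approximately correct on unseen ones.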
In the core area of complexity theory, relationships of surprising generality can sometimes be obtained. For example, relationships have been found between the difficulty of finding solutions to combinatorial problems when a unique solution is guaranteed and the more general case when it is not; other work relates the problem of randomly generating one solution, from among possibly exponentially many, to the problem of counting those solutions.
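The connection between generating one solution at random and counting all of them can be made concrete with a toy sketch. This is an assumption-laden illustration, not the text's construction: a brute-force counter stands in for an efficient counting procedure, and the predicate below is hypothetical.

```python
import random

def count_extensions(pred, n, prefix):
    """Brute-force stand-in for a counting oracle: count the
    assignments of n bits extending `prefix` that satisfy pred."""
    free = n - len(prefix)
    total = 0
    for bits in range(2 ** free):
        tail = tuple((bits >> i) & 1 for i in range(free))
        if pred(prefix + tail):
            total += 1
    return total

def sample_solution(pred, n, rng=random):
    """Fix each variable in turn with probability proportional to the
    number of satisfying completions; the result is uniformly
    distributed over all satisfying assignments (None if none exist)."""
    prefix = ()
    if count_extensions(pred, n, prefix) == 0:
        return None
    for _ in range(n):
        c0 = count_extensions(pred, n, prefix + (0,))
        c1 = count_extensions(pred, n, prefix + (1,))
        prefix += (1,) if rng.random() * (c0 + c1) < c1 else (0,)
    return prefix

# Hypothetical predicate: exactly two of four bits are set.
pred = lambda x: sum(x) == 2
print(sample_solution(pred, 4))   # e.g. (0, 1, 1, 0)
```

The point of the sketch is the direction of the reduction: an ability to count solutions of the subproblems obtained by fixing variables yields uniform random generation, which is one face of the counting-generation relationship described above.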