 My interests include <a href="http://cs-tr.cs.cornell.edu/TR/Search/?publisher=CORNELLCS&series=&number=&author=Donald%2C+Bruce&title=&abstract="> robotics,</a> <a href="#mems"> microelectromechanical systems,</a> <a href="ftp://ftp.cs.cornell.edu/pub/brd/geometry.html"> geometric algorithms,</a> and artificial intelligence.  <a href="#information-invariants"> Robotics is the science that seeks to forge an intelligent, computational connection between perception and action.</a>  Working with graduate student Jim Jennings, research associate <a href="http://simlab.cs.cornell.edu/people/daniela.rus.html"> Daniela Rus,</a> graduate student <a href="http://www.cs.cornell.edu/Info/People/rbrown/rbrown.html"> Russell Brown</a>, and lab alumnus <a href="http://www-swiss.ai.mit.edu/~jar/jar.html"> Jonathan Rees</a> (now at MIT), we developed a <a href="#scheme"> team of autonomous mobile robots</a> that can perform sophisticated <a href="#distributed-manipulation"> distributed manipulation tasks</a> (such as <a href="#iros95"> moving furniture</a>).  The robots run robust SPMD protocols that are completely asynchronous and require no communication. With graduate student <a href="http://www.cs.cornell.edu/Info/People/karl/home.html"> Karl B&ouml;hringer</a> and EE Professor Noel MacDonald, we are building a <a href="#mems"> massively parallel array of microactuators</a> in the Cornell National Nanofabrication Laboratory. <a href="http://www.cs.cornell.edu/Info/People/karl/Cinema/"> The array is a SCREAM chip containing over 11,000 actuators in 1 square centimeter,</a> and it can orient small parts without sensory feedback. Our microfabricated actuator arrays could be used to construct programmable parts-feeders (at any scale), or to build self-propelled ICs (walking VLSI chips).
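To give a feel for the SPMD (single program, multiple data) style mentioned above, here is a minimal, hypothetical sketch, not the lab's actual code: every robot runs the identical program, wakes up asynchronously, and decides to push based only on its own local reading of the object's progress toward a goal, with no messages exchanged. All names and parameters (<code>push_if_lagging</code>, <code>simulate</code>, the unit push model) are illustrative assumptions.

```python
import random

def push_if_lagging(couch_x, goal_x):
    """The identical per-robot program: push one unit forward if this
    robot's local sensing says the object has not yet reached the goal."""
    return 1 if couch_x < goal_x else 0

def simulate(num_robots=2, goal_x=10, seed=0):
    """Asynchronous execution: at each tick an arbitrary robot acts.
    No robot ever communicates; progress emerges from the shared program."""
    rng = random.Random(seed)
    couch_x = 0
    steps = 0
    while couch_x < goal_x:
        rng.randrange(num_robots)  # which robot wakes up is arbitrary
        couch_x += push_if_lagging(couch_x, goal_x)
        steps += 1
    return couch_x, steps

final_x, steps = simulate()
```

Because the termination test is local (each robot stops pushing once its own reading says the goal is reached), the protocol tolerates any interleaving of robot wake-ups.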
Graduate student <a href="http://www.cs.cornell.edu/Info/People/briggs/briggs.html"> Amy Briggs</a> worked with Dan Huttenlocher's vision group to develop <a href="#amy"> a sensor planning and surveillance system for a team of mobile robots.</a> The robots use on-board vision to detect and intercept targets in the lab.  <h2>
