While recent years have seen dramatic progress in the development of affordable, general-purpose robot hardware, the capabilities of that hardware far exceed our ability to write software that adequately controls it. The key challenge here is one of abstraction: generally capable behavior requires high-level reasoning and planning, but perception and actuation must ultimately be performed using noisy, high-bandwidth, low-level sensors and effectors. I will describe a research program aimed at constructing robot control hierarchies through the use of learned motor skills. The first part of my talk will address methods for automatically discovering, learning, and reusing motor skills, both autonomously and via demonstration. The second part of the talk will establish a link between the skills available to a robot and the abstract representations it should use to plan with them. I will present an example of a robot autonomously learning a (sound and complete) abstract representation directly from sensorimotor data, and then using it to plan.
George Konidaris is an Assistant Professor of Computer Science at Brown and Chief Roboticist of Realtime Robotics, a startup commercializing his work on hardware-accelerated motion planning. He holds a BScHons from the University of the Witwatersrand, an MSc from the University of Edinburgh, and a PhD from the University of Massachusetts Amherst. Prior to joining Brown, he held a faculty position at Duke and was a postdoctoral researcher at MIT. George is a recent recipient of young faculty awards from DARPA and the AFOSR.
Faculty Host: Oliver Kroemer