With an ever-growing demand to automate different day-to-day activities, the task of autonomous manipulation using articulated robots has recently gained serious traction. In this regard, motion planning for manipulation is a highly researched topic. Motion planning for manipulation is often cast either as a model-based planning problem or as a machine learning problem. However, both of these approaches have their own shortcomings. Model-based planning, as the name suggests, requires a good understanding of the model to produce reliable plans, and developing a good model can often be laborious and intractable. While learning techniques do not suffer from this problem, they often require vast amounts of quality training data to generalize well over the entire state-space. In this work, we address these problems by developing a planning framework that overcomes the drawbacks of model-based planning by leveraging learned skills.
We develop a search-based planning framework that combines skills learned from demonstrations with an a priori model to produce the best possible plan. We validate the performance of our framework by applying it to the domain of full-body motion planning for a 12-DoF articulated mobile robot. We show that our framework is capable of generating robust plans that could not otherwise be generated. The specific problem that we look at in this thesis is the manipulation of articulated objects whose models are unknown but for which the skills needed to interact with them are known.
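The combination of an a priori model with learned skills can be pictured as a best-first search whose successor function draws actions from both sources. The sketch below is purely illustrative and not the thesis's implementation: integer states stand in for robot configurations, `model_primitives` stands in for model-based motion primitives, and `learned_skills` stands in for a demonstrated macro-action; all names are hypothetical.

```python
import heapq

def model_primitives(state):
    # A priori model: small, well-understood unit-cost steps.
    # (Illustrative stand-in for model-based motion primitives.)
    return [(state + 1, 1.0), (state - 1, 1.0)]

def learned_skills(state):
    # Learned skill (e.g. from demonstration): a macro-action covering
    # several primitive steps, representing an interaction the a priori
    # model does not capture. Hypothetical example.
    return [(state + 5, 3.0)]

def plan(start, goal):
    """Dijkstra-style best-first search whose successor set merges
    model-based primitives with learned skills."""
    frontier = [(0.0, start, [start])]
    best = {start: 0.0}
    while frontier:
        cost, state, path = heapq.heappop(frontier)
        if state == goal:
            return cost, path
        if cost > best.get(state, float("inf")):
            continue  # stale queue entry
        for succ, step_cost in model_primitives(state) + learned_skills(state):
            new_cost = cost + step_cost
            if new_cost < best.get(succ, float("inf")):
                best[succ] = new_cost
                heapq.heappush(frontier, (new_cost, succ, path + [succ]))
    return float("inf"), []
```

In this toy instance, `plan(0, 10)` applies the learned skill twice (cost 6.0 along `[0, 5, 10]`), whereas primitives alone would need ten unit steps: the search exploits the skill when it helps but retains the model-based primitives as a fallback.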
Maxim Likhachev (Chair)
Sung Kyun Kim