Pose from Contact Motion

Yan-Bin Jia and Michael Erdmann


In the absence of vision, grasping an object often relies on tactile feedback from the fingertips. As the finger pushes the object, the fingertip can feel the contact point move. From this motion the finger can infer the location of the contact point on the object and thereby the object pose. This paper primarily investigates the problem of determining the pose (orientation and position) of a planar object from such contact motion generated by pushing.

A dynamic analysis of pushing yields a nonlinear system that relates, through contact, the pose and motion of the object to the motion of the finger. The contact motion on the fingertip thus encodes certain information about the object pose. Nonlinear observability theory is employed to show that this information is sufficient for the finger to ``observe'' not only the pose but also the motion of the object. A sensing strategy can therefore be realized as an observer of the nonlinear dynamical system. Two observers are subsequently introduced. The first observer, based on the result of~\cite{GauthierHO92}, has its ``gain'' determined by the solution of a Lyapunov-like equation; it can be activated at any time instant during a push. The second observer, based on Newton's method, solves for the initial (motionless) object pose from three intermediate contact points during a push.
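The core of the second observer, solving a system of nonlinear constraint equations by Newton's method, can be sketched generically as follows. The constraint function below is a hypothetical two-equation stand-in (the paper's actual constraints are derived from three contact points during a push), and the finite-difference Jacobian replaces whatever analytic Jacobian the real system would use:

```python
import numpy as np

def newton_solve(f, x0, tol=1e-10, max_iter=50):
    """Multivariate Newton's method with a finite-difference Jacobian.

    Repeatedly solves J(x) dx = -f(x) and updates x until the residual
    norm drops below tol."""
    x = np.asarray(x0, dtype=float)
    for _ in range(max_iter):
        fx = f(x)
        if np.linalg.norm(fx) < tol:
            break
        # Finite-difference Jacobian; a stand-in for the analytic
        # Jacobian of the actual contact constraints.
        h = 1e-7
        J = np.empty((len(fx), len(x)))
        for j in range(len(x)):
            xp = x.copy()
            xp[j] += h
            J[:, j] = (f(xp) - fx) / h
        x = x + np.linalg.solve(J, -fx)
    return x

def constraints(x):
    # Toy nonlinear system: a circle intersected with a hyperbola.
    return np.array([x[0]**2 + x[1]**2 - 4.0, x[0] * x[1] - 1.0])

root = newton_solve(constraints, np.array([2.0, 0.3]))
```

As with the observer in the paper, convergence depends on a reasonable initial guess; in the pushing setting that guess would come from prior knowledge of where the finger first touched the object.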

Under the Coulomb friction model, the paper accounts for support friction in the plane and/or contact friction between the finger and the object. Extensive simulations have been carried out to demonstrate the feasibility of the two observers. Preliminary experiments (with an Adept robot) have also been conducted.

Inspired by the way a human hand explores by touch, this work may serve as a first step toward interactive sensing in grasping tasks. From a more general perspective, it presents an approach, built on nonlinear observability theory, for acquiring geometric and dynamical information about a task from a small amount of tactile data.