MidasTouch: Monte-Carlo inference over distributions across sliding touch

Download: PDF.

“MidasTouch: Monte-Carlo inference over distributions across sliding touch” by S. Suresh, Z. Si, S. Anderson, M. Kaess, and M. Mukadam. In Proc. Conf. on Robot Learning, CoRL, (Auckland, New Zealand), Dec. 2022.

Abstract

We present MidasTouch, a tactile perception system for online global localization of a vision-based touch sensor sliding on an object surface. This framework takes in posed tactile images over time, and outputs an evolving distribution of sensor pose on the object's surface, without the need for visual priors. Our key insight is to estimate local surface geometry with tactile sensing, learn a compact representation for it, and disambiguate these signals over a long time horizon. The backbone of MidasTouch is a Monte-Carlo particle filter, with a measurement model based on a tactile code network learned from tactile simulation. This network, inspired by LIDAR place recognition, compactly summarizes local surface geometries. These generated codes are efficiently compared against a precomputed tactile codebook per-object, to update the pose distribution. We further release the YCB-Slide dataset of real-world and simulated forceful sliding interactions between a vision-based tactile sensor and standard YCB objects. While single-touch localization can be inherently ambiguous, we can quickly localize our sensor by traversing salient surface geometries.
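The measurement update described above (reweighting pose hypotheses by comparing a generated tactile code against a precomputed per-object codebook) can be illustrated with a minimal particle-filter sketch. This is not the paper's implementation: the names (`codebook`, `measurement_update`), the cosine-similarity likelihood, and all dimensions are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical setup: each particle is a candidate sensor pose on the object
# surface, paired with one entry of a precomputed tactile codebook.
NUM_PARTICLES = 500
CODE_DIM = 64

# Stand-in codebook: one unit-norm embedding per surface location.
codebook = rng.normal(size=(NUM_PARTICLES, CODE_DIM))
codebook /= np.linalg.norm(codebook, axis=1, keepdims=True)

def measurement_update(weights, observed_code, codebook, temperature=0.1):
    """Reweight particles by cosine similarity between the observed
    tactile code and each particle's codebook entry (illustrative model)."""
    observed_code = observed_code / np.linalg.norm(observed_code)
    sims = codebook @ observed_code          # cosine similarities in [-1, 1]
    likelihood = np.exp(sims / temperature)  # softmax-style likelihood
    weights = weights * likelihood
    return weights / weights.sum()

# One filtering step with a simulated noisy observation near particle 42.
weights = np.full(NUM_PARTICLES, 1.0 / NUM_PARTICLES)
observed = codebook[42] + 0.05 * rng.normal(size=CODE_DIM)
weights = measurement_update(weights, observed, codebook)
print(weights.argmax())  # the particle whose code best matches the touch
```

Over a long sliding trajectory, repeated updates of this form concentrate the distribution, which is how ambiguous single-touch signals are disambiguated over time.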

BibTeX entry:

@inproceedings{Suresh22corl,
   author = {S. Suresh and Z. Si and S. Anderson and M. Kaess and M. Mukadam},
   title = {{MidasTouch}: Monte-{C}arlo inference over distributions
	across sliding touch},
   booktitle = {Proc. Conf. on Robot Learning, CoRL},
   address = {Auckland, New Zealand},
   month = dec,
   year = {2022}
}
Last updated: March 21, 2023