Active Tactile Exploration for Rigid Body Pose and Shape Estimation

Ethan K. Gordon1, Bruke Baraki1, Hien Bui1, Michael Posa1
1University of Pennsylvania

Our system learns the pose and shape of dynamic rigid bodies from under 10 seconds of tactile data.

Abstract

General robot manipulation requires handling previously unseen objects. Learning a physically accurate model at test time can provide significant benefits in data efficiency, predictability, and reuse across tasks. Tactile sensing can complement vision with its robustness to occlusion, but its temporal sparsity necessitates careful online exploration to maintain data efficiency. Direct contact can also cause an unrestrained object to move, requiring both shape and location estimation. In this work, we propose a learning and exploration framework that uses only tactile data to simultaneously determine the shape and location of rigid objects with minimal robot motion. We build on recent advances in contact-rich system identification to formulate a loss function that penalizes physical constraint violation without introducing the numerical stiffness inherent in rigid-body contact. Optimizing this loss, we can learn cuboid and convex polyhedral geometries from less than 10 seconds of randomly collected data after first contact. Our exploration scheme maximizes Expected Information Gain and results in significantly faster learning in both simulated and real-robot experiments.
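
The abstract does not spell out the loss, but a common pattern in contact-implicit system identification is to penalize violations of the rigid-contact conditions softly rather than enforcing them exactly. Below is a minimal Python sketch of that idea; the function name, inputs, and weights are illustrative assumptions, not the paper's formulation.

import numpy as np

def contact_violation_loss(phi, lam, w=(1.0, 1.0, 1.0)):
    # phi: signed distances from the learned geometry to the sensor (m)
    # lam: corresponding normal contact forces (N)
    # Rigid contact requires phi >= 0, lam >= 0, and phi * lam = 0.
    # Smooth quadratic penalties on each violation keep gradients
    # well-scaled, avoiding the numerical stiffness of exact rigid contact.
    penetration = np.sum(np.maximum(0.0, -phi) ** 2)   # no interpenetration
    adhesion = np.sum(np.maximum(0.0, -lam) ** 2)      # forces push, never pull
    complementarity = np.sum((phi * lam) ** 2)         # force only at contact
    return w[0] * penetration + w[1] * adhesion + w[2] * complementarity

Because each term is a smooth penalty rather than a hard constraint, the loss stays differentiable in the geometry and pose parameters, which is what makes gradient-based learning from sparse tactile data tractable.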
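The Expected Information Gain criterion can likewise be sketched under a simplifying assumption of a Gaussian belief over the unknown pose and shape parameters with a locally linear measurement model; the names here (candidate motions, measurement Jacobian) are hypothetical stand-ins rather than the paper's interface.

import numpy as np

def expected_information_gain(Sigma_prior, J, R):
    # For a measurement y = J @ theta + noise (noise covariance R), the
    # mutual information between theta and y is half the log-ratio of
    # the prior to posterior covariance determinants.
    Sigma_post = np.linalg.inv(np.linalg.inv(Sigma_prior) + J.T @ np.linalg.inv(R) @ J)
    return 0.5 * (np.linalg.slogdet(Sigma_prior)[1] - np.linalg.slogdet(Sigma_post)[1])

def select_probe(motions, Sigma_prior, jacobian_of, R):
    # Execute the probing motion whose predicted tactile measurement is
    # expected to be most informative about the unknown parameters.
    scores = [expected_information_gain(Sigma_prior, jacobian_of(u), R) for u in motions]
    return motions[int(np.argmax(scores))]

Greedily selecting the highest-scoring probe, rather than probing at random, is the intuition behind the faster learning reported above.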

Video

BibTeX

@article{gordon2026active,
  title={Active Tactile Exploration for Rigid Body Pose and Shape Estimation},
  author={Gordon, Ethan K. and Baraki, Bruke and Bui, Hien and Posa, Michael},
  journal={arXiv preprint arXiv:XXXX.XXXXX},
  year={2026}
}