TacLoc is a novel tactile localization framework for robotic pose estimation when visual perception is limited. It formulates tactile localization as a one-shot point cloud registration problem and solves it with a graph-theoretic partial-to-full registration method. The framework leverages the dense point clouds and surface normals available from tactile sensing, combined with normal-guided graph pruning and a hypothesis-and-verification pipeline, to achieve efficient and accurate pose estimation without relying on simulation or pre-trained models.
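To make the graph-theoretic registration step concrete, here is a minimal Python sketch of normal-guided compatibility-graph pruning. It is illustrative only, under our own assumptions: putative correspondences are given up front, normals are unit-length, and the function name and tolerances (`dist_tol`, `angle_tol`) are ours, not details from the paper. Each node is a candidate source-to-target correspondence; an edge survives only if both the pairwise point distance and the pairwise normal angle, each invariant under rigid motion, agree between the two clouds.

```python
import numpy as np

def build_pruned_graph(src_pts, src_nrm, tgt_pts, tgt_nrm, corr,
                       dist_tol=0.005, angle_tol=np.deg2rad(10)):
    """Adjacency matrix of a correspondence-compatibility graph (sketch).

    Nodes are putative correspondences (i, j): source point i <-> target
    point j. An edge links two correspondences that could coexist under a
    single rigid transform:
      * their pairwise distances agree (rigid motions preserve distances),
      * the angle between the two source normals matches the angle between
        the two target normals (the normal-guided pruning step).
    Normals are assumed unit-length.
    """
    n = len(corr)
    A = np.zeros((n, n), dtype=bool)
    for a in range(n):
        ia, ja = corr[a]
        for b in range(a + 1, n):
            ib, jb = corr[b]
            # Rigid-invariant distance consistency check.
            d_src = np.linalg.norm(src_pts[ia] - src_pts[ib])
            d_tgt = np.linalg.norm(tgt_pts[ja] - tgt_pts[jb])
            if abs(d_src - d_tgt) > dist_tol:
                continue
            # Normal-angle consistency: angles between normals are also
            # preserved by rigid motion, so inconsistent pairs are pruned.
            ang_src = np.arccos(np.clip(src_nrm[ia] @ src_nrm[ib], -1.0, 1.0))
            ang_tgt = np.arccos(np.clip(tgt_nrm[ja] @ tgt_nrm[jb], -1.0, 1.0))
            if abs(ang_src - ang_tgt) <= angle_tol:
                A[a, b] = A[b, a] = True
    return A
```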
Ditch the pre-trained models: TacLoc achieves accurate robotic pose estimation from tactile sensing alone by framing it as a one-shot point cloud registration problem.
Pose estimation is essential for robotic manipulation, particularly when visual perception is occluded during gripper-object interactions. Existing tactile-based methods generally rely on tactile simulation or pre-trained models, which limits their generalizability and efficiency. In this study, we propose TacLoc, a novel tactile localization framework that formulates the problem as a one-shot point cloud registration task. TacLoc introduces a graph-theoretic partial-to-full registration method that leverages the dense point clouds and surface normals from tactile sensing for efficient and accurate pose estimation. Without requiring rendered data or pre-trained models, TacLoc achieves improved performance through normal-guided graph pruning and a hypothesis-and-verification pipeline. We evaluate TacLoc extensively on the YCB dataset and further demonstrate it on real-world objects with two different visual-tactile sensors.
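The hypothesis-and-verification stage can likewise be sketched as a RANSAC-style loop over the pruned graph. This is again an illustration under our assumptions, not TacLoc's published pipeline: the `kabsch` solver, the triplet sampling, the iteration count, and the inlier tolerance are all ours. Each hypothesis is a mutually compatible correspondence triplet; the rigid pose is estimated in closed form and scored by how many tactile points land on the full model.

```python
import numpy as np

def kabsch(P, Q):
    """Least-squares rigid transform (R, t) mapping points P onto Q (SVD)."""
    cP, cQ = P.mean(0), Q.mean(0)
    H = (P - cP).T @ (Q - cQ)
    U, _, Vt = np.linalg.svd(H)
    # Reflection guard: force det(R) = +1.
    S = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ S @ U.T
    return R, cQ - R @ cP

def hypothesize_and_verify(src_pts, tgt_pts, corr, A, n_iter=500,
                           inlier_tol=0.003, rng=None):
    """RANSAC-style hypothesis-and-verification loop (sketch).

    A hypothesis is a correspondence triplet forming a triangle in the
    pruned compatibility graph A; it is verified by counting how many
    source (tactile) points fall within inlier_tol of the full model
    after the hypothesized transform. Returns the best (R, t, inliers).
    """
    rng = rng or np.random.default_rng(0)
    best = (None, None, -1)
    for _ in range(n_iter):
        idx = rng.choice(len(corr), size=3, replace=False)
        # Keep only mutually compatible triplets (triangles in A).
        if not (A[idx[0], idx[1]] and A[idx[0], idx[2]] and A[idx[1], idx[2]]):
            continue
        P = src_pts[[corr[i][0] for i in idx]]
        Q = tgt_pts[[corr[i][1] for i in idx]]
        R, t = kabsch(P, Q)
        # Verification: per-point nearest-neighbor residual against the
        # full model (brute force here; a KD-tree would be used at scale).
        moved = src_pts @ R.T + t
        d = np.linalg.norm(moved[:, None, :] - tgt_pts[None, :, :], axis=-1)
        inliers = int((d.min(axis=1) < inlier_tol).sum())
        if inliers > best[2]:
            best = (R, t, inliers)
    return best
```

Because the graph pruning has already discarded geometrically inconsistent pairs, far fewer sampled triplets are wasted than in plain RANSAC over raw correspondences, which is the usual motivation for this kind of hypothesis-and-verification design.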