The paper introduces SPLIT, a simulation method for image-based tactile sensors such as DIGIT and GelSight that disentangles contact geometry from sensor-specific optical properties in the latent space. This disentanglement enables adaptation to new sensor backgrounds and transfer learning to different sensor types without full retraining, addressing the challenge of data scarcity in tactile sensing. SPLIT also offers faster inference than existing alternatives and supports bidirectional simulation: generating tactile images from deformation meshes and reconstructing meshes from tactile images.
Simulate once, deploy anywhere: SPLIT lets you train tactile perception models on synthetic data and transfer them across different sensors without retraining.
Training machine learning models for robotic tactile sensing requires vast amounts of data, yet obtaining realistic interaction data remains a challenge due to physical complexity and variability. Simulating tactile sensors is thus a crucial step in accelerating progress. This paper presents SPLIT, a novel method for simulating image-based tactile sensors, with a primary focus on the DIGIT sensor. Central to our approach is a latent space arithmetic strategy that explicitly disentangles contact geometry from sensor-specific optical properties. Unlike methods that require recalibration for every new unit, this disentanglement allows SPLIT to adapt to diverse DIGIT backgrounds and even transfer data to distinct sensors like the GelSight R1.5 without full model retraining. Beyond this adaptability, our approach achieves faster inference speeds than existing alternatives. Furthermore, we provide a calibrated finite element method (FEM) soft-body mesh simulation with variable resolution, offering a tunable trade-off between speed and fidelity. Additionally, our algorithm supports bidirectional simulation, allowing for both the generation of realistic images from deformation meshes and the reconstruction of meshes from tactile images. This versatility makes SPLIT a valuable tool for accelerating progress in robotic tactile sensing research.
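The latent space arithmetic the abstract describes can be sketched under a deliberately simplified additive assumption: a tactile image's latent code decomposes into a contact-geometry component plus a sensor-specific background component, so swapping sensors amounts to subtracting one background code and adding another. All function and variable names below are illustrative, not from the SPLIT implementation.

```python
import numpy as np

# Toy model of SPLIT-style disentanglement: assume (hypothetically) that an
# encoder maps a tactile image to z = z_geometry + z_background, where
# z_background captures the optics of one physical sensor unit.

def swap_sensor_background(z_image, z_bg_src, z_bg_tgt):
    """Re-target a latent code from a source sensor to a target sensor
    by removing the source background code and adding the target's."""
    return z_image - z_bg_src + z_bg_tgt

rng = np.random.default_rng(0)
dim = 16
z_geometry = rng.normal(size=dim)      # contact-geometry component
z_bg_digit = rng.normal(size=dim)      # one DIGIT unit's optical signature
z_bg_gelsight = rng.normal(size=dim)   # a GelSight R1.5 optical signature

z_digit = z_geometry + z_bg_digit      # latent code of a DIGIT image
z_transferred = swap_sensor_background(z_digit, z_bg_digit, z_bg_gelsight)

# Under the additive assumption, the geometry component survives the swap.
assert np.allclose(z_transferred - z_bg_gelsight, z_geometry)
```

The appeal of such a decomposition is that adapting to a new sensor unit only requires estimating its background code, not retraining the full model.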