This paper introduces a proprioceptive-only state estimation framework for legged robots that characterizes measurement noise with set-coverage statements rather than assuming a Gaussian distribution. The authors learn a dynamics model from the history of joint-level measurements to produce navigational measurements, then fuse these with IMU data. Because the set-coverage statements make no distributional assumption, the method remains consistent and resists drift where Gaussian-based approaches degrade, particularly under real-world noise conditions.
Legged robots can navigate more reliably with noisy sensors thanks to a new state estimator that avoids Gaussian noise assumptions.
Proprioceptive-only state estimation is attractive for legged robots since it is computationally cheap and unaffected by perceptually degraded conditions. The history of joint-level measurements contains rich information that can be used to infer the dynamics of the system and subsequently produce navigational measurements. Recent approaches produce these estimates with learned measurement models and fuse them with IMU data under a Gaussian noise assumption. However, this assumption can easily break down with limited training data and render the estimates inconsistent and potentially divergent. In this work, we propose a proprioceptive-only state estimation framework for legged robots that characterizes the measurement noise using set-coverage statements that do not assume any distribution. We develop a practical and computationally inexpensive method to use these set-coverage measurements with a Gaussian filter in a systematic way. We validate the approach in simulation and on two real-world quadrupedal datasets. Comparison with Gaussian baselines shows that our proposed method remains consistent and is not prone to drift under real noise scenarios.
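The abstract does not spell out how a distribution-free coverage statement is consumed by a Gaussian filter, so the sketch below is one plausible reading, not the paper's actual algorithm: calibrate a (1 - α) coverage radius from held-out residuals of a learned measurement model (in the style of split conformal prediction), convert that radius to an equivalent Gaussian variance by matching interval half-widths, and feed it to a scalar Kalman update. All function names, the calibration data, and the interval-to-variance mapping are illustrative assumptions.

```python
import math
import statistics

def coverage_radius(residuals, alpha=0.1):
    """Distribution-free (1 - alpha) coverage radius from held-out absolute
    residuals of a learned measurement model (split-conformal-style quantile).
    Hypothetical calibration step; the paper's construction may differ."""
    scores = sorted(abs(r) for r in residuals)
    n = len(scores)
    # Conformal quantile index, clipped to the largest score.
    k = min(n - 1, math.ceil((n + 1) * (1 - alpha)) - 1)
    return scores[k]

def equivalent_variance(radius, alpha=0.1):
    """Map the coverage radius to the variance of a Gaussian whose central
    (1 - alpha) interval has the same half-width, so a standard Gaussian
    filter can consume the set-coverage statement. Assumed mapping."""
    z = statistics.NormalDist().inv_cdf(1 - alpha / 2)
    return (radius / z) ** 2

def kalman_update(x, P, z_meas, R):
    """Scalar Kalman measurement update with the derived noise variance R."""
    K = P / (P + R)          # Kalman gain
    x_new = x + K * (z_meas - x)
    P_new = (1.0 - K) * P
    return x_new, P_new

# Example: calibrate on synthetic residuals, then fuse one measurement.
residuals = [i / 100 for i in range(1, 101)]   # placeholder calibration set
r = coverage_radius(residuals, alpha=0.1)      # 90%-coverage radius
R = equivalent_variance(r, alpha=0.1)          # matched Gaussian variance
x, P = kalman_update(0.0, 1.0, 1.0, R)
```

The appeal of this kind of mapping is that the filter machinery stays unchanged: only the measurement covariance is replaced by a quantity backed by a finite-sample coverage guarantee instead of a fitted Gaussian.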