This paper introduces a stereo visual SLAM system for lunar rovers that integrates learned feature detection/matching with global constraints from Digital Elevation Models (DEMs). The front-end uses learned features to improve robustness to illumination and repetitive terrain, while the back-end incorporates DEM-derived height and surface-normal factors into a pose graph. Validated on simulated and real Moon/Mars analog data, the system demonstrates reduced absolute trajectory error compared to baseline SLAM, mitigating long-term drift in challenging lunar environments.
A new stereo SLAM system anchors visual odometry to readily available Digital Elevation Models, correcting drift so lunar rovers can localize accurately over long traverses.
Future lunar missions will require autonomous rovers capable of traversing tens of kilometers across challenging terrain while maintaining accurate localization and producing globally consistent maps. However, the absence of global positioning systems, extreme illumination, and low-texture regolith make long-range navigation on the Moon particularly difficult, as visual-inertial odometry pipelines accumulate drift over extended traverses. To address this challenge, we present a stereo visual simultaneous localization and mapping (SLAM) system that integrates learned feature detection and matching with global constraints from digital elevation models (DEMs). Our front-end employs learning-based feature extraction and matching to achieve robustness to illumination extremes and repetitive terrain, while the back-end incorporates DEM-derived height and surface-normal factors into a pose graph, providing absolute surface constraints that mitigate long-term drift. We validate our approach using both simulated lunar traverse data generated in Unreal Engine and real Moon/Mars analog data collected from Mt. Etna. Results demonstrate that DEM anchoring consistently reduces absolute trajectory error compared to baseline SLAM methods, lowering drift in long-range navigation even in repetitive or visually aliased terrain.
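The core back-end idea, anchoring pose estimates to DEM-derived height, can be illustrated with a minimal residual function. The sketch below is hypothetical (not the paper's implementation): it bilinearly interpolates a toy DEM at a pose's (x, y) position and returns a whitened height residual of the kind that would be added as a unary factor in a pose graph. The function names, grid layout, and noise model are illustrative assumptions.

```python
# Hypothetical sketch of a DEM height factor, not the paper's code.
# The residual penalizes disagreement between a pose's estimated
# altitude and the DEM height at its (x, y) position.
import numpy as np

def dem_height(dem, resolution, x, y):
    """Bilinearly interpolate DEM height at metric coordinates (x, y)."""
    col, row = x / resolution, y / resolution
    c0, r0 = int(np.floor(col)), int(np.floor(row))
    fc, fr = col - c0, row - r0
    return (dem[r0, c0] * (1 - fr) * (1 - fc)
            + dem[r0, c0 + 1] * (1 - fr) * fc
            + dem[r0 + 1, c0] * fr * (1 - fc)
            + dem[r0 + 1, c0 + 1] * fr * fc)

def height_factor_residual(pose_xyz, dem, resolution, sigma=0.5):
    """Whitened residual of the DEM height constraint for one pose.

    sigma (in meters) models DEM vertical uncertainty; the residual is
    driven toward zero by the pose-graph optimizer.
    """
    x, y, z = pose_xyz
    return (z - dem_height(dem, resolution, x, y)) / sigma

# Toy DEM: a plane tilted along x, on a 1 m grid.
dem = np.fromfunction(lambda r, c: 0.1 * c, (4, 4))
# A pose lying exactly on the DEM surface yields a zero residual.
r = height_factor_residual((1.5, 1.0, 0.15), dem, resolution=1.0)
```

In a full system, each keyframe pose would contribute one such unary factor alongside the usual stereo odometry factors; a surface-normal factor could be built the same way from the DEM's local gradient.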