The paper introduces SLAM Adversarial Lab (SAL), a modular framework for evaluating the robustness of visual SLAM systems against adversarial conditions like fog and rain. SAL transforms existing datasets into adversarial datasets using perturbations with severity levels defined in real-world units. The framework's extensible architecture decouples datasets, perturbations, and SLAM algorithms, enabling easy integration of new components and automated search for failure points.
Find the exact level of fog, rain, or camera distortion that will break your visual SLAM system with this new framework.
We present SAL (SLAM Adversarial Lab), a modular framework for evaluating visual SLAM systems under adversarial conditions such as fog and rain. SAL represents each adversarial condition as a perturbation that transforms an existing dataset into an adversarial dataset. When transforming a dataset, SAL supports severity levels in easily interpretable real-world units, such as meters of fog visibility. SAL's extensible architecture decouples datasets, perturbations, and SLAM algorithms through common interfaces, so users can add new components without rewriting integration code. Moreover, SAL includes a search procedure that finds the severity level of a perturbation at which a SLAM system fails. To showcase SAL's capabilities, our evaluation covers seven SLAM algorithms across three datasets under weather, camera, and video transport perturbations.
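The decoupled-interface design and the failure-point search described above can be sketched as follows. This is a minimal illustration, not SAL's actual API: the `Perturbation` protocol, the `slam_succeeds` callback, and the choice of bisection as the search strategy are all assumptions, with severity expressed as fog visibility in meters (lower visibility = harsher condition) and failure assumed to be monotone in severity.

```python
from typing import Callable, Protocol


class Perturbation(Protocol):
    """Hypothetical common interface: a perturbation maps an input
    frame to a perturbed frame at a given severity, expressed in
    real-world units (e.g. meters of fog visibility)."""

    def apply(self, frame: object, severity: float) -> object: ...


def find_failure_severity(
    slam_succeeds: Callable[[float], bool],
    harsh: float,
    mild: float,
    tol: float = 1.0,
) -> float:
    """Bisect between a known-failing harsh severity and a
    known-passing mild severity to locate the harshest level the
    SLAM system still tolerates, to within `tol` units.

    Assumes monotone failure: if the system fails at some severity,
    it also fails at every harsher one. `slam_succeeds` is a
    user-supplied callback that runs the SLAM pipeline on the
    perturbed dataset and reports success."""
    lo, hi = harsh, mild  # invariant: fails at lo, succeeds at hi
    while hi - lo > tol:
        mid = (lo + hi) / 2.0
        if slam_succeeds(mid):
            hi = mid  # still works here; probe a harsher severity
        else:
            lo = mid  # fails here; back off toward milder severity
    return hi  # harshest severity (within tol) that still succeeds
```

For example, with a SLAM system that (hypothetically) tolerates fog down to 100 m of visibility, `find_failure_severity(lambda v: v >= 100.0, harsh=5.0, mild=500.0)` converges to a value just at or above 100 m, pinpointing the breaking point without sweeping every severity level.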