In this manuscript, we tackle the problem of continuous localization of a legged robot. We propose a novel optimization-based procedure for estimating the robot's state from internal sensor measurements (legged odometry). We then propose an optimization-based fusion of the legged odometry with the output of a visual SLAM system. The proposed multi-modal localization system continuously estimates the pose of the robot in varied conditions, despite fast robot motion, leg slippage, or image motion blur. We provide results from a real-time implementation of the proposed method on a multi-legged walking robot, compare it to other state-of-the-art localization systems, and provide a dataset for future comparisons.
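To make the idea of optimization-based fusion concrete, the following is a minimal sketch, not the paper's actual formulation: two streams of relative 1-D pose measurements (a stand-in for legged odometry and visual odometry increments) are fused by weighted linear least squares over the pose chain. All variable names, weights, and the 1-D simplification are illustrative assumptions.

```python
import numpy as np


def fuse_pose_chain(leg_deltas, vis_deltas, w_leg=1.0, w_vis=4.0):
    """Fuse two streams of relative 1-D pose measurements (illustrative
    stand-ins for legged and visual odometry) by weighted least squares.

    Each measurement d_i constrains x_{i+1} - x_i = d_i; x_0 is fixed to 0.
    Returns the full estimated trajectory x_0..x_n.
    """
    n = len(leg_deltas)
    assert len(vis_deltas) == n, "both streams must cover the same steps"
    rows, rhs = [], []
    for deltas, w in ((leg_deltas, w_leg), (vis_deltas, w_vis)):
        sw = np.sqrt(w)  # weight enters the least-squares system as sqrt
        for i, d in enumerate(deltas):
            row = np.zeros(n)  # unknowns are x_1..x_n
            row[i] = 1.0
            if i > 0:
                row[i - 1] = -1.0
            rows.append(sw * row)
            rhs.append(sw * d)
    A, b = np.vstack(rows), np.array(rhs)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return np.concatenate(([0.0], x))


# Example: noisy legged deltas, more trusted visual deltas.
traj = fuse_pose_chain([1.1, 0.9], [1.0, 1.0])
print(traj)  # each fused step is a weighted mean of the two measurements
```

With independent step constraints, each fused increment reduces to the weighted mean (w_leg*d_leg + w_vis*d_vis) / (w_leg + w_vis); a real system would instead build a full pose graph over SE(3) with loop closures and robust costs.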

References:
[1] D. Belter, M. R. Nowicki, "Optimization-based legged odometry and sensor fusion for legged robot continuous localization," Robotics and Autonomous Systems, vol. 111, pp. 110-124, 2019.