Research


MINS: Efficient Multi-sensor Aided Inertial Navigation with Online Calibration

International Conference on Robotics and Automation (ICRA), 2021

A versatile multi-sensor aided inertial navigation system (MINS) that efficiently fuses multi-modal measurements from an IMU, camera, wheel encoders, GPS, and 3D LiDAR, with online spatiotemporal sensor calibration. LiDAR point clouds are abstracted into plane patches for efficient information compression and data association.

Publication reference – Paper
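As a rough illustration of the plane-patch idea, the sketch below fits a plane to a set of LiDAR points and scores a candidate association by point-to-plane distance. The function names and the toy point cloud are ours, not the paper's actual pipeline.

```python
import numpy as np

def fit_plane(points):
    """Fit a plane n.x + d = 0 to an (N, 3) array of LiDAR points
    via SVD of the centered cloud (least-squares normal)."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    normal = vt[-1]                    # direction of least variance
    d = -normal.dot(centroid)
    return normal, d

def point_to_plane_distance(point, normal, d):
    """Signed distance used for plane-based data association."""
    return normal.dot(point) + d

# Toy example: 100 points sampled exactly on the z = 1 plane
rng = np.random.default_rng(0)
pts = np.column_stack([rng.uniform(-1, 1, (100, 2)), np.ones(100)])
n, d = fit_plane(pts)
```

A point would then be associated to the patch whose plane gives the smallest absolute point-to-plane distance, subject to a gating threshold.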

Versatile 3D Multi-Sensor Fusion for Lightweight 2D Localization

International Conference on Intelligent Robots and Systems (IROS), 2020

A lightweight and robust localization solution for low-cost, low-power autonomous robot platforms. A two-stage localization system that combines offline prior-map building with online multi-modal localization.

Publication reference – Paper
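A minimal sketch of such a two-stage scheme, under our own simplifying assumptions (not the paper's implementation): an offline-built 2D occupancy grid serves as the prior map, and online localization scores a candidate pose by how many current scan points land on occupied cells.

```python
import numpy as np

def build_grid_map(points, res=0.1):
    """Offline stage: rasterize mapped 2D points into occupied cells."""
    return {tuple(np.floor(p / res).astype(int)) for p in points}

def scan_score(grid, scan, pose, res=0.1):
    """Online stage: fraction of scan points, transformed by pose
    (x, y, yaw), that fall on occupied map cells."""
    x, y, th = pose
    R = np.array([[np.cos(th), -np.sin(th)],
                  [np.sin(th),  np.cos(th)]])
    world = scan @ R.T + np.array([x, y])
    hits = sum(tuple(np.floor(p / res).astype(int)) in grid for p in world)
    return hits / len(scan)

# Toy map: a straight wall; the identity pose matches it perfectly
wall = np.column_stack([np.linspace(0.0, 1.0, 11), np.full(11, 1.0)])
grid = build_grid_map(wall)
print(scan_score(grid, wall, (0.0, 0.0, 0.0)))  # identity pose -> 1.0
```

A real system would evaluate this score over a set of candidate poses (e.g., a particle filter) and fuse the best match with odometry.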

VIWO: Visual-Inertial-Wheel Odometry with Online Calibration

International Conference on Intelligent Robots and Systems (IROS), 2020

A novel visual-inertial-wheel odometry (VIWO) system for ground vehicles, which efficiently fuses multi-modal visual, inertial, and 2D wheel odometry measurements in a sliding-window filtering fashion. Performs VIWO while calibrating the wheel encoders' intrinsic and extrinsic parameters online.

Publication reference – Paper, Tech report
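For intuition, a standard differential-drive model shows what the wheel "intrinsics" are: the two wheel radii and the baseline, which map encoder angular rates to a 2D body velocity. This is a generic textbook sketch, not the VIWO estimator itself.

```python
import numpy as np

def wheel_odometry(omega_l, omega_r, r_l, r_r, b):
    """Map left/right wheel angular rates (rad/s) to planar body
    velocity: forward speed v and yaw rate w. The intrinsics are the
    wheel radii (r_l, r_r) and the baseline b."""
    v = 0.5 * (r_l * omega_l + r_r * omega_r)
    w = (r_r * omega_r - r_l * omega_l) / b
    return v, w

def integrate_pose(pose, v, w, dt):
    """Propagate a planar pose (x, y, theta) over dt (Euler step)."""
    x, y, th = pose
    return (x + v * np.cos(th) * dt, y + v * np.sin(th) * dt, th + w * dt)

# Sanity check: equal wheels and rates -> straight-line motion
v, w = wheel_odometry(1.0, 1.0, 0.1, 0.1, 0.5)
pose = integrate_pose((0.0, 0.0, 0.0), v, w, 1.0)
```

Calibrating these parameters online matters because small radius errors integrate into unbounded pose drift if left uncorrected.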

Intermittent GPS-aided VIO with Online Calibration and Initialization

International Conference on Robotics and Automation (ICRA), 2020

Fuses IMU-camera data with intermittent GPS measurements. Provides online calibration of both the GPS-IMU extrinsics and time offset, as well as a reference-frame initialization procedure that is robust to GPS sensor noise.

Publication reference – Paper, Tech report
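The frame alignment being initialized can be sketched as a 4-DoF transform: with gravity observable from the IMU, only a yaw angle and a 3D translation relate the GPS (ENU) frame to the VIO world frame, and a time offset accounts for asynchronous stamps. The helper names below are our own illustration, not the paper's code.

```python
import numpy as np

def enu_to_vio(p_enu, yaw, t):
    """Rotate an ENU position about the gravity axis by yaw, then
    translate into the VIO world frame (4-DoF alignment)."""
    c, s = np.cos(yaw), np.sin(yaw)
    R = np.array([[c, -s, 0.0],
                  [s,  c, 0.0],
                  [0.0, 0.0, 1.0]])
    return R @ p_enu + t

def interpolate_vio(t_query, t0, p0, t1, p1, t_off=0.0):
    """Linearly interpolate the VIO trajectory at the time-offset
    corrected GPS stamp, as needed when fusing asynchronous GPS."""
    a = (t_query + t_off - t0) / (t1 - t0)
    return (1 - a) * p0 + a * p1

# A 90-degree yaw maps ENU east (1, 0, 0) to VIO (0, 1, 0)
p = enu_to_vio(np.array([1.0, 0.0, 0.0]), np.pi / 2, np.zeros(3))
```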

OpenVINS

International Conference on Robotics and Automation (ICRA), 2020

An open platform, termed OpenVINS, for visual-inertial estimation research for both the academic community and practitioners from industry. This codebase has out-of-the-box support for commonly desired visual-inertial estimation features, which include: (i) on-manifold sliding window Kalman filter, (ii) online camera intrinsic and extrinsic calibration, (iii) camera to inertial sensor time offset calibration, (iv) SLAM landmarks with different representations and consistent First-Estimates Jacobian (FEJ) treatments, (v) modular type system for state management, (vi) extendable visual-inertial system simulator, and (vii) extensive toolbox for algorithm evaluation.

Publication reference – Paper, GitHub
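The sliding-window cloning behind features (i) and (v) can be illustrated with a toy covariance manager: cloning duplicates the current pose block of the covariance, and a full window marginalizes the oldest clone. The `SlidingWindow` class and its state layout (`[current pose | clone_1 | ...]`) are simplified assumptions, not the OpenVINS API.

```python
import numpy as np

class SlidingWindow:
    """Toy MSCKF-style window: covariance over the current pose plus
    a bounded number of stochastically cloned past poses."""

    def __init__(self, dim_pose=6, max_clones=3):
        self.dp, self.max_clones = dim_pose, max_clones
        self.P = np.eye(dim_pose) * 0.01   # covariance, current pose only

    def clone(self):
        """Augment the covariance with a copy of the current pose
        block (stochastic cloning), then bound the window size."""
        n = self.P.shape[0]
        J = np.vstack([np.eye(n), np.eye(self.dp, n)])  # duplicate block 0
        self.P = J @ self.P @ J.T
        if self.P.shape[0] // self.dp - 1 > self.max_clones:
            self.marginalize_oldest()

    def marginalize_oldest(self):
        """Drop the oldest clone's rows/cols (block right after the
        current pose in this toy layout)."""
        keep = np.r_[0:self.dp, 2 * self.dp:self.P.shape[0]]
        self.P = self.P[np.ix_(keep, keep)]
```

Keeping the window bounded is what gives the filter constant computational cost regardless of trajectory length.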

LIC-Fusion: LiDAR-Inertial-Visual Odometry

International Conference on Intelligent Robots and Systems (IROS), 2019

A tightly-coupled multi-sensor fusion algorithm for LiDAR-inertial-camera (LIC) odometry. Performs online spatial and temporal calibration between all three sensors, tracks sparse edge/surf feature points across LiDAR scans, and fuses their measurements together with visual feature observations in an efficient MSCKF framework.

Publication reference – Paper
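Edge/surf selection of this kind is commonly done with a LOAM-style local curvature score: points whose neighbors sum to a large residual are sharp (edges), and near-zero residuals indicate flat (surf) points. The sketch below is our generic illustration with made-up thresholds, not LIC-Fusion's actual extractor.

```python
import numpy as np

def curvature(scan, k=5):
    """Smoothness score per point on a scan line: the norm of the sum
    of differences to the k neighbors on each side (boundary points
    get no score)."""
    n = len(scan)
    c = np.full(n, np.nan)
    for i in range(k, n - k):
        diff = (scan[i - k:i + k + 1] - scan[i]).sum(axis=0)
        c[i] = np.linalg.norm(diff)
    return c

def select_features(scan, k=5, edge_thresh=0.5):
    """Split valid points into edge (sharp) and surf (flat) sets."""
    c = curvature(scan, k)
    valid = ~np.isnan(c)
    edges = np.where(valid & (c > edge_thresh))[0]
    surfs = np.where(valid & (c <= edge_thresh))[0]
    return edges, surfs

# L-shaped 2D scan line: flat segment, a corner at index 10, then a wall
flat = np.column_stack([np.linspace(0.0, 1.0, 11), np.zeros(11)])
wall = np.column_stack([np.full(10, 1.0), np.linspace(0.1, 1.0, 10)])
scan = np.vstack([flat, wall])
edges, surfs = select_features(scan)
```

The selected edge and surf points would then be matched against features from previous scans to form the LiDAR measurements fused in the filter.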