
LVI-SAM

This repository contains code for a lidar-visual-inertial odometry and mapping system, which combines the advantages of LIO-SAM and Vins-Mono at a system level.



Dependencies

  • ROS (tested with Kinetic and Melodic)
  • GTSAM (Georgia Tech Smoothing and Mapping library)
    Install using install_gtsam() in beam_install_scripts. Ensure GTSAM_VERSION="4.0.2".
  • Ceres (C++ library for modeling and solving large, complicated optimization problems)
    Install using install_ceres() in beam_install_scripts (a manual-install sketch follows this list).

Compile

You can use the following commands to download and compile the package.

cd ~/catkin_ws/src
git clone git@github.com:BEAMRobotics/LVI-SAM.git
cd ..
catkin build

Datasets


The datasets used in the paper can be downloaded from Google Drive. The data-gathering sensor suite includes: Velodyne VLP-16 lidar, FLIR BFS-U3-04S2M-CS camera, MicroStrain 3DM-GX5-25 IMU, and Reach RS+ GPS.

https://drive.google.com/drive/folders/1q2NZnsgNmezFemoxhHnrDnp1JV_bqrgV?usp=sharing

Note that the images in the provided bag files are stored in compressed format, so a decompression command is included at the end of launch/module_sam.launch. If your own bag records raw image data, comment that line out.
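For reference, the decompression step is equivalent to running image_transport's republish node from the command line. This is only a minimal sketch; the topic names below are placeholders, so substitute the topics actually used in module_sam.launch and your bag:

# Republish a compressed image topic as raw (topic names are placeholders)
rosrun image_transport republish compressed in:=/camera/image_raw raw out:=/camera/image_raw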



Run the package

  1. Configure sensor parameters in the .yaml files in the config folder.
  2. Run the launch file:
roslaunch lvi_sam run.launch
  3. Play an existing bag file:
rosbag play handheld.bag -r 0.5

TODO

  • Update the graph optimization to use all three factors in imuPreintegration.cpp, simplify mapOptimization.cpp, and improve system stability

Paper

If you use any of this code or the datasets, please cite our paper:

@inproceedings{lvisam2021shan,
  title={LVI-SAM: Tightly-coupled Lidar-Visual-Inertial Odometry via Smoothing and Mapping},
  author={Shan, Tixiao and Englot, Brendan and Ratti, Carlo and Rus, Daniela},
  booktitle={IEEE International Conference on Robotics and Automation (ICRA)},
  pages={to-be-added},
  year={2021},
  organization={IEEE}
}

Acknowledgement

  • The visual-inertial odometry module is adapted from Vins-Mono.
  • The lidar-inertial odometry module is adapted from LIO-SAM.
