A Sliding Window Filter with GNSS-State Constraint for RTK-Visual-Inertial Navigation. paper link
Authors: Xiaohong Huang, Cui Yang, Miaowen Wen
RTK-Visual-Inertial-Navigation is a navigation system that tightly fuses GNSS, visual, and inertial measurements. It uses a sliding window filter (SWF) with GNSS-state constraints for sensor fusion. That is, the GNSS states (i.e., the position, orientation, and velocity of the body and the inertial biases at the time of capturing GNSS measurements) are retained in the SWF to construct more appropriate constraints between measurements and states. It also uses a parallel elimination strategy with a predefined elimination ordering, which solves the Gauss-Newton problem and simultaneously obtains the covariance needed for ambiguity resolution. The system can perform the following types of navigation:
- RTK-Visual-Inertial Navigation;
- RTD-Visual-Inertial Navigation;
- SPP-Visual-Inertial Navigation;
- SPP-Visual-Inertial Navigation with Carrier-Phase Fusion;
- Visual-Inertial navigation.
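The GNSS-state-retention idea described above can be sketched with a toy example: states enter a sliding window, and when the window is full, the oldest state *without* a GNSS measurement is the one removed, so GNSS-epoch states survive to directly constrain the GNSS observations. This is only an illustration of the retention policy, with hypothetical state fields; the real SWF marginalizes removed states rather than discarding them.

```python
from collections import deque

WINDOW_SIZE = 5  # illustrative window length, not the project's actual setting

def push_state(window, state):
    """Append a state; if the window overflows, drop the oldest non-GNSS state."""
    window.append(state)
    if len(window) > WINDOW_SIZE:
        for i, s in enumerate(window):
            if not s["has_gnss"]:
                del window[i]  # remove oldest state with no GNSS measurement
                return
        window.popleft()  # every state carries GNSS: fall back to oldest

window = deque()
for k in range(8):
    # pretend a GNSS measurement arrives at every third state
    push_state(window, {"id": k, "has_gnss": k % 3 == 0})

print([s["id"] for s in window])  # GNSS-epoch states 0, 3, 6 are retained
```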
This package requires some features of C++11.
This package is developed under the ROS Kinetic environment.
Our code uses OpenCV 3 for image processing.
Clone the repository to your catkin workspace (for example `~/catkin_ws/`):

```shell
cd ~/catkin_ws/src/
git clone https://github.com/xiaohong-huang/RTK-Visual-Inertial-Navigation.git
```
In our source code, we have developed our solving strategy based on Ceres-Solver. The original version of Ceres-Solver does not meet the needs of our project, so to build the project you need to build our modified Ceres-Solver with:
```shell
# CMake
sudo apt-get install cmake
# Eigen3
sudo apt-get install libeigen3-dev
# Ceres-Solver-Modified
cd ~/catkin_ws/src/RTK-Visual-Inertial-Navigation
tar -xvf ceres-solver-modified.tar
cd ceres-solver-modified/
sh build.sh
```
The modified version is installed only inside the workspace folder, so you don't need to worry that the installation will change the settings of your computer.
Then build the package with:
```shell
cd ~/catkin_ws/
catkin_make
```
Our equipment is shown as follows: a grayscale camera (MT9V034, 752x480 @ 25 Hz), a MEMS-grade IMU (BMI088, 400 Hz), and a
Download our dataset and launch rviz via:
```shell
source ~/catkin_ws/devel/setup.bash
roslaunch rtk_visual_inertial rtk_visual_inertial_rviz.launch
```
Open another terminal and run the project by:
```shell
source ~/catkin_ws/devel/setup.bash
rosrun rtk_visual_inertial rtk_visual_inertial_node src/RTK-Visual-Inertial-Navigation/yaml/SETTING.yaml YOUR_BAG_FOLDER/BAG_NAME.bag output.csv
```
YOUR_BAG_FOLDER is the folder where you saved our dataset, and BAG_NAME is the name of the dataset file. SETTING.yaml selects the settings for RTK-Visual-Inertial-Navigation; you can use the following settings to perform the different types of navigation:
```shell
rtk_visual_inertial_config.yaml     # RTK-Visual-Inertial-Navigation
rtd_visual_inertial_config.yaml     # RTD-Visual-Inertial-Navigation
spp_visual_inertial_config.yaml     # SPP-Visual-Inertial-Navigation
spp_CP_visual_inertial_config.yaml  # SPP-Visual-Inertial-Navigation with carrier-phase fusion
visual_inertial_config.yaml         # Visual-Inertial-Navigation
```
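As a small convenience, the mode-to-config mapping above can be wrapped in a helper that assembles the full `rosrun` invocation. The config filenames and the command layout are taken from this README; the mode keys and the helper itself are hypothetical, for illustration only.

```python
# Map a short mode name to the config file listed in the README (assumed keys).
CONFIGS = {
    "rtk":    "rtk_visual_inertial_config.yaml",
    "rtd":    "rtd_visual_inertial_config.yaml",
    "spp":    "spp_visual_inertial_config.yaml",
    "spp_cp": "spp_CP_visual_inertial_config.yaml",
    "vio":    "visual_inertial_config.yaml",
}

def run_command(mode, bag_path, out_csv="output.csv"):
    """Build the rosrun command line for the chosen navigation mode."""
    yaml_path = "src/RTK-Visual-Inertial-Navigation/yaml/" + CONFIGS[mode]
    return ("rosrun rtk_visual_inertial rtk_visual_inertial_node "
            f"{yaml_path} {bag_path} {out_csv}")

print(run_command("rtk", "YOUR_BAG_FOLDER/BAG_NAME.bag"))
```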
We have also provided a demo for evaluating the positioning errors (see evaluate.py).
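A typical positioning-error evaluation aligns the estimated trajectory with the ground truth by timestamp and computes the 3D RMSE. The sketch below shows that idea on inline sample data; the CSV layout (`t, x, y, z` columns) is an assumption for illustration and may differ from evaluate.py's actual input format.

```python
import csv
import io
import math

def load_xyz(text):
    """Parse 't,x,y,z' CSV rows into {timestamp: (x, y, z)}."""
    rows = {}
    for t, x, y, z in csv.reader(io.StringIO(text)):
        rows[float(t)] = (float(x), float(y), float(z))
    return rows

def rmse(est, truth):
    """3D RMSE over timestamps present in both trajectories."""
    sq_errs = [sum((e - g) ** 2 for e, g in zip(est[t], truth[t]))
               for t in est if t in truth]
    return math.sqrt(sum(sq_errs) / len(sq_errs))

# Two-epoch toy trajectories: identical at t=0, 0.1 m apart in x at t=1.
est   = load_xyz("0,1.0,2.0,3.0\n1,2.1,2.0,3.0\n")
truth = load_xyz("0,1.0,2.0,3.0\n1,2.0,2.0,3.0\n")
print(round(rmse(est, truth), 4))  # prints 0.0707
```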
4. Run RTK-Visual-Inertial-Navigation on Jetson-TX2 and Orangepi5
We also provide efficient versions of RTK-Visual-Inertial-Navigation in RTK-Visual-Inertial-Navigation-JetsonTX2 and RTK-Visual-Inertial-Navigation-Orangepi5. RTK-Visual-Inertial-Navigation-JetsonTX2 achieves real-time state estimation with a state-update rate of 20~25 Hz on Jetson-TX2, and RTK-Visual-Inertial-Navigation-Orangepi5 achieves real-time state estimation with a state-update rate of 25 Hz on Orangepi5. Both generate the same positioning results as the current project with less computation time.
The VIO framework is adapted from VINS-Mono. The Ceres-Solver-Modified is developed based on Ceres-Solver.
The source code is released under GPLv3 license.