
3D Siamese Voxel-to-BEV Tracker for Sparse Point Clouds

Note: we have created a new branch named "master" and set it as the default branch. The code has been reorganized there, and that branch will continue to be updated. The original default branch "main" is retained but no longer updated.

Introduction

This repository contains the code for V2B from our NeurIPS 2021 paper (poster).

Note: to make the code structure clearer and more reasonable, we refactored the entire project. If you are more familiar with P2B and the code of our previously published version, you can continue to refer to the first version.

Environment settings

  • Create a conda environment for V2B
conda create -n V2B python=3.7
conda activate V2B
  • Install PyTorch and torchvision
conda install pytorch==1.4.0 torchvision==0.5.0 cudatoolkit=10.0
  • Install dependencies.
pip install -r requirements.txt
  • Build the _ext module.
cd V2B_main/lib/pointops
python setup.py install
cd ../../
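  • (Optional) Verify the installation. A minimal sanity check, assuming the built extension is importable as pointops (the module name may differ depending on the setup.py in your checkout):
python -c "import torch; print(torch.__version__, torch.cuda.is_available())"  # expect 1.4.0 and True on a CUDA machine
python -c "import pointops"  # assumed module name; adjust if setup.py registers a different one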

Data preparation

  • Download the Full dataset (v1.0) from nuScenes.

    Note that we modified the official nuscenes-devkit code and use it to convert the nuScenes format to the KITTI format. The conversion requires metadata from nuScenes-lidarseg, so you should replace category.json and lidarseg.json in the Full dataset (v1.0); we provide these two json files in the nuscenes_json folder.

    Execute the following commands to convert the nuScenes format to the KITTI format:

    cd nuscenes-devkit-master/python-sdk/nuscenes/scripts
    python export_kitti.py --nusc_dir=<nuScenes dataset path> --nusc_kitti_dir=<output dir> --split=<dataset split>
    

    Note that the "split" parameter should be "train_track" or "val". In our paper, we use the model trained on the KITTI dataset to evaluate its generalization on the nuScenes dataset.
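    For example, a hypothetical invocation for the training split looks like this (the paths are placeholders, not values shipped with the code):

    # hypothetical paths; replace with your own directories
    python export_kitti.py --nusc_dir=/data/nuscenes --nusc_kitti_dir=/data/nuscenes_kitti --split=train_track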

  • We follow the benchmark created by LiDAR-SOT based on the waymo open dataset. You can download and process the waymo dataset as guided by their code, and use our code to test model performance on this benchmark.
  • The benchmark they built contains many components that we do not use; only the following processing results are necessary:
[waymo_sot]
    [benchmark]
        [validation]
            [vehicle]
                bench_list.json
                easy.json
                medium.json
                hard.json
            [pedestrian]
                bench_list.json
                easy.json
                medium.json
                hard.json
    [pc]
        [raw_pc]
            Here are some segment.npz files containing raw point cloud data
    [gt_info]
        Here are some segment.npz files containing tracklet and bbox data

Note: after you get the datasets, please modify the path variables data_dir and val_data_dir in the configuration file V2B_main/utils/options.
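A quick way to confirm the Waymo benchmark layout before testing, with <waymo_sot root> standing in for wherever you placed the processed data:

# <waymo_sot root> is a placeholder for your processed-benchmark directory
ls <waymo_sot root>/benchmark/validation/vehicle/bench_list.json
ls <waymo_sot root>/pc/raw_pc | head
ls <waymo_sot root>/gt_info | head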

Evaluation

Train a new model:

python main.py --which_dataset KITTI/NUSCENES --category_name category_name

Test a model:

python main.py --which_dataset KITTI/NUSCENES/WAYMO --category_name category_name --train_test test

For more preset parameters or command-line options, please refer to the relevant code and adjust them according to your needs.
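For example, a typical train-then-test sequence on KITTI cars might look like this (the category name "Car" is illustrative; use the names your dataset provides):

# "Car" is an illustrative category name
python main.py --which_dataset KITTI --category_name Car
python main.py --which_dataset KITTI --category_name Car --train_test test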

Visualization

cd V2B_main/visualization/
python visual.py

Note that for convenience, we provide a trained model, trained_model_kitti_car.pth. It can be downloaded and placed in "V2B_main/visualization/data" to reproduce the visualization results.
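For example, assuming the checkpoint has been downloaded to the current directory:

# assumes the checkpoint was downloaded to the current directory
mkdir -p V2B_main/visualization/data
mv trained_model_kitti_car.pth V2B_main/visualization/data/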

Citation

If you find the code or trained models useful, please consider citing:

@inproceedings{hui2021v2b,
  title={3D Siamese Voxel-to-BEV Tracker for Sparse Point Clouds},
  author={Hui, Le and Wang, Lingpeng and Cheng, Mingmei and Xie, Jin and Yang, Jian},
  booktitle={NeurIPS},
  year={2021}
}

Acknowledgements

  • Thanks to Qi et al. for their implementation of P2B.
  • Thanks to Pang et al. for the 3D-SOT benchmark based on the Waymo open dataset.

License

This repository is released under the MIT License.