AlignNet-3D: Fast Point Cloud Registration of Partially Observed Objects

(teaser figure)

Introduction

This repository is the code release for our 3DV 2019 paper (arXiv report here).

Citation

If you use our code or data, please cite

    @inproceedings{Gross193DV,
      author = {Johannes Gro{\ss} and Aljo\v{s}a O\v{s}ep and Bastian Leibe},
      title = {AlignNet-3D: Fast Point Cloud Registration of Partially Observed Objects},
      booktitle = {International Conference on 3D Vision (3DV)},
      year = {2019}
    }

If you use the data, please also cite the original dataset:

    @inproceedings{Geiger12CVPR,
      author = {Andreas Geiger and Philip Lenz and Raquel Urtasun},
      title = {Are we ready for Autonomous Driving? The KITTI Vision Benchmark Suite},
      booktitle = {Conference on Computer Vision and Pattern Recognition (CVPR)},
      year = {2012}
    }

Installation

Dataset preparation

data
│
└───SynthCars
│   │
│   └───meta
│   │   │   00000000.json
│   │   │   00000001.json
│   │   │   ...
│   │
│   └───pointcloud1
│   │   │   00000000.npy
│   │   │   00000001.npy
│   │   │   ...
│   │
│   ...
│
└───SynthCarsPersons
    │   ...
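
To sanity-check this layout, a single sample can be loaded directly. The following is a minimal sketch, assuming each pointcloud1/*.npy file holds an (N, 3) point array and each meta/*.json file holds the matching ground-truth annotations (the exact keys are not documented here):

    import json
    import numpy as np

    # Load one SynthCars sample; the keys inside the meta JSON are dataset-specific.
    sample_id = "00000000"
    points = np.load(f"data/SynthCars/pointcloud1/{sample_id}.npy")
    with open(f"data/SynthCars/meta/{sample_id}.json") as f:
        meta = json.load(f)

    print(points.shape)     # expected to be (N, 3)
    print(sorted(meta))     # list the available annotation fields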

Running ICP evaluations

  • Specify your dataset folder (e.g. /home/gross/data) in make_icp_configs.py
  • Prepare the ICP configs by running python make_icp_configs.py
  • Run all ICP evaluations at once with ./eval_icp.sh (an illustrative ICP sketch follows this list)
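
For reference, the kind of point-to-point ICP alignment these evaluations measure can be sketched with Open3D. This is an illustrative sketch only (Open3D and the correspondence threshold are assumptions, not the repository's own ICP implementation, which is driven by the generated configs):

    import numpy as np
    import open3d as o3d

    # Illustrative point-to-point ICP alignment of two partial object point clouds.
    def icp_align(src_points, tgt_points, max_corr_dist=0.5):
        src = o3d.geometry.PointCloud()
        src.points = o3d.utility.Vector3dVector(src_points)
        tgt = o3d.geometry.PointCloud()
        tgt.points = o3d.utility.Vector3dVector(tgt_points)
        result = o3d.pipelines.registration.registration_icp(
            src, tgt, max_corr_dist, np.eye(4),
            o3d.pipelines.registration.TransformationEstimationPointToPoint())
        return result.transformation  # 4x4 rigid transform mapping src onto tgt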

Training

  • Adapt logging.basedir in configs/default.json (a config-patching sketch follows this list)
  • Run e.g. python train.py --config configs/SynthCars.json
  • Models and evaluation results will be written to the specified logging.basedir
  • For models with pre-training from other models, adapt training.pretraining.model in the respective config files (e.g. configs/KITTITrackletsCars.json)
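
As a convenience, logging.basedir can also be patched programmatically. This is a minimal sketch that assumes configs/default.json is plain nested JSON with a logging object containing a basedir key; the output directory /home/gross/models is just an example:

    import json

    # Point logging.basedir at your own output directory before training.
    # Assumes the config is nested JSON: {"logging": {"basedir": ...}, ...}.
    with open("configs/default.json") as f:
        cfg = json.load(f)

    cfg.setdefault("logging", {})["basedir"] = "/home/gross/models"

    with open("configs/default.json", "w") as f:
        json.dump(cfg, f, indent=2)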

Evaluation

  • To run the evaluation (again) with an existing model checkpoint, run e.g. python train.py eval_only --config configs/KITTITrackletsCarsHard.json --eval_epoch 28
    • The results will e.g. be in /home/gross/models/KITTITrackletsCarsHard/val/eval000028/
    • eval.json contains the results when the full angle is evaluated; eval_180.json contains the results when, of the predicted angle and its 180°-flipped counterpart, the one closer to the ground-truth angle is used
  • To run the evaluation (again) with already computed inference outputs (pred_translations.npy, pred_angles.npy, ...), run e.g. python train.py eval_only --config configs/KITTITrackletsCarsHard.json --eval_epoch 28 --use_old_results
  • To run the evaluation (again) with ICP refinement, run e.g. python train.py eval_only --config configs/KITTITrackletsCarsHard.json --eval_epoch 28 --refineICP --use_old_results
    • The evaluation results are written to a refined_p2p subfolder
  • Some trained models and evaluation results can be found in models_alignnet.zip (a sketch for inspecting evaluation outputs follows this list)
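
To inspect a finished evaluation programmatically, the outputs referenced above can be loaded directly. This is a minimal sketch, assuming the prediction .npy files and the eval*.json files sit in the evaluation folder shown above (the exact keys inside eval.json are not documented here):

    import json
    import numpy as np

    # Illustrative inspection of one evaluation folder; adjust the path to your
    # logging.basedir, dataset, and epoch.
    eval_dir = "/home/gross/models/KITTITrackletsCarsHard/val/eval000028"

    pred_translations = np.load(f"{eval_dir}/pred_translations.npy")
    pred_angles = np.load(f"{eval_dir}/pred_angles.npy")
    print(pred_translations.shape, pred_angles.shape)

    with open(f"{eval_dir}/eval.json") as f:
        full = json.load(f)        # full-angle evaluation
    with open(f"{eval_dir}/eval_180.json") as f:
        flipped = json.load(f)     # allows the 180°-flipped angle
    print(full, flipped)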

License

Our code is released under the BSD-3 License (see the LICENSE file for details).
