Deep Surface Normal Guided Depth Prediction for Outdoor Scene from Sparse LiDAR Data and Single Color Image (CVPR 2019)

DeepLiDAR

This repository contains the code (in PyTorch) for the CVPR 2019 paper "DeepLiDAR: Deep Surface Normal Guided Depth Prediction for Outdoor Scene from Sparse LiDAR Data and Single Color Image" by Jiaxiong Qiu, Zhaopeng Cui, Yinda Zhang, Xingdi Zhang, Shuaicheng Liu, Bing Zeng and Marc Pollefeys.

Introduction

In this work, we propose an end-to-end deep learning system that produces dense depth from sparse LiDAR data and a single color image of outdoor on-road scenes, using surface normals as the intermediate representation.

Requirements

Details about Our Synthetic Dataset

You can ignore the folders named 'lidar_m', 'normal_s', 'RGBright' and 'Boundary' (though you may still find them useful for your own purposes).

- 'DepthLeft': The raw depth generated by CARLA; see the details in the [CARLA documentation](https://carla.readthedocs.io/en/latest/cameras_and_sensors/#sensorcameradepth).
- 'lidar': The LiDAR depth projected into the camera coordinate frame. Each image is encoded as 16-bit, like KITTI.
- 'Lidar64': The raw LiDAR point cloud data.
- 'Normal_m': The surface normals computed from 'DepthLeft'.
- 'RGBLeft': The RGB images.
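The KITTI-style 16-bit encoding stores depth in metres multiplied by 256, with zero marking pixels that have no LiDAR return. A minimal decoder sketch (the function name and the use of Pillow are our own choices, not part of this repo):

```python
import numpy as np
from PIL import Image

def read_kitti_depth(path):
    """Decode a KITTI-style 16-bit depth PNG: pixel_value / 256 gives
    depth in metres; zero pixels mark missing measurements."""
    raw = np.array(Image.open(path), dtype=np.uint16)
    depth = raw.astype(np.float32) / 256.0
    valid = raw > 0  # mask of pixels with an actual LiDAR return
    return depth, valid
```

The same convention is used for both the KITTI depth-completion data and the 'lidar' folder described above.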

Pretrained Model

※NOTE: The pretrained model is saved as a '.tar' file, but you don't need to untar it; load it directly with torch.load().

Download Link
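Since the checkpoint is an ordinary PyTorch save file that merely uses the '.tar' extension, it loads in one call. A quick illustration with a dummy checkpoint (the keys and filename here are placeholders, not the actual checkpoint layout):

```python
import torch

# torch.save writes a single file regardless of extension; '.tar' is
# just a naming convention here, so no untarring is needed.
state = {'state_dict': {'weight': torch.zeros(2, 2)}, 'epoch': 10}
torch.save(state, 'demo_checkpoint.tar')

# Load it back directly with torch.load().
checkpoint = torch.load('demo_checkpoint.tar')
print(checkpoint['epoch'])  # prints 10
```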

Train

  1. Get the surface normals for the LiDAR dataset by running the code in the project named 'surface_normal'.
  2. Use the training strategy in the folder named 'trainings'.
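The 'surface_normal' project has its own implementation; as a rough illustration of the idea only, per-pixel normals can be approximated by back-projecting a dense depth map with the camera intrinsics and crossing the local tangent vectors (all names and intrinsics below are illustrative, not the repo's method):

```python
import numpy as np

def normals_from_depth(depth, fx, fy, cx, cy):
    """Approximate surface normals from a dense depth map.

    Back-project every pixel to a 3D point, then take the cross
    product of the horizontal and vertical tangent vectors.
    """
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w), np.arange(h))
    x = (u - cx) * depth / fx
    y = (v - cy) * depth / fy
    pts = np.stack([x, y, depth], axis=-1)          # (h, w, 3) points

    du = np.gradient(pts, axis=1)                   # tangent along columns
    dv = np.gradient(pts, axis=0)                   # tangent along rows
    n = np.cross(du, dv)                            # un-normalized normals
    norm = np.linalg.norm(n, axis=-1, keepdims=True)
    return n / np.clip(norm, 1e-8, None)            # unit normals
```

For a fronto-parallel plane (constant depth) this yields normals pointing along the camera's z-axis, as expected.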

Evaluation

  1. Fill in the folder names in 'test.py':
     - 'gt_fold': the location of your ground-truth folder;
     - 'left_fold': the location of your RGB image folder;
     - 'lidar2_raw': the location of your sparse (LiDAR) depth folder.
  2. Use the following command to evaluate the trained model on your own data:

python test.py --loadmodel (your trained model)
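For reference, the primary metric on the KITTI depth-completion benchmark is RMSE over pixels with valid ground truth. A minimal sketch of that metric (not the repo's actual evaluation code):

```python
import numpy as np

def rmse(pred, gt):
    """Root-mean-square error over pixels with valid ground truth.

    Ground-truth depth maps are sparse, so pixels with gt == 0 (no
    measurement) are excluded from the error computation.
    """
    mask = gt > 0
    return float(np.sqrt(np.mean((pred[mask] - gt[mask]) ** 2)))
```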

Citation

If you use our code or method in your work, please cite the following:

@InProceedings{Qiu_2019_CVPR,
author = {Qiu, Jiaxiong and Cui, Zhaopeng and Zhang, Yinda and Zhang, Xingdi and Liu, Shuaicheng and Zeng, Bing and Pollefeys, Marc},
title = {DeepLiDAR: Deep Surface Normal Guided Depth Prediction for Outdoor Scene From Sparse LiDAR Data and Single Color Image},
booktitle = {The IEEE Conference on Computer Vision and Pattern Recognition (CVPR)},
month = {June},
year = {2019}
}

Please direct any questions to Jiaxiong Qiu at qiujiaxiong727@gmail.com.
