
DynamicCity: Large-Scale LiDAR Generation from Dynamic Scenes

Teaser

LiDAR scene generation has advanced rapidly in recent years. However, existing methods focus primarily on generating static, single-frame scenes, overlooking the inherently dynamic nature of real-world driving environments. In this work, we introduce DynamicCity, a novel 4D LiDAR generation framework capable of generating large-scale, high-quality LiDAR scenes that capture the temporal evolution of dynamic environments. DynamicCity consists of two key models:

1. A VAE model that learns a HexPlane as a compact 4D representation. Instead of using naive averaging operations, DynamicCity employs a novel Projection Module to effectively compress 4D LiDAR features into six 2D feature maps for HexPlane construction, which significantly enhances HexPlane fitting quality (up to 12.56 mIoU gain). Furthermore, we utilize an Expansion & Squeeze Strategy to reconstruct 3D feature volumes in parallel, which improves both training efficiency and reconstruction accuracy compared to naively querying each 3D point (up to 7.05 mIoU gain, 2.06x training speedup, and 70.84% memory reduction).

2. A DiT-based diffusion model for HexPlane generation. To make the HexPlane feasible for DiT generation, a Padded Rollout Operation is proposed to reorganize all six feature planes of the HexPlane into a square 2D feature map. In particular, various conditions can be introduced into the diffusion or sampling process, supporting versatile 4D generation applications such as trajectory- and command-driven generation, inpainting, and layout-conditioned generation.

Extensive experiments on the CarlaSC and Waymo datasets demonstrate that DynamicCity significantly outperforms existing state-of-the-art 4D LiDAR generation methods across multiple metrics. The code will be released to facilitate future research.
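To make the Padded Rollout Operation concrete: the idea is to tile the six HexPlane feature planes onto a single zero-padded square canvas so that a standard DiT can patchify them like an ordinary image. The snippet below is a minimal sketch of one plausible packing under assumed plane shapes; the function name, tiling layout, and shapes are illustrative assumptions, not the repository's actual implementation.

```python
import torch

def padded_rollout(xy, xz, yz, xt, yt, zt):
    """Pack six HexPlane feature planes into one square 2D feature map.

    Illustrative sketch only -- the tiling layout and shapes are assumptions,
    not DynamicCity's actual implementation.
    Assumed shapes (C channels, S spatial resolution, T frames, T <= S):
        xy, xz, yz: (C, S, S)   spatial planes
        xt, yt, zt: (C, S, T)   spatio-temporal planes
    """
    C, S, _ = xy.shape
    T = xt.shape[-1]
    assert T <= S, "this sketch assumes the temporal extent fits beside a spatial plane"

    side = 2 * S + T                       # side length of the padded square
    canvas = xy.new_zeros(C, side, side)   # unused regions stay zero-padded

    # Top row: two spatial planes followed by one spatio-temporal plane.
    canvas[:, :S, :S] = xy
    canvas[:, :S, S:2 * S] = xz
    canvas[:, :S, 2 * S:2 * S + T] = xt
    # Second row: the remaining spatial plane and two spatio-temporal planes.
    canvas[:, S:2 * S, :S] = yz
    canvas[:, S:2 * S, S:S + T] = yt
    canvas[:, S:2 * S, S + T:S + 2 * T] = zt
    return canvas
```

With, for example, S = 64, T = 16, and C = 32, this yields a (32, 144, 144) map that can be split into square patches and fed to the DiT like a single image.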

Overview


Our DynamicCity framework consists of two key procedures: (a) Encoding HexPlane with a VAE architecture, and (b) 4D Scene Generation with HexPlane DiT.
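In other words, generation is a two-stage process: the VAE encodes a 4D occupancy sequence into a HexPlane, and the DiT then samples new HexPlanes (optionally conditioned) that the VAE decoder turns back into a dynamic scene. The sketch below illustrates this flow with hypothetical module and method names (the `vae`/`dit` objects, `sample`, and `decode` are placeholders, not the codebase's actual API).

```python
import torch

class DynamicCityPipeline(torch.nn.Module):
    """High-level sketch of the two-stage pipeline (not the actual codebase API).

    `vae` is assumed to decode HexPlanes back into 4D occupancy sequences;
    `dit` is assumed to expose a diffusion sampler over rolled-out HexPlanes.
    Both are hypothetical placeholders.
    """

    def __init__(self, vae, dit):
        super().__init__()
        self.vae = vae
        self.dit = dit

    @torch.no_grad()
    def generate(self, num_steps=50, cond=None):
        # (b) Sample a rolled-out HexPlane with the DiT, optionally conditioned
        # on trajectories, commands, or semantic layouts.
        hexplane = self.dit.sample(steps=num_steps, condition=cond)
        # (a, inverted) Decode the HexPlane into a sequence of 3D semantic
        # occupancy grids with the VAE decoder.
        return self.vae.decode(hexplane)
```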

Updates

  • [October 2024]: Project page released.

Outline

  • ⚙️ Installation
  • ♨️ Data Preparation
  • 🚀 Getting Started
  • 🏙️ Dynamic Scene Generation
  • 📝 TODO List

⚙️ Installation

Kindly refer to INSTALL.md for the installation details.

♨️ Data Preparation

Kindly refer to DATA_PREPARE.md for the details to prepare the CarlaSC, Occ3D-Waymo, and Occ3D-nuScenes datasets.

🚀 Getting Started

Kindly refer to GET_STARTED.md to learn more about how to use this codebase.

🏙️ Dynamic Scene Generation

Unconditional Generation


HexPlane Conditional Generation


Command & Trajectory-Driven Generation


Layout-Conditioned Generation


Dynamic Scene Inpainting


📝 TODO List

  • Release code
  • Release model weights and pretrained checkpoints
