
Preconditioners for the Stochastic Training of Neural Fields (CVPR 2025)


This is the official implementation of the paper "Preconditioners for the Stochastic Training of Neural Fields".

Shin-Fang Chng*, Hemanth Saratchandran*, Simon Lucey
Australian Institute for Machine Learning (AIML), University of Adelaide, * denotes equal contribution

Getting Started

Installation

# Clone the repo
git clone https://github.com/sfchng/preconditioner_neural_fields.git
cd preconditioner_neural_fields

Setup Conda Environment

conda create -n precond_nf python=3.9 
conda activate precond_nf
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
pip install -r requirements.txt
python -m pip install libigl
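
To verify the environment, a quick sanity check (expected versions taken from the pip command above):

python -c "import torch; print(torch.__version__, torch.cuda.is_available())"
# expect 1.12.1+cu113, and True on a machine with a CUDA-capable GPU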

Data

Div2k data

We use the DIV2K dataset for our 2D image experiment. Please download the dataset here and place it under the directory data/images.

Stanford data

We use the Stanford dataset for our 3D binary occupancy experiment. Please download the dataset here and place it under the directory data/bocc.
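
After both downloads, the expected layout (directory names taken from the steps above) is:

data/
├── images/   # DIV2K dataset for the 2D image experiment
└── bocc/     # Stanford dataset for the 3D binary occupancy experiment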

Running this repo

# Image experiment
./scripts/neural_image.sh
# Binary occupancy experiment
./scripts/neural_bocc.sh

Key results

ESGD (a curvature-aware preconditioned gradient descent algorithm) improves convergence for Gaussian, sine, and wavelet activations, while Adam performs better for a ReLU network with positional encoding (ReLU(PE)). As an example, we provide training convergence curves for a 2D image reconstruction task below.
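
For intuition, here is a minimal sketch of an ESGD-style update, following the equilibrated preconditioning idea of Dauphin et al.: a diagonal preconditioner is accumulated from Hessian-vector probes (Hutchinson-style), and the gradient is rescaled by its square root. The toy sine-activated field, the random data, and the hyperparameters (w0, lr, eps) are illustrative placeholders, not the implementation used in this repo.

import torch
import torch.nn as nn

class SineLayer(nn.Module):
    # SIREN-style layer: sin(w0 * (Wx + b)); w0 = 30 is the usual convention
    def __init__(self, d_in, d_out, w0=30.0):
        super().__init__()
        self.linear = nn.Linear(d_in, d_out)
        self.w0 = w0
    def forward(self, x):
        return torch.sin(self.w0 * self.linear(x))

model = nn.Sequential(SineLayer(2, 64), SineLayer(64, 64), nn.Linear(64, 3))
params = list(model.parameters())

coords = torch.rand(1024, 2) * 2 - 1   # toy 2D coordinates in [-1, 1]
target = torch.rand(1024, 3)           # stand-in RGB targets

lr, eps = 1e-3, 1e-4
D = [torch.zeros_like(p) for p in params]   # running sum of (Hv)^2

for step in range(1, 201):
    loss = ((model(coords) - target) ** 2).mean()
    grads = torch.autograd.grad(loss, params, create_graph=True)
    # One Hutchinson probe: Hv via double backward with v ~ N(0, I)
    vs = [torch.randn_like(p) for p in params]
    gv = sum((g * v).sum() for g, v in zip(grads, vs))
    Hv = torch.autograd.grad(gv, params)
    with torch.no_grad():
        for p, g, hv, d in zip(params, grads, Hv, D):
            d += hv * hv                               # accumulate curvature estimate
            p -= lr * g / ((d / step).sqrt() + eps)    # equilibrated step

Adam, by contrast, preconditions with running moments of the gradient alone, which is one reason the two optimizers can behave differently across activations.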

📖 Citation

This code implements the following publication. If you use this software package, please cite us:
@article{chng2024preconditioners,
  title={Preconditioners for the stochastic training of implicit neural representations},
  author={Chng, Shin-Fang and Saratchandran, Hemanth and Lucey, Simon},
  journal={arXiv preprint arXiv:2402.08784},
  year={2024}
}

🤝 Acknowledgement

We sincerely thank the authors of the following open-source projects, which our released code builds on: BARF, Siren, Wire, SAPE, ESGD, AdaHessian, KFAC, Shampoo.
