This is the official implementation of the paper "Preconditioners for the Stochastic Training of Neural Fields".
Shin-Fang Chng*,
Hemanth Saratchandran*,
Simon Lucey
Australian Institute for Machine Learning (AIML), University of Adelaide (* denotes equal contribution)
## Clone the repo
git clone https://github.com/sfchng/preconditioner_neural_fields.git
cd preconditioner_neural_fields
## Installation
conda create -n precond_nf python=3.9
conda activate precond_nf
pip install torch==1.12.1+cu113 torchvision==0.13.1+cu113 torchaudio==0.12.1 --extra-index-url https://download.pytorch.org/whl/cu113
pip install -r requirements.txt
python -m pip install libigl
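After installing, it may help to confirm that the CUDA 11.3 build of PyTorch was picked up. The snippet below is a generic sanity check, not part of this repository:

```python
import torch

print(torch.__version__)           # expected: 1.12.1+cu113
print(torch.cuda.is_available())   # expected: True on a machine with a CUDA-capable GPU
```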
## Datasets
We use the DIV2K dataset for our 2D image experiment. Please download the dataset here and place it under the directory `data/images`.

We use the Stanford dataset for our 3D binary occupancy experiment. Please download the dataset here and place it under the directory `data/bocc`.
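For orientation, a 2D neural-field experiment fits a network to (pixel coordinate, RGB) pairs sampled from an image. The sketch below shows one common way to build such pairs from a downloaded DIV2K image; it is an illustrative assumption, not the repository's data loader, and the file path is a placeholder.

```python
import numpy as np
import torch
from PIL import Image

def image_to_coords_rgb(path):
    """Load an image and return normalized pixel coordinates with RGB targets."""
    img = np.asarray(Image.open(path).convert("RGB"), dtype=np.float32) / 255.0
    h, w, _ = img.shape
    ys, xs = torch.meshgrid(
        torch.linspace(-1.0, 1.0, h), torch.linspace(-1.0, 1.0, w), indexing="ij"
    )
    coords = torch.stack([xs, ys], dim=-1).reshape(-1, 2)  # (H*W, 2), in [-1, 1]
    rgb = torch.from_numpy(img).reshape(-1, 3)             # (H*W, 3), in [0, 1]
    return coords, rgb

# Example usage (path is hypothetical):
# coords, rgb = image_to_coords_rgb("data/images/0801.png")
```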
## Run experiments
Image experiment:
./scripts/neural_image.sh

Binary occupancy experiment:
./scripts/neural_bocc.sh
## Results
ESGD (a curvature-aware preconditioned gradient descent algorithm) improves convergence for Gaussian, sine, and wavelet activations, while Adam performs better for a ReLU network with positional encoding (ReLU(PE)). As an example, we provide the training convergence of a 2D image reconstruction task below.
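For readers unfamiliar with ESGD, the following is a minimal sketch of an equilibrated, curvature-aware preconditioned update in the spirit of ESGD (Dauphin et al., 2015), written against a generic PyTorch model. It is not the optimizer implementation shipped in this repository; `model`, `loss_fn`, `coords`, `targets`, and the hyperparameters are placeholders.

```python
import torch

def esgd_step(model, loss_fn, coords, targets, state, lr=1e-2, damping=1e-4):
    """One equilibrated-SGD step: precondition the gradient by a running
    Hutchinson estimate of the Hessian's diagonal magnitude."""
    params = [p for p in model.parameters() if p.requires_grad]
    loss = loss_fn(model(coords), targets)
    grads = torch.autograd.grad(loss, params, create_graph=True)

    # Hessian-vector product Hv with a random probe vector v (Hutchinson trick).
    vs = [torch.randn_like(p) for p in params]
    hvps = torch.autograd.grad(grads, params, grad_outputs=vs)

    state["step"] = state.get("step", 0) + 1
    if "D" not in state:
        state["D"] = [torch.zeros_like(p) for p in params]

    with torch.no_grad():
        for p, g, hvp, d in zip(params, grads, hvps, state["D"]):
            d.add_(hvp.pow(2))                              # accumulate (Hv)^2
            precond = (d / state["step"]).sqrt() + damping  # equilibration preconditioner
            p.sub_(lr * g / precond)
    return loss.item()

# Example usage (all names are placeholders):
# state = {}
# for coords, targets in loader:
#     loss = esgd_step(model, torch.nn.functional.mse_loss, coords, targets, state)
```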
## Citation
This code implements the following publication. If you use this software package, please cite us:

@article{chng2024preconditioners,
  title={Preconditioners for the stochastic training of implicit neural representations},
  author={Chng, Shin-Fang and Saratchandran, Hemanth and Lucey, Simon},
  journal={arXiv preprint arXiv:2402.08784},
  year={2024}
}
## Acknowledgements
We sincerely thank the authors of the following open-source projects, on which our released code builds: BARF, Siren, Wire, SAPE, ESGD, AdaHessian, KFAC, Shampoo.