This repository contains code accompanying the paper [Training Generative Reversible Networks](https://arxiv.org/abs/1806.01610).
- Install PyTorch 0.4.
- Install the other dependencies: numpy, matplotlib, seaborn, etc.
- Add this repository to your PYTHONPATH (a setup-check sketch follows this list).
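A minimal setup-check sketch, assuming a placeholder clone location (`/path/to/reversible` is hypothetical; substitute wherever you cloned the repo). Adding the path to `sys.path` at runtime is just an alternative to exporting PYTHONPATH:

```python
# Sketch of a setup check; the clone path below is a placeholder, not a real location.
import sys

sys.path.insert(0, "/path/to/reversible")  # alternative to exporting PYTHONPATH

# Dependencies from the list above.
import torch
import numpy
import matplotlib
import seaborn

print("PyTorch:", torch.__version__)  # expected to start with "0.4"
```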
To reproduce the CelebA results, first run the notebooks under notebooks/celeba; a sketch for running them non-interactively follows below.
Running each for 250 epochs should be enough to get results similar to those in notebooks/celeba/Plots.ipynb.
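If you prefer to run the notebooks non-interactively, the sketch below uses nbformat and nbconvert's ExecutePreprocessor (standard Jupyter packages, not part of this repository) to execute the CelebA notebooks in place. The number of epochs is configured inside the notebooks themselves, and the same approach works for the MNIST notebook:

```python
# Sketch: execute the CelebA notebooks headlessly and save their outputs in place.
# Assumes nbformat and nbconvert are installed; paths follow this repository's layout.
import glob

import nbformat
from nbconvert.preprocessors import ExecutePreprocessor

for path in sorted(glob.glob("notebooks/celeba/*.ipynb")):
    if path.endswith("Plots.ipynb"):
        continue  # Plots.ipynb only visualizes results; run it after the others.
    nb = nbformat.read(path, as_version=4)
    executor = ExecutePreprocessor(timeout=None, kernel_name="python3")
    # Use the notebook's own folder as the working directory for relative paths.
    executor.preprocess(nb, {"metadata": {"path": "notebooks/celeba/"}})
    nbformat.write(nb, path)
    print("finished", path)
```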
To reproduce the MNIST results, run the notebook under notebooks/mnist (the same headless-execution approach works there).
You should get plots similar to those in notebooks/mnist/Plots.ipynb. The latent dimensions are arguably a bit less meaningful than in the paper; this setup could certainly be further optimized and stabilized. Feel free to contact me if you are interested in discussing it.
If you use this code in a scientific publication, please cite us as:
@inproceedings{schirrm_revnet_2018,
  author    = {Schirrmeister, Robin Tibor and Chrabąszcz, Patryk and Hutter, Frank and Ball, Tonio},
  title     = {Training Generative Reversible Networks},
  url       = {https://arxiv.org/abs/1806.01610},
  booktitle = {ICML 2018 workshop on Theoretical Foundations and Applications of Deep Generative Models},
  month     = {jul},
  year      = {2018},
  keywords  = {Generative Models, Reversible Networks, Autoencoders},
}