Training Generative Reversible Networks
=======================================

This repository contains code accompanying the paper Training Generative Reversible Networks.

Installation
------------

  1. Install PyTorch 0.4.
  2. Install the other dependencies: numpy, matplotlib, seaborn, etc.
  3. Add this repository to your PYTHONPATH.
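To sanity-check step 2, a small helper like the following can report which dependencies are still missing. This is only an illustrative sketch, not part of the repository; the package list mirrors the README's dependencies.

```python
import importlib.util

# Packages the README asks for (PyTorch 0.4 plus the plotting stack).
REQUIRED = ["torch", "numpy", "matplotlib", "seaborn"]

def missing_packages(names):
    """Return the subset of names that cannot be imported in this environment."""
    return [n for n in names if importlib.util.find_spec(n) is None]

print(missing_packages(REQUIRED))
```

An empty list means the environment has everything the notebooks import at the top level; PyTorch's version still needs to be checked separately (``torch.__version__``).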

Run
---

To reproduce the CelebA results, run the notebooks under notebooks/celeba in this order:

  1. Only_Clamp.ipynb
  2. Continue_Adversarial.ipynb

Running each for 250 epochs should be sufficient to obtain results similar to those in notebooks/celeba/Plots.ipynb.

To reproduce the MNIST results, run the notebook under notebooks/mnist:

  1. OptimalTransport.ipynb

You should get plots similar to the ones in notebooks/mnist/Plots.ipynb. The latent dimensions are arguably less meaningful than in the paper; this setup could certainly be further optimized and stabilized. Feel free to contact me if you are interested in discussing it.

Citing
------

If you use this code in a scientific publication, please cite us as:

@inproceedings{schirrm_revnet_2018,
  author = {Schirrmeister, Robin Tibor and Chrabąszcz, Patryk and Hutter, Frank and Ball, Tonio},
  title = {Training Generative Reversible Networks},
  url = {https://arxiv.org/abs/1806.01610},
  booktitle = {ICML 2018 workshop on Theoretical Foundations and Applications of Deep Generative Models},
  month = {jul},
  year = {2018},
  keywords = {Generative Models, Reversible Networks, Autoencoders},
}