An implementation of Learning Sparse Neural Networks through L0 Regularization

- LeNet5 on MNIST, with and without the L0 norm:

  `python mnist.py [--baseline]`

  `--baseline` runs without the L0 norm (i.e., the original LeNet5).
- LeNet5 with L0 regularization achieves 0.9% validation error, as reported in the paper.
- An interactive explanation of the hard concrete distribution.
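As a rough illustration of what the hard concrete distribution does, the sketch below samples gates in [0, 1] that take the values 0 and 1 with positive probability. It is a minimal NumPy sketch, not this repo's code; the stretch constants `GAMMA`, `ZETA` and temperature `BETA` are the defaults suggested in the paper, and `sample_hard_concrete` is a hypothetical helper name.

```python
import numpy as np

# Paper-suggested constants: stretch interval (gamma, zeta) and temperature beta.
GAMMA, ZETA, BETA = -0.1, 1.1, 2.0 / 3.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def sample_hard_concrete(log_alpha, shape, rng):
    """Draw gates z in [0, 1]; exact zeros and ones occur with positive probability."""
    u = rng.uniform(1e-6, 1.0 - 1e-6, size=shape)                   # uniform noise
    s = sigmoid((np.log(u) - np.log(1.0 - u) + log_alpha) / BETA)   # binary concrete sample
    s_bar = s * (ZETA - GAMMA) + GAMMA                              # stretch to (gamma, zeta)
    return np.clip(s_bar, 0.0, 1.0)                                 # hard-sigmoid clamp

rng = np.random.default_rng(0)
z = sample_hard_concrete(log_alpha=0.0, shape=10000, rng=rng)
p_zero = np.mean(z == 0.0)   # fraction of gates clamped to exactly zero
```

The clamp after stretching is what turns the smooth concrete sample into a distribution with point masses at 0 and 1, which is why gates can be exactly pruned.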
- PyTorch 0.3 / torchvision 0.2
- tensorboard-pytorch
- tqdm
How sparse the L0-regularized model is has not yet been strictly measured; instead, histograms of the first convolutional layer's weights are shown:
- with L0 regularization
- without L0 regularization
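A stricter sparsity measure could use the closed-form probability that a gate is nonzero, which is what the paper's expected L0 penalty sums over gates. A minimal NumPy sketch, assuming the paper's default constants (gamma = -0.1, zeta = 1.1, beta = 2/3); the `log_alpha` values are hypothetical, not taken from a trained model:

```python
import numpy as np

GAMMA, ZETA, BETA = -0.1, 1.1, 2.0 / 3.0

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def expected_active_fraction(log_alpha):
    """P(z != 0) per gate: the term summed in the expected L0 penalty."""
    return sigmoid(log_alpha - BETA * np.log(-GAMMA / ZETA))

# Hypothetical per-weight log_alpha values for one layer.
log_alpha = np.array([-4.0, 0.0, 4.0])
frac = expected_active_fraction(log_alpha)   # expected fraction of active weights
```

Averaging `expected_active_fraction` over a layer's `log_alpha` parameters gives its expected density, a sharper summary than a weight histogram.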
- Regularization of biases (currently only weights are regularized).
- More complex architectures with the L0 norm.