schelotto/Neural_Speed_Reading_via_Skim-RNN_PyTorch

Introduction

This is a PyTorch implementation of Neural Speed Reading via Skim-RNN (ICLR 2018).

The IMDB dataset is used by default and is stored in the ./data folder. In addition, the 300-dimensional GloVe word embeddings trained on 840 billion tokens (glove.840B.300d) are used.
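The embedding matrix can be built from the raw GloVe text file before training. The sketch below is only an illustration of one way to do this; the function name, the vocab format, and the file path are assumptions, not the repository's actual loader:

```python
import numpy as np
import torch

def load_glove_embeddings(glove_path, vocab, dim=300):
    """Build a (len(vocab), dim) embedding matrix from a GloVe text file
    such as glove.840B.300d.txt; words missing from GloVe keep a small
    random initialization. `vocab` maps word -> row index."""
    matrix = np.random.normal(scale=0.1, size=(len(vocab), dim)).astype("float32")
    with open(glove_path, encoding="utf-8") as f:
        for line in f:
            parts = line.rstrip().split(" ")
            word, values = parts[0], parts[1:]
            if word in vocab and len(values) == dim:
                matrix[vocab[word]] = np.asarray(values, dtype="float32")
    return torch.from_numpy(matrix)

# Example: weights = load_glove_embeddings("glove.840B.300d.txt", vocab)
#          embedding = torch.nn.Embedding.from_pretrained(weights, freeze=False)
```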

Unlike Skip RNN or Jump LSTM, where the discrete skip decision is not differentiable, Skim-RNN applies the Gumbel-softmax reparametrization trick, which makes the skimming decision differentiable:

r_t^i = exp((log p_t^i + g_t^i) / τ) / Σ_j exp((log p_t^j + g_t^j) / τ)

where p_t is the read/skim probability distribution at step t, the g_t^i are i.i.d. Gumbel(0, 1) samples, and τ is the softmax temperature.
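PyTorch ships a Gumbel-softmax sampler in torch.nn.functional, so the read/skim decision can be sampled in a differentiable way. The single-step sketch below illustrates the mechanism and is not the code in this repository; the class and layer names (SkimRNNCell, decision) and the choice of feeding the concatenation of x_t and h_{t-1} to the decision layer are assumptions.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SkimRNNCell(nn.Module):
    """One Skim-RNN step (sketch): a straight-through Gumbel-softmax sample
    picks between a full update (large LSTM cell) and a partial "skim" update
    that only refreshes the first `small_size` hidden dimensions."""

    def __init__(self, input_size, large_size, small_size, tau=0.5):
        super().__init__()
        assert small_size < large_size
        self.large_cell = nn.LSTMCell(input_size, large_size)
        self.small_cell = nn.LSTMCell(input_size, small_size)
        self.decision = nn.Linear(input_size + large_size, 2)  # read vs. skim logits
        self.small_size = small_size
        self.tau = tau  # Gumbel-softmax temperature

    def forward(self, x_t, state):
        h, c = state  # each of shape (batch, large_size)
        logits = self.decision(torch.cat([x_t, h], dim=-1))
        # Differentiable (straight-through) sample of the read/skim decision
        r = F.gumbel_softmax(logits, tau=self.tau, hard=True)  # (batch, 2)

        h_large, c_large = self.large_cell(x_t, (h, c))  # full update
        h_small, c_small = self.small_cell(
            x_t, (h[:, :self.small_size], c[:, :self.small_size])
        )
        # Skim update: new values in the first small_size dims, rest copied through
        h_skim = torch.cat([h_small, h[:, self.small_size:]], dim=-1)
        c_skim = torch.cat([c_small, c[:, self.small_size:]], dim=-1)

        read, skim = r[:, :1], r[:, 1:]  # broadcast over hidden dims
        return read * h_large + skim * h_skim, read * c_large + skim * c_skim
```

At inference time the decision can be taken as a hard argmax, which is where the speed-up comes from: skimmed tokens only pay for the small cell.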

Usage

python main.py [arguments]

Arguments

-h, --help                  show help message and exit
-large_cell_size            hidden size of the large LSTM cell
-small_cell_size            hidden size of the small LSTM cell
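For example, a run with a 200-unit large cell and a 50-unit small cell (illustrative values, not the repository's defaults) would look like:

python main.py -large_cell_size 200 -small_cell_size 50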
