
Clarify using num_layers as n in LSTM implementation #43

Open
phisad opened this issue Mar 13, 2019 · 0 comments
phisad commented Mar 13, 2019

Hello,

I am trying to re-implement your paper in Keras, and I'm struggling with your LSTM implementation.

You pass num_layers as n when initializing the LSTM, but num_layers should be the depth of the LSTM (the number of stacked layers). Inside the LSTM implementation, however, n seems to be used as the number of timesteps L. Is that correct?

```lua
self.core = LSTM.lstm(self.rnn_size, self.rnn_size, self.num_layers, dropout)
```

```lua
for L = 1,n do
```
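For what it's worth, in a Keras re-implementation the two roles stay separate: num_layers controls how many LSTM layers are stacked, while the timesteps are the sequence axis of the input. A minimal sketch of that reading (the concrete sizes here are illustrative assumptions, not taken from the repo):

```python
# Minimal Keras sketch: num_layers as stack depth, not timesteps.
# rnn_size, num_layers, seq_length mirror the names above; values are made up.
from tensorflow import keras
from tensorflow.keras import layers

rnn_size, num_layers, seq_length, input_dim = 512, 2, 16, 300

model = keras.Sequential()
model.add(keras.Input(shape=(seq_length, input_dim)))
for _ in range(num_layers):                      # depth: stacked LSTM layers
    model.add(layers.LSTM(rnn_size, return_sequences=True))
# The timestep loop (t = 1..seq_length) happens inside each LSTM layer,
# which applies the same weights at every step.
model.summary()
```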

Furthermore, createClones appears to create a separate set of weights for each timestep. Is this intended? An LSTM should share the same weights through time, so is this a bug?

```lua
for t=1,self.seq_length do
```
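For comparison, a hand-unrolled LSTM in Keras reuses one cell object, and therefore one set of weights, at every timestep; if createClones is meant to mimic unrolling, one would expect the clones to share parameters the same way. A small sketch under that assumption (all sizes are made up for illustration):

```python
# Minimal sketch: one LSTMCell applied at every timestep, so the weights
# are shared through time. All sizes are illustrative assumptions.
import numpy as np
import tensorflow as tf

rnn_size, seq_length, input_dim = 64, 8, 32

cell = tf.keras.layers.LSTMCell(rnn_size)        # a single set of weights
x = np.random.randn(1, seq_length, input_dim).astype("float32")
state = [tf.zeros((1, rnn_size)), tf.zeros((1, rnn_size))]

for t in range(seq_length):
    # The same cell object (same weights) is applied at each step --
    # the behaviour that weight-sharing clones would have to reproduce.
    out, state = cell(x[:, t, :], state)
```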
