
seq2seq model always generates the same output #39

Open
LucaSCostanzo opened this issue Nov 25, 2020 · 0 comments

Comments

@LucaSCostanzo

Hi,
I am pretty new to the world of NLP. I trained the encoder-decoder model described in chapter 10 of the book, but the network always produces the same output, regardless of the input sentence. I tried several combinations of hyperparameters (learning rate, number of epochs, number of LSTM hidden units, batch size, etc.), but nothing worked. Any suggestions?
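
For reference, here is a minimal sketch of the kind of Keras LSTM encoder-decoder described above. This is not the book's exact code; the vocabulary sizes, hidden-unit count, and learning rate are placeholder values standing in for the hyperparameters mentioned.

```python
import tensorflow as tf
from tensorflow.keras import layers, Model

src_vocab, tgt_vocab = 5000, 5000  # placeholder vocabulary sizes
latent_dim = 256                   # LSTM hidden units (one of the tuned hyperparameters)

# Encoder: embed the source sentence and keep only the final LSTM states.
enc_in = layers.Input(shape=(None,), dtype="int32")
enc_emb = layers.Embedding(src_vocab, latent_dim)(enc_in)
_, state_h, state_c = layers.LSTM(latent_dim, return_state=True)(enc_emb)

# Decoder: teacher-forced during training, initialized from the encoder states.
dec_in = layers.Input(shape=(None,), dtype="int32")
dec_emb = layers.Embedding(tgt_vocab, latent_dim)(dec_in)
dec_seq, _, _ = layers.LSTM(latent_dim, return_sequences=True, return_state=True)(
    dec_emb, initial_state=[state_h, state_c]
)
dec_pred = layers.Dense(tgt_vocab, activation="softmax")(dec_seq)

model = Model([enc_in, dec_in], dec_pred)
model.compile(
    optimizer=tf.keras.optimizers.Adam(learning_rate=1e-3),  # placeholder learning rate
    loss="sparse_categorical_crossentropy",
)
model.summary()
```

In a setup like this, the only path by which the input sentence can influence the generated output is the encoder's final states (`state_h`, `state_c`); if the separate inference-time decoder is not seeded with those states, it starts from the same initial state for every input, which would match the symptom described.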
