Gamma in the LR scheduler is too small #22

Open

grig-guz opened this issue Aug 1, 2020 · 0 comments

Comments

grig-guz commented Aug 1, 2020

Gamma should be 0.999 with step_size=1, so that the learning rate is decayed by 0.1% per step as recommended in the paper. With the current setting, the learning rate is instead cut abruptly (multiplied by 0.001) after 10k steps.

gamma=0.001)
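For concreteness, a minimal sketch of the proposed setting, assuming the scheduler is torch.optim.lr_scheduler.StepLR; the model, optimizer, and dummy loss below are placeholders, not the repo's actual training code:

```python
import torch
from torch import nn, optim
from torch.optim.lr_scheduler import StepLR

# Placeholder model and optimizer; in the repo these come from the training script.
model = nn.Linear(10, 1)
optimizer = optim.Adam(model.parameters(), lr=1e-3)

# Suggested setting: multiply the LR by 0.999 every step (a 0.1% decay per step),
# rather than multiplying it by 0.001 once after 10k steps.
scheduler = StepLR(optimizer, step_size=1, gamma=0.999)

for step in range(20_000):
    optimizer.zero_grad()
    loss = model(torch.randn(4, 10)).pow(2).mean()  # dummy loss just to drive the loop
    loss.backward()
    optimizer.step()
    scheduler.step()  # LR after n steps: base_lr * 0.999**n
```

With gamma=0.999 and step_size=1 the learning rate decays smoothly by a factor of roughly 0.999**10000 ≈ 4.5e-5 over 10k steps, instead of staying constant and then dropping by 1000x in a single step.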
