support a nice way of changing the learning rate #88

Closed
SobhanMP opened this issue Jun 7, 2022 · 2 comments · Fixed by #89

SobhanMP commented Jun 7, 2022

It would be nice to have an easy, rule-independent way of changing the learning rate, like in Flux.jl.

Right now, the best way I can think of is to make another state tuple while changing the first argument, but then changing the learning rate is not rule-independent.
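
For concreteness, a minimal sketch of that workaround (the toy model and the choice of `Adam` are hypothetical, not from this thread): re-running `setup` changes the rate for any rule, but it also discards the accumulated momenta, and keeping them requires rule-specific surgery on the state tree.

```julia
using Optimisers

# Hypothetical toy model: any nested structure of arrays works with setup.
model = (; w = rand(3))
state = Optimisers.setup(Adam(1e-3), model)

# ... training steps via Optimisers.update! accumulate momenta in `state` ...

# The obvious way to lower the learning rate is to build a fresh state tree,
# but that throws the momenta away. Keeping them would mean rebuilding each
# Leaf by hand, which requires knowing Adam's other fields, i.e. it is not
# rule-independent.
state = Optimisers.setup(Adam(1e-4), model)
```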

@mcabbott suggested:

> Although perhaps there ought to be one, like `st = setup(rule, st)` without zero-ing the momenta. Maybe make an issue?

Related: #15 (which seems to be inactive).

mcabbott commented Jun 7, 2022

See what you think of #89. I made it a new verb, although perhaps it could in fact be made a method of `setup`: the argument signature of the first call doesn't really distinguish the two uses, but the state tree contains `Leaf` or `nothing` and never contains new parameters (i.e. things with `isnumeric(x)`), so the two cases could be told apart by inspecting the tree.
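
As a rough sketch of how the new verb is used, assuming (the thread doesn't name it) that #89's verb is `adjust`, as in current Optimisers.jl: it rewrites the rule stored in each `Leaf` while leaving the momenta untouched.

```julia
using Optimisers

model = (; w = rand(3))                      # hypothetical toy model
state = Optimisers.setup(Adam(1e-3), model)

# ... training accumulates momenta in `state` ...

# Assumed API from #89: walk the state tree and replace the learning rate
# inside every Leaf's rule, keeping each Leaf's momentum state as it is.
state = Optimisers.adjust(state, 1e-4)
```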

SobhanMP commented Jun 7, 2022

Thanks, that was fast and looks pretty good. I personally don't care, but making `:eta` a parameter might be a good idea.
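
A hedged sketch of what that could look like, assuming `adjust` also accepts keyword arguments matching a rule's field names (as current Optimisers.jl does); with that form, `eta` works uniformly for every rule that stores its learning rate under that name:

```julia
# Assumed keyword form: set hyperparameters by field name.
state = Optimisers.adjust(state; eta = 1e-4)

# Other fields can be changed the same way, e.g. Adam's betas:
state = Optimisers.adjust(state; eta = 1e-4, beta = (0.8, 0.99))
```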

@SobhanMP SobhanMP closed this as completed Jun 7, 2022
@mcabbott mcabbott mentioned this issue Jun 7, 2022