
Command line argument of learning rate scheduler #128

Open
wants to merge 1 commit into master

Conversation

hlh981029
Collaborator

@CLAassistant

CLAassistant commented Jul 2, 2022

CLA assistant check
All committers have signed the CLA.

@hlh981029
Collaborator Author

if self.args.use_amp:
    self._scaler.step(self._optimizer)
    self._scaler.update()
else:
    if self._lr_scheduler:
        # If AdamW is used, self._optimizer.step() must be called explicitly.
        # Otherwise BertAdam is used, which has a built-in scheduler.
        self._optimizer.step()
if self._lr_scheduler:
    self._lr_scheduler.step()

I don't quite understand the logic here. For example, when self._optimizer is BertAdam, self._lr_scheduler is None, yet self._optimizer.step() still needs to be called in that case.
