Add tests for no_trainer and fix existing examples #16656
Conversation
The documentation is not available anymore as the PR was closed or merged.
CI failures were fixed by removing `if torch_device != "cuda": testargs.append("--no_cuda")`. From what I could see, those arguments were unused, so I didn't duplicate them from the transformers tests. Let me know if they should be added back in, with special behavior on those tests 😄
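For context, a minimal sketch of the pattern that was removed: the examples tests appended `--no_cuda` to the script arguments whenever no GPU was detected. The script name and `torch_device` value below are assumptions for illustration, not taken from the PR.

```python
# Hypothetical sketch of the removed pattern (names are assumptions):
# tests built an argv list and appended "--no_cuda" on non-GPU machines.
torch_device = "cpu"  # stand-in for the detected device

testargs = [
    "run_glue_no_trainer.py",  # example script name, assumed
    "--model_name_or_path", "bert-base-cased",
]
if torch_device != "cuda":
    testargs.append("--no_cuda")  # this conditional is what the PR dropped

print(testargs)
```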
Great work! 😍
For information, here are the durations:
Changed checkpointing tests to be by epoch, and also to not save with SWAG. Here were those times locally for me:

Before

After
* Fixed some bugs involving saving during epochs
* Added tests mimicking the existing examples tests
* Added in JSON exporting to all `no_trainer` examples for consistency
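A minimal sketch of what the JSON exporting could look like at the end of a `no_trainer` script. The function name, file name, and metric values below are assumptions for illustration, not confirmed details of the PR.

```python
import json
import os

def save_results(output_dir, metrics):
    """Hypothetical helper: write a metrics dict to a JSON file in
    the output directory (file name is an assumption)."""
    os.makedirs(output_dir, exist_ok=True)
    with open(os.path.join(output_dir, "all_results.json"), "w") as f:
        json.dump(metrics, f, indent=2)

# Example metric values, invented for the sketch.
metrics = {"eval_accuracy": 0.85, "epoch": 3}
save_results("/tmp/no_trainer_out", metrics)
```

Writing a single JSON file per run makes it easy for a test harness to load the results back and assert on them, which is how the examples tests can verify training actually converged.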
What does this add?

* New tests for the `no_trainer` scripts, mocking how the Transformers counterparts work
* Fixes for bugs in the `no_trainer` scripts, discovered while writing these tests
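A minimal sketch of how such tests can mock a script invocation, assuming the tests call each example's `main()` with a patched `sys.argv`, mirroring the approach in the Transformers examples tests. The `main()` stub and the flag names here are placeholders, not the PR's actual code.

```python
import sys
from unittest.mock import patch

def main():
    """Stand-in for an example script's entry point (assumption);
    here it just returns the arguments it would have parsed."""
    return sys.argv[1:]

# Build a fake command line, as the examples tests do.
testargs = """
    run_glue_no_trainer.py
    --num_train_epochs=1
    --output_dir=/tmp/out
    """.split()

with patch.object(sys, "argv", testargs):
    parsed = main()

print(parsed)
```

Patching `sys.argv` lets the test run the script's real argument parsing and training loop in-process, without spawning a subprocess.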