[Advanced dreambooth lora] adjustments to align with canonical script #8406
Conversation
The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.
@@ -571,7 +636,7 @@ def parse_args(input_args=None):
     parser.add_argument(
         "--optimizer",
         type=str,
-        default="adamW",
+        default="AdamW",
         help=('The optimizer type to use. Choose between ["AdamW", "prodigy"]'),
You can set this like:

available_optimizers = ['AdamW', 'prodigy']
parser.add_argument(
    ...
    choices=available_optimizers,
    help=f'The optimizer type to use. Choose between {available_optimizers}',
)
This ensures the user doesn't select a nonexistent option.
Yes, +1 to this suggestion. choices greatly simplifies things.
At the moment, if a user specifies an optimizer other than AdamW/prodigy, we log a warning and default to AdamW (see below). Changing to choices would raise an error instead, so I'm not sure we should modify that behavior.
# Optimizer creation
if not (args.optimizer.lower() == "prodigy" or args.optimizer.lower() == "adamw"):
    logger.warning(
        f"Unsupported choice of optimizer: {args.optimizer}. Supported optimizers include [adamW, prodigy]. "
        "Defaulting to adamW."
    )
    args.optimizer = "adamw"
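For context, here is a minimal standalone sketch (not from the PR; the parser setup and values are illustrative) contrasting the two behaviors under discussion: argparse's choices rejects unknown values with a hard error at parse time, whereas the current warn-and-default approach silently falls back to AdamW.

# Hypothetical demo, separate from the training script.
import argparse
import logging

logging.basicConfig(level=logging.WARNING)
logger = logging.getLogger(__name__)

available_optimizers = ["AdamW", "prodigy"]

# Option 1: `choices` makes argparse exit with an error on unknown values.
strict = argparse.ArgumentParser()
strict.add_argument(
    "--optimizer",
    type=str,
    default="AdamW",
    choices=available_optimizers,
    help=f"The optimizer type to use. Choose between {available_optimizers}",
)
# strict.parse_args(["--optimizer", "sgd"])  # prints an error and exits with code 2

# Option 2: current behavior -- accept any string, warn, and fall back.
lenient = argparse.ArgumentParser()
lenient.add_argument("--optimizer", type=str, default="AdamW")
args = lenient.parse_args(["--optimizer", "sgd"])
if args.optimizer.lower() not in ("adamw", "prodigy"):
    logger.warning(f"Unsupported choice of optimizer: {args.optimizer}. Defaulting to adamW.")
    args.optimizer = "adamw"
print(args.optimizer)  # -> adamw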
     if args.push_to_hub:
-        repo_id = create_repo(repo_id=model_id, exist_ok=True, token=args.hub_token).repo_id
+        repo_id = create_repo(
+            repo_id=args.hub_model_id or Path(args.output_dir).name, exist_ok=True, token=args.hub_token
+        ).repo_id
If hub_token is not being used, let's remove it from the CLI args altogether.
Hmm, I'm not sure I follow? It is still in use.
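For reference, a minimal sketch of the repo_id logic under review (the variable values are hypothetical stand-ins for the parsed CLI args). huggingface_hub's create_repo returns a RepoUrl whose repo_id attribute holds the fully qualified repository name, and hub_token is indeed still threaded through as the token argument:

# Hypothetical stand-ins for args.hub_model_id / args.output_dir / args.hub_token.
from pathlib import Path
from huggingface_hub import create_repo

hub_model_id = None             # --hub_model_id was not passed
output_dir = "lora-trained-xl"  # --output_dir
hub_token = None                # --hub_token; None falls back to the cached login token

# When --hub_model_id is unset, the repo name is derived from the output directory.
repo_id = create_repo(
    repo_id=hub_model_id or Path(output_dir).name,
    exist_ok=True,
    token=hub_token,
).repo_id
print(repo_id)  # e.g. "<username>/lora-trained-xl" (requires being logged in)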
Left some minor comments. But the refactor looks very nice to me!
LGTM, just a small nit: because of pivotal tuning, I think it may be worth generating the README.md even without push_to_hub.
Is the failing test unrelated?
100 percent not. Let's merge once this CI run is done.
ready to go
"real artists ship"
Seems like the changes made in this PR broke the fix issued in #6464.
…#8406)

* minor changes
* minor changes
* minor changes
* minor changes
* minor changes
* minor changes
* minor changes
* fix
* fix
* aligning with blora script
* aligning with blora script
* aligning with blora script
* aligning with blora script
* aligning with blora script
* remove prints
* style
* default val
* license
* move save_model_card to outside push_to_hub
* Update train_dreambooth_lora_sdxl_advanced.py

---------

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>
Minor changes to align with the canonical script, such as validation logging, etc.