
[Advanced dreambooth lora] adjustments to align with canonical script #8406

Merged
33 commits merged into huggingface:main on Jun 27, 2024

Conversation

linoytsaban
Collaborator

@linoytsaban linoytsaban commented Jun 5, 2024

Minor changes to align with the canonical script, such as validation logging, etc.

@HuggingFaceDocBuilderDev

The docs for this PR live here. All of your documentation changes will be reflected on that endpoint. The docs are available until 30 days after the last update.

@@ -571,7 +636,7 @@ def parse_args(input_args=None):
     parser.add_argument(
         "--optimizer",
         type=str,
-        default="adamW",
+        default="AdamW",
         help=('The optimizer type to use. Choose between ["AdamW", "prodigy"]'),
Contributor

you can set this like:

available_optimizers = ['AdamW', 'prodigy']
parser.add_argument(
    ...
    choices=available_optimizers,
    help=f'The optimizer type to use. Choose between {available_optimizers}',
)

Contributor

which ensures the user doesn't select a nonexistent option

Member

Yes, +1 to this suggestion. choices greatly simplifies things.
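A minimal runnable sketch of the suggested pattern, using the names from the suggestion above (the script's other arguments are omitted):

```python
import argparse

# Candidate optimizer names, as in the suggestion above.
available_optimizers = ["AdamW", "prodigy"]

parser = argparse.ArgumentParser()
parser.add_argument(
    "--optimizer",
    type=str,
    default="AdamW",
    choices=available_optimizers,
    help=f"The optimizer type to use. Choose between {available_optimizers}",
)

args = parser.parse_args(["--optimizer", "prodigy"])
print(args.optimizer)  # prodigy
# parser.parse_args(["--optimizer", "sgd"]) would make argparse exit with
# an "invalid choice" error instead of silently falling back.
```

Note that `choices` matching is case-sensitive (e.g. `adamw` would be rejected), which is relevant to the fallback behavior discussed in the next comment.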

Collaborator Author

@linoytsaban linoytsaban Jun 9, 2024

At the moment, if a user specifies an optimizer other than AdamW/prodigy, we log a warning and default to AdamW (see below), so changing to `choices` would raise an error instead. I'm not sure we should modify that.

# Optimizer creation
if not (args.optimizer.lower() == "prodigy" or args.optimizer.lower() == "adamw"):
    logger.warning(
        f"Unsupported choice of optimizer: {args.optimizer}. Supported optimizers include [adamW, prodigy]. "
        "Defaulting to adamW"
    )
    args.optimizer = "adamw"
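The warn-and-fall-back behavior above can be sketched as a standalone helper (`resolve_optimizer` is a hypothetical name for illustration, not part of the script):

```python
import logging

logger = logging.getLogger(__name__)

SUPPORTED_OPTIMIZERS = ("adamw", "prodigy")

def resolve_optimizer(name: str) -> str:
    """Return a supported optimizer name (lowercased), warning and
    falling back to AdamW for anything unsupported."""
    if name.lower() not in SUPPORTED_OPTIMIZERS:
        logger.warning(
            f"Unsupported choice of optimizer: {name}. "
            f"Supported optimizers include {list(SUPPORTED_OPTIMIZERS)}. Defaulting to adamW."
        )
        return "adamw"
    return name.lower()

print(resolve_optimizer("Prodigy"))  # prodigy
print(resolve_optimizer("sgd"))      # adamw (with a warning logged)
```

This matches the case-insensitive, non-erroring behavior the author describes, which `choices` alone would not preserve.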

@linoytsaban linoytsaban marked this pull request as ready for review June 8, 2024 14:31
@linoytsaban linoytsaban requested a review from sayakpaul June 8, 2024 14:32
 if args.push_to_hub:
-    repo_id = create_repo(repo_id=model_id, exist_ok=True, token=args.hub_token).repo_id
+    repo_id = create_repo(
+        repo_id=args.hub_model_id or Path(args.output_dir).name, exist_ok=True, token=args.hub_token
Member

If hub_token is not being used let's remove it from the CLI args, altogether.

Collaborator Author

Hmm, I'm not sure I follow? It is still in use.
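As an aside, the `repo_id` fallback from the diff above (use `args.hub_model_id` if given, otherwise the last component of `args.output_dir`) can be illustrated in isolation; `pick_repo_id` is a hypothetical helper name:

```python
from pathlib import Path

def pick_repo_id(hub_model_id, output_dir):
    # Prefer an explicit --hub_model_id; otherwise derive the repo name
    # from the final component of --output_dir.
    return hub_model_id or Path(output_dir).name

print(pick_repo_id(None, "runs/my-dreambooth-lora"))  # my-dreambooth-lora
print(pick_repo_id("user/custom-repo", "runs/out"))   # user/custom-repo
```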

Member

@sayakpaul sayakpaul left a comment

Left some minor comments. But the refactor looks very nice to me!

@linoytsaban linoytsaban requested a review from apolinario June 18, 2024 17:18
@apolinario
Collaborator

LGTM, just a small nit: because of pivotal tuning, I think it may be worth generating the README.md even without push_to_hub.

@linoytsaban linoytsaban requested a review from sayakpaul June 25, 2024 14:21
@apolinario
Collaborator

Is the failing test unrelated?

@sayakpaul
Member

100 percent not. Let’s merge once this CI run round is done.

@linoytsaban
Collaborator Author

linoytsaban commented Jun 26, 2024

ready to go

@sayakpaul sayakpaul merged commit 35f45ec into huggingface:main Jun 27, 2024
8 checks passed
@sayakpaul
Member

"real artists ship"

@TornjV

TornjV commented Jul 17, 2024

Seems like the changes made in this one broke the fix issued here #6464

sayakpaul added a commit that referenced this pull request Dec 23, 2024
…#8406)

* minor changes

* minor changes

* minor changes

* minor changes

* minor changes

* minor changes

* minor changes

* fix

* fix

* aligning with blora script

* aligning with blora script

* aligning with blora script

* aligning with blora script

* aligning with blora script

* remove prints

* style

* default val

* license

* move save_model_card to outside push_to_hub

* Update train_dreambooth_lora_sdxl_advanced.py

---------

Co-authored-by: Sayak Paul <spsayakpaul@gmail.com>