
Fix regression with FLUX diffusers LoRA models #6997

Merged
merged 2 commits into main from ryan/fix-flux-diffusers-lora on Oct 1, 2024

Conversation

RyanJDick
Collaborator

Summary

Fixes a regression in support for FLUX diffusers LoRA models (introduced in #6967).

Related Issues / Discussions

Closes #6996

QA Instructions

I ran the following manual tests:

  • Non-quantized transformer
    • diffusers LoRA
    • kohya LoRA with transformer layers only
    • kohya LoRA with transformer layers and CLIP layers
  • Quantized transformer (BnB NF4)
    • diffusers LoRA
    • kohya LoRA with transformer layers only
    • kohya LoRA with transformer layers and CLIP layers
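The test matrix above distinguishes two LoRA checkpoint conventions. As a rough illustration (this is a hypothetical helper, not InvokeAI's actual detection code), diffusers-style FLUX LoRAs typically use `transformer.`-prefixed keys with `lora_A`/`lora_B` weights, while kohya-style files use flat `lora_unet_*` / `lora_te*` keys with `lora_down`/`lora_up`:

```python
def guess_lora_format(state_dict_keys):
    """Guess a LoRA checkpoint's format from its key naming conventions.

    Hypothetical sketch: real loaders inspect the full state dict and
    handle more variants than the two prefixes checked here.
    """
    for key in state_dict_keys:
        # Diffusers-style: "transformer.<block>.<module>.lora_A.weight"
        if key.startswith("transformer.") and (".lora_A." in key or ".lora_B." in key):
            return "diffusers"
        # Kohya-style: "lora_unet_*" (transformer) or "lora_te*" (CLIP text encoder)
        if key.startswith("lora_unet_") or key.startswith("lora_te"):
            return "kohya"
    return "unknown"
```

A kohya file that also patches CLIP layers (the third case in each group above) would contain both `lora_unet_*` and `lora_te*` keys, which is why the QA runs cover transformer-only and transformer+CLIP variants separately.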

Checklist

  • The PR has a short but descriptive title, suitable for a changelog
  • Tests added / updated (if applicable)
  • Documentation added / updated (if applicable)

@github-actions github-actions bot added the python, invocations, backend, and python-tests labels Oct 1, 2024
@hipsterusername hipsterusername enabled auto-merge (rebase) October 1, 2024 14:18
@hipsterusername hipsterusername merged commit 807f458 into main Oct 1, 2024
14 checks passed
@hipsterusername hipsterusername deleted the ryan/fix-flux-diffusers-lora branch October 1, 2024 14:22
Development

Successfully merging this pull request may close these issues.

[bug]: v5.0.1 appears to have broken Flux LoRAs that worked in v5.0.0