
PEFT LoRA not working with xformers. #5504

Closed
@AnyISalIn


Describe the bug

PEFT-backed LoRA does not work with xformers. After loading a LoRA and enabling xformers memory-efficient attention, inference fails with `TypeError: Linear.forward() got an unexpected keyword argument 'scale'`.

Reproduction

from diffusers import DiffusionPipeline
import torch

pipe = DiffusionPipeline.from_pretrained("./models/checkpoint/stable-diffusion-xl-base-1.0", torch_dtype=torch.float16)
pipe.load_lora_weights("CiroN2022/toy-face", weight_name="toy_face_sdxl.safetensors", adapter_name="toy")
pipe.to("cuda")
pipe.enable_xformers_memory_efficient_attention()
res = pipe(prompt="1girl", num_inference_steps=20)
Running this raises the following error inside `torch.nn.modules.module._call_impl` (traceback truncated):

   1522 # If we don't have any hooks, we want to skip the rest of the logic in
   1523 # this function, and just call forward.
   1524 if not (self._backward_hooks or self._backward_pre_hooks or self._forward_hooks or self._forward_pre_hooks
   1525         or _global_backward_pre_hooks or _global_backward_hooks
   1526         or _global_forward_hooks or _global_forward_pre_hooks):
-> 1527     return forward_call(*args, **kwargs)
   1529 try:
   1530     result = None
TypeError: Linear.forward() got an unexpected keyword argument 'scale'
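My reading of the traceback (not confirmed against the diffusers source): the xformers attention processor passes a `scale` keyword into the attention projection layers, which diffusers' own `LoRACompatibleLinear` accepts, but after loading the LoRA through PEFT those projections end up with a plain `Linear.forward()` signature that has no `scale` parameter. A minimal sketch of that failure mode, using hypothetical stand-in classes (no torch or diffusers required):

```python
class PlainLinear:
    """Stand-in for a layer with torch.nn.Linear's signature:
    forward() takes no `scale` keyword argument."""
    def forward(self, x):
        return x


class LoRACompatibleLinear(PlainLinear):
    """Stand-in for diffusers' LoRA-aware linear layer,
    whose forward() does accept `scale`."""
    def forward(self, x, scale=1.0):
        return x


def attn_processor(proj, hidden_states):
    # Stand-in for the xformers attention processor: it always
    # forwards `scale`, so it only works with the LoRA-aware layer.
    return proj.forward(hidden_states, scale=1.0)


attn_processor(LoRACompatibleLinear(), "h")  # works

try:
    attn_processor(PlainLinear(), "h")
except TypeError as e:
    # Same shape of error as in the report:
    # forward() got an unexpected keyword argument 'scale'
    print(e)
```

If that reading is right, one thing worth trying as a workaround is skipping `pipe.enable_xformers_memory_efficient_attention()` and relying on PyTorch 2.x's built-in scaled-dot-product attention, which diffusers uses by default on torch >= 2.0.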

Logs

No response

System Info

- `diffusers` version: 0.22.0.dev0
- Platform: Linux-5.15.0-76-generic-x86_64-with-glibc2.35
- Python version: 3.10.13
- PyTorch version (GPU?): 2.1.0+cu121 (True)
- Huggingface_hub version: 0.17.3
- Transformers version: 4.34.0
- Accelerate version: 0.23.0
- xFormers version: 0.0.22.post4
- Using GPU in script?: <fill in>
- Using distributed or parallel set-up in script?: <fill in>

Who can help?

No response


Labels

bug (Something isn't working)
