Commit b2615ac

WoosukKwon authored and prashantgupta24 committed

[Bugfix] Fix pin_lora error in TPU executor (vllm-project#5760)

1 parent 8d15063 commit b2615ac

File tree

1 file changed: +3 −0 lines changed

vllm/executor/tpu_executor.py (+3)

@@ -82,6 +82,9 @@ def add_lora(self, lora_request: LoRARequest) -> bool:

     def remove_lora(self, lora_id: int) -> bool:
         raise NotImplementedError("LoRA is not implemented for TPU backend.")

+    def pin_lora(self, lora_id: int) -> bool:
+        raise NotImplementedError("LoRA is not implemented for TPU backend.")
+
     def list_loras(self) -> Set[int]:
         raise NotImplementedError("LoRA is not implemented for TPU backend.")
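The fix adds the missing pin_lora stub so the TPU executor fully implements the LoRA portion of the executor interface; without it, a caller invoking pin_lora on this executor would fail with an attribute error instead of a clear NotImplementedError. A minimal sketch of the pattern, with simplified class names (the real vLLM ExecutorBase interface has more methods and different signatures):

```python
from typing import Set


class ExecutorBase:
    # Simplified stand-in for the executor interface (hypothetical names).
    def add_lora(self, lora_request) -> bool:
        raise NotImplementedError

    def remove_lora(self, lora_id: int) -> bool:
        raise NotImplementedError

    def pin_lora(self, lora_id: int) -> bool:
        raise NotImplementedError

    def list_loras(self) -> Set[int]:
        raise NotImplementedError


class TPUExecutor(ExecutorBase):
    # The TPU backend does not support LoRA, so every LoRA method raises
    # an explicit NotImplementedError. Before this commit, pin_lora was
    # missing, so calls fell through to the base class (or failed outright)
    # without the backend-specific message.
    def remove_lora(self, lora_id: int) -> bool:
        raise NotImplementedError("LoRA is not implemented for TPU backend.")

    def pin_lora(self, lora_id: int) -> bool:
        raise NotImplementedError("LoRA is not implemented for TPU backend.")

    def list_loras(self) -> Set[int]:
        raise NotImplementedError("LoRA is not implemented for TPU backend.")
```

Raising NotImplementedError with a backend-specific message keeps the failure mode explicit and actionable for users who enable LoRA on an unsupported backend.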
