
Commit e078b90

Fix
Signed-off-by: Woosuk Kwon <woosuk.kwon@berkeley.edu>
1 parent 6b88f1f commit e078b90

1 file changed: +1 -1

vllm/v1/attention/backends/flash_attn.py

@@ -85,7 +85,7 @@ def __init__(
         if sliding_window is None:
             self.sliding_window = (-1, -1)
         else:
-            self.sliding_window = ((sliding_window - 1, 0))
+            self.sliding_window = (sliding_window - 1, 0)
         self.kv_cache_dtype = kv_cache_dtype
         if logits_soft_cap is None:
             # In flash-attn, setting logits_soft_cap as 0 means no soft cap.

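For reference, the old and new forms in this diff are equivalent in Python: an extra pair of parentheses around a tuple literal is redundant and does not create a nested tuple, so the change only simplifies the expression. The surrounding context sets (-1, -1) when no sliding window is configured, which matches flash-attn's convention of (-1, -1) meaning an unlimited window. A minimal standalone sketch of the equivalence (the window size 128 below is an arbitrary example, not taken from the commit):

# Redundant outer parentheses around a tuple literal do not change the object.
sliding_window = 128  # arbitrary example value, not from the commit

old_form = ((sliding_window - 1, 0))  # form removed by this commit
new_form = (sliding_window - 1, 0)    # form introduced by this commit

assert old_form == new_form == (127, 0)
assert isinstance(old_form, tuple) and len(old_form) == 2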