Did you test with `scripts/inference_multigpu.sh` or `scripts/app_multigpu_engine.sh`? I think the model should fit into 2x4090 according to #59 (comment).
Getting OOM when using 2x4090.

Trying t2v with `save_memory=True`, `cpu_offloading=False`, `variant=diffusion_transformer_768p`, `inference_multigpu=True`, and `bf16`. Is the model expected to fit into 2x4090, i.e. 48 GB total VRAM?