
OOM on 2x4090? #126

Open
gameveloster opened this issue Oct 20, 2024 · 1 comment

Comments

gameveloster commented Oct 20, 2024

Getting an OOM (out-of-memory) error when using 2x4090.

Trying text-to-video (t2v) generation with save_memory=True, cpu_offloading=False, variant=diffusion_transformer_768p, inference_multigpu=True, and bf16 precision.

Is the model expected to fit into 2x4090, total 48 GB VRAM?
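As a rough sanity check on whether the weights alone fit in 48 GB, bf16 stores each parameter in 2 bytes, so weight memory scales linearly with parameter count (activations, the text encoder, and the VAE add more on top). The parameter counts below are placeholder assumptions for illustration, not the actual size of diffusion_transformer_768p:

```python
# Back-of-envelope VRAM estimate for bf16 model weights.
# NOTE: the parameter counts are hypothetical examples, not the
# real size of the diffusion_transformer_768p checkpoint.

def bf16_weight_gb(num_params: float) -> float:
    """Each bf16 parameter occupies 2 bytes; convert to GiB."""
    return num_params * 2 / 1024**3

for billions in (2, 4, 8):
    gb = bf16_weight_gb(billions * 1e9)
    print(f"{billions}B params -> {gb:.1f} GiB of weights")
```

Even an 8B-parameter model would need only about 15 GiB for bf16 weights, so OOM at 48 GB total usually points to activations or an unsharded component rather than the weights themselves.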

@feifeiobama (Collaborator)

Did you test with scripts/inference_multigpu.sh or scripts/app_multigpu_engine.sh? I think the model should fit into 2x4090 according to #59 (comment).
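To see which GPU actually runs out of memory while one of those scripts is running, standard NVIDIA tooling can poll per-device usage. This is a generic monitoring sketch, not part of the repository's scripts; check each script for its actual arguments before launching:

```shell
# Poll per-GPU memory every 2 seconds while inference runs.
nvidia-smi --query-gpu=index,memory.used,memory.total --format=csv -l 2

# Hypothetical launch pinning the two 4090s; the script's real
# arguments may differ.
CUDA_VISIBLE_DEVICES=0,1 bash scripts/inference_multigpu.sh
```

If memory climbs steeply on only one of the two cards, the model is likely not being sharded across both GPUs as intended.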
