
Training on multiple gpus #34

Open
chenzhik opened this issue Jan 8, 2023 · 1 comment

Comments


chenzhik commented Jan 8, 2023

Dear Authors,

I noticed that your paper says the completion model is trained "using the Adam optimizer with initial learning rate 1e-4 (decayed by 0.7 every 40 epochs) and batch size 32 by NVIDIA TITAN Xp GPU". Do you mean that you trained on a single GPU, or on multiple GPUs?
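For reference, my reading of that schedule is a plain step decay: multiply the learning rate by 0.7 once every 40 epochs. A minimal sketch of the resulting per-epoch rate (this is my interpretation of the paper's wording, not the authors' actual code; in PyTorch it would correspond to `optim.lr_scheduler.StepLR(optimizer, step_size=40, gamma=0.7)`):

```python
def lr_at(epoch, base_lr=1e-4, gamma=0.7, step=40):
    """Step-decay schedule: base_lr scaled by gamma once per `step` epochs."""
    return base_lr * gamma ** (epoch // step)

# Learning rate at the start of each decay interval.
for e in (0, 40, 80, 120):
    print(e, lr_at(e))
```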

Thanks for your work; I look forward to your reply.


kobeees commented Jul 24, 2023

Hello, I ran the authors' official code for the completion task at 8192 resolution and found that training one epoch takes only 20 minutes, but validation takes 2 hours on an NVIDIA 3090 GPU. Have you encountered this situation? Thank you!
