samples_per_epoch #36

Open
JiangThea opened this issue Dec 6, 2024 · 0 comments
Comments

@JiangThea

I appreciate the authors' insightful work on this paper, and I have a question about the data processing it describes. According to the paper, the training phase uses 250,000 iterations, 50 epochs, and a batch size of 32, which implies 8,000,000 images processed in total, i.e., 160,000 images (5,000 iterations) per epoch. The pre-training phase uses 60,000 iterations, 300 epochs, and a batch size of 32, i.e., 1,920,000 images in total, or 6,400 images (200 iterations) per epoch. However, in the code, samples_per_epoch is set to 50,000, so each epoch only contains 50,000/32 ≈ 1,563 iterations. Could you clarify which of these settings reflects the actual basis for your training process? A short sketch of the arithmetic I am assuming follows.
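For concreteness, here is a minimal sketch of the bookkeeping behind the numbers above. The names `samples_per_epoch` and `batch_size` mirror the config values mentioned in the question, but the helper function itself is only illustrative and is not taken from the repository:

```python
import math

def iterations_per_epoch(samples_per_epoch: int, batch_size: int) -> int:
    """Optimizer steps contributed by one epoch, assuming each iteration
    consumes one batch and a final partial batch still counts as a step."""
    return math.ceil(samples_per_epoch / batch_size)

# Paper, training phase: 250,000 iterations over 50 epochs at batch size 32
images_per_epoch_train = 250_000 // 50 * 32      # 160,000 images per epoch

# Paper, pre-training phase: 60,000 iterations over 300 epochs at batch size 32
images_per_epoch_pretrain = 60_000 // 300 * 32   # 6,400 images per epoch

# Code: samples_per_epoch = 50,000 at batch size 32
code_iterations = iterations_per_epoch(50_000, 32)  # ~1,563 iterations per epoch

print(images_per_epoch_train, images_per_epoch_pretrain, code_iterations)
```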
