This repository has been archived by the owner on Jun 25, 2024. It is now read-only.
Thanks for your brilliant work on this Keras port of SSD, but I have a question about the generator constructed in the training demo, SSD_training.ipynb. In its __init__ function, these two attributes specify the number of batches, or number of steps, per epoch. I think this initialization makes the number of train/validation batches equal to the number of train/validation samples. Does that mean the batch size is assumed to be one?
This is just my superficial understanding, but it could be a problem that significantly prolongs training and makes the model overfit, since the validation loss increases as training goes on. If my thinking is wrong, please forgive me; I'm only a beginner in object detection and deep learning.
Hope you can share your opinion. Thanks for your patience.
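For illustration, here is a minimal sketch of how the per-epoch step counts in such a generator's __init__ are usually derived from the batch size. The class and attribute names are hypothetical and not taken from the repo's actual code; the point is that steps per epoch should be the sample count divided by the batch size, not the raw sample count.

```python
import math

class Generator:
    """Illustrative sketch only; names do not mirror the repo's code."""
    def __init__(self, train_keys, val_keys, batch_size):
        self.train_keys = train_keys
        self.val_keys = val_keys
        self.batch_size = batch_size
        # Steps per epoch = ceil(samples / batch_size).
        # Setting these equal to the raw sample counts would
        # effectively imply a batch size of one.
        self.train_batches = math.ceil(len(train_keys) / batch_size)
        self.val_batches = math.ceil(len(val_keys) / batch_size)

g = Generator(train_keys=list(range(1000)), val_keys=list(range(200)),
              batch_size=16)
print(g.train_batches)  # 63
print(g.val_batches)    # 13
```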
If I understand your question: a difference between the train and validation batch sizes does not affect training quality, because validation is prediction only (no weight updates). The validation batch size affects only the speed of the validation phase and depends on your memory size (the more memory you have, the larger the batch size you can use).
In fact, you can even use different batch sizes on different epochs, or even on different steps, but that is not common practice. The significance of the training batch size lies in SGD; read about that method if you want more info. The batch size at prediction time has nothing to do with SGD, only with speed and memory usage.
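To make the point above concrete, a small sketch (sample counts are made up for illustration) showing that the training and validation batch sizes are independent knobs: the training batch size shapes the SGD updates, while the validation batch size only changes how many prediction passes the validation phase takes.

```python
import math

# Hypothetical dataset sizes, chosen only for illustration.
n_train, n_val = 1000, 200

train_batch_size = 16  # affects SGD gradient noise and training dynamics
val_batch_size = 64    # affects only validation speed and memory, not weights

# Number of steps each phase runs per epoch.
train_steps = math.ceil(n_train / train_batch_size)
val_steps = math.ceil(n_val / val_batch_size)

print(train_steps)  # 63
print(val_steps)    # 4
```

A larger validation batch simply means fewer, bigger prediction calls per epoch; the computed validation loss is the same either way.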