Thank you for sharing your nice Torch source code.
In the paper "Recurrent Batch Normalization", the authors keep separate running (population) means and variances for each time step (up to T_max).
However, I couldn't find where these per-time-step running means and variances are computed in your code.
First, the bnlstm module is constructed with nngraph using nn.BatchNormalization().
After construction, clones of the module are created with model_utils.clone_many_times().
As far as I understand, in this setup the sample means and variances are computed at each time step, but the running means and variances are shared across time steps by model_utils.clone_many_times().
Could you kindly explain which part I am misunderstanding?
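For reference, here is a minimal sketch of what I understand the paper to describe, i.e. one nn.BatchNormalization instance per time step so that each step accumulates its own running statistics. This is my own illustration, not code from this repository; hiddenSize and T_max are hypothetical names, and sharing gamma/beta across steps (as the paper does) would need extra parameter-sharing code on top of this.

```lua
require 'nn'

-- Hypothetical sizes for illustration only.
local hiddenSize = 128   -- hidden state dimension
local T_max = 50         -- number of time steps with separate statistics

-- One BatchNormalization module per time step, so each keeps its own
-- running mean/variance instead of sharing a single buffer across time.
local bnPerStep = {}
for t = 1, T_max do
  bnPerStep[t] = nn.BatchNormalization(hiddenSize)
end

-- Normalize the hidden state h at time step t with the step-specific module;
-- steps beyond T_max reuse the statistics of step T_max, as in the paper.
local function normalize(h, t)
  return bnPerStep[math.min(t, T_max)]:forward(h)
end
```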
I am wondering about the same thing. It looks like the code just does the standard BN initialization and usage; there is no code that computes separate means and variances for different time steps, and I don't think Torch does this automatically.
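One way to check whether the clones really share the running statistics would be something like the sketch below. This is only an assumption-laden snippet, not code from the repository: it assumes the cloned graph contains nn.BatchNormalization modules and that they expose a running_mean buffer (older torch/nn versions use running_std instead).

```lua
-- Compare the storage of the running-statistics buffers across two clones.
-- If the pointers are equal, all time steps accumulate into one shared buffer.
local clones = model_utils.clone_many_times(bnlstm, T_max)
local bn1 = clones[1]:findModules('nn.BatchNormalization')[1]
local bn2 = clones[2]:findModules('nn.BatchNormalization')[1]
print(torch.pointer(bn1.running_mean) == torch.pointer(bn2.running_mean))
```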