Due to TensorFlow's compilation behavior, the second loop iteration reuses the optimizer object that was first initialized in the __init__ function, so the loss returns NaN (even if a new model object is declared within the loop, the first-initialized optimizer is still used).
To resolve this issue, do not rely on the optimizer's default value provided by the __init__ function; instead, assign a new optimizer in each and every fold, as in the example below:
from tensorflow.keras.optimizers import Adam
from min2net.model import MIN2Net

for fold in range(1, 6):
    # a new optimizer must be assigned in each and every fold
    optimizer = Adam(beta_1=0.9, beta_2=0.999, epsilon=1e-08)
    model = MIN2Net(optimizer=optimizer)
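For contrast, below is a minimal sketch of the anti-pattern that triggers the NaN loss, with the optimizer constructed once outside the loop. The import path from min2net.model and the variable name shared_optimizer are assumptions for illustration; only the optimizer reuse itself is the point.

from tensorflow.keras.optimizers import Adam
from min2net.model import MIN2Net  # import path assumed from the min2net package layout

# Anti-pattern (do NOT do this): the optimizer is created once and reused for
# every fold. TensorFlow/Keras ties the optimizer's internal state to the
# first compiled model, so later folds pick up that stale state and the loss
# can return NaN, even though `model` itself is a fresh object each time.
shared_optimizer = Adam(beta_1=0.9, beta_2=0.999, epsilon=1e-08)  # created only once
for fold in range(1, 6):
    model = MIN2Net(optimizer=shared_optimizer)  # every fold reuses the same instance
    # ... training for this fold ...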
However, this issue does not affect the experimental results of our paper, since we declared a new optimizer in every loop: https://github.com/IoBT-VISTEC/MIN2Net/blob/main/experiments/train_MIN2Net_k-fold-CV.py#L41