Optimizer issue (found loss = NaN in second fold) #1

Open
xydxdy opened this issue May 18, 2022 · 0 comments
Labels
bug Something isn't working

Comments

xydxdy (Collaborator) commented May 18, 2022

Due to a TensorFlow compilation issue, the second fold ends up reusing the optimizer object that was initialized for the first fold via the __init__ function, and the loss returns NaN. (Even if a new model object is declared within the loop, the optimizer initialized in the first fold is reused.)
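
One plausible way this can happen in Python (a minimal sketch with a hypothetical Model class, not the actual MIN2Net code) is a default optimizer argument in __init__: default arguments are evaluated once, when the function is defined, so every instantiation that omits optimizer shares the same Adam object, together with any state it accumulated in earlier folds:

from tensorflow.keras.optimizers import Adam

class Model:
    # The default Adam(...) below is created once, at definition time,
    # not once per instantiation.
    def __init__(self, optimizer=Adam(beta_1=0.9, beta_2=0.999, epsilon=1e-08)):
        self.optimizer = optimizer  # shared across instances when the default is used

for fold in range(1, 3):
    model = Model()                   # fold 2 silently reuses fold 1's optimizer
    print(fold, id(model.optimizer))  # prints the same id in every fold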

To resolve this issue, do not rely on the optimizer created once by the __init__ function; instead, assign a fresh optimizer in each and every fold, as in the example below:

from tensorflow.keras.optimizers import Adam
from min2net.model import MIN2Net  # import path as used in the repository's experiment scripts

for fold in range(1, 6):
    # the optimizer must be assigned in each and every fold
    optimizer = Adam(beta_1=0.9, beta_2=0.999, epsilon=1e-08)
    model = MIN2Net(optimizer=optimizer)
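
Constructing a fresh Adam inside the loop guarantees that each fold starts from zeroed moment estimates and step count, so later folds train exactly like the first instead of inheriting stale optimizer state.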

However, this issue does not affect the experimental results of our paper, since we declared a new optimizer in every loop: https://github.com/IoBT-VISTEC/MIN2Net/blob/main/experiments/train_MIN2Net_k-fold-CV.py#L41
