Seems that "Softmax" should be used instead of "Sigmoid"? #5

Open

ABC67876 opened this issue Jul 23, 2023 · 0 comments

Comments


ABC67876 commented Jul 23, 2023

Hello, and thanks for your implementation. However, I found that the "probs" for binary classification don't sum to 1.0.


        prediction = model.forward(image.float())
        # BCEWithLogitsLoss applies a sigmoid internally, per logit
        loss = torch.nn.BCEWithLogitsLoss(weight=weight)(prediction, label)
        loss.backward()
        optimizer.step()

        loss_value = loss.item()
        losses.append(loss_value)

        # element-wise sigmoid: the two class "probabilities" are
        # computed independently, so they need not sum to 1.0
        probas = torch.sigmoid(prediction)

        y_trues.append(int(label[0][1]))
        y_preds.append(probas[0][1].item())

The code in question is in "https://github.com/ahmedbesbes/mrnet/blob/master/train.py"
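To make the point concrete, here is a minimal sketch (with made-up logit values, not taken from the repo) contrasting the two activations on a pair of class logits: an element-wise sigmoid squashes each logit independently, so the two outputs generally do not sum to 1, while a softmax over the class dimension always yields a proper probability distribution.

```python
import torch

# Illustrative raw logits from a two-output binary classifier head
logits = torch.tensor([[2.0, -1.0]])

# Element-wise sigmoid: each logit is squashed independently,
# so the two values need not sum to 1.
sigmoid_probs = torch.sigmoid(logits)
print(sigmoid_probs.sum().item())  # ~1.15, not 1.0

# Softmax normalizes across the class dimension (dim=1),
# so the outputs sum to 1 by construction.
softmax_probs = torch.softmax(logits, dim=1)
print(softmax_probs.sum().item())  # 1.0 (up to float precision)
```

Note that the choice of activation should match the loss: `BCEWithLogitsLoss` pairs with per-logit sigmoid, whereas a softmax interpretation of a two-logit head would pair with `CrossEntropyLoss`, which applies log-softmax internally.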
