Error during training #18

Open
sushantakpani opened this issue Mar 16, 2020 · 4 comments

@sushantakpani

I got this error while running
python coref.py

loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
TypeError: log() got an unexpected keyword argument 'dim'

@sushantakpani
Author

I have removed the dim parameter from log():
#loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes)).clamp_(eps, 1-eps)) * -1)

Is this the correct approach?

@sushantakpani
Author

sushantakpani commented Mar 16, 2020

There is no dim parameter in torch.log()
(https://pytorch.org/docs/stable/torch.html#torch.log)

torch.log(input, out=None) → Tensor
Returns a new tensor with the natural logarithm of the elements of input.
y_i = log_e(x_i)

Parameters
input (Tensor) – the input tensor.
out (Tensor, optional) – the output tensor.
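
A quick check with a generic tensor (not the project's data) suggests the dim keyword belongs to torch.sum, not torch.log:

```python
import torch

x = torch.rand(3, 4)

# torch.log(x, dim=0)               # TypeError: log() got an unexpected keyword argument 'dim'
y = torch.sum(torch.log(x), dim=0)  # dim is a torch.sum argument
print(y.shape)                      # torch.Size([4])
```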

@troublemaker-r

I got this error while running
python coref.py

loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps), dim=0) * -1)
TypeError: log() got an unexpected keyword argument 'dim'

I'm getting the same error. Were you able to solve it?

@sushantakpani
Author

It seems this is the correct way:
loss = torch.sum(torch.log(torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1-eps)), dim=0) * -1
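
As a sanity check, here is a minimal sketch of that corrected line with dummy tensors; the shapes and contents of probs and gold_indexes below are assumptions for illustration, not taken from coref.py:

```python
import torch

eps = 1e-8
probs = torch.softmax(torch.randn(4, 5), dim=1)  # hypothetical antecedent probabilities per mention
gold_indexes = torch.zeros(4, 5)
gold_indexes[:, 0] = 1                           # hypothetical gold-antecedent indicator matrix

# dim=1 sums the probability mass on gold antecedents for each mention;
# the outer sum over dim=0 adds up the per-mention log-probabilities.
loss = torch.sum(
    torch.log(
        torch.sum(torch.mul(probs, gold_indexes), dim=1).clamp_(eps, 1 - eps)
    ),
    dim=0,
) * -1
print(loss)  # a scalar negative log-likelihood
```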
