RuntimeError: grad can be implicitly created only for scalar outputs #42

ming-ming15 opened this issue Nov 17, 2023 · 0 comments
Hi, thanks for your amazing work.
The following error occurs when running the provided bash file:
```
Traceback (most recent call last):
  File "/media/disk8T/zmx/sparseneus/exp_runner_finetune.py", line 609, in <module>
    runner.train()
  File "/media/disk8T/zmx/sparseneus/exp_runner_finetune.py", line 320, in train
    loss.backward()
  File "/home/zhumingxia/anaconda3/envs/spneus/lib/python3.10/site-packages/torch/_tensor.py", line 488, in backward
    torch.autograd.backward(
  File "/home/zhumingxia/anaconda3/envs/spneus/lib/python3.10/site-packages/torch/autograd/__init__.py", line 190, in backward
    grad_tensors_ = _make_grads(tensors, grad_tensors, is_grads_batched=False)
  File "/home/zhumingxia/anaconda3/envs/spneus/lib/python3.10/site-packages/torch/autograd/__init__.py", line 85, in _make_grads
    raise RuntimeError("grad can be implicitly created only for scalar outputs")
RuntimeError: grad can be implicitly created only for scalar outputs
```
Thank you in advance.
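
For reference (this is general PyTorch behaviour, not specific to SparseNeuS): the error is raised whenever `backward()` is called on a tensor that is not a scalar and no explicit `gradient` argument is supplied, e.g. when a per-ray or per-pixel loss is left unreduced. Below is a minimal sketch of the failure and the two usual workarounds, using a made-up toy loss rather than the loss terms actually computed in `exp_runner_finetune.py`:

```python
import torch

# Toy example only: a non-scalar "loss" (shape (4,)) that was never reduced.
pred = torch.rand(4, requires_grad=True)
target = torch.zeros(4)
loss = (pred - target) ** 2  # not reduced -> 1-D tensor, not a scalar

try:
    loss.backward()  # raises: grad can be implicitly created only for scalar outputs
except RuntimeError as e:
    print(e)

# Workaround 1: reduce the loss to a scalar before calling backward().
loss.mean().backward()

# Workaround 2: pass an explicit gradient with the same shape as the output.
pred2 = torch.rand(4, requires_grad=True)
loss2 = (pred2 - target) ** 2
loss2.backward(gradient=torch.ones_like(loss2))
```

If the provided bash script triggers this, it may be worth checking whether the loss returned in `train()` ends up with more than zero dimensions (for example because a `.mean()`/`.sum()` reduction is skipped for some batch or weight configuration).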
