Not sure if this should be solved here, in cholespy, or in nanobind. The `from_differential` function throws an error if its second argument is a `torch.nn.Parameter` rather than a plain tensor. `Parameter` is derived directly from `Tensor`, so there is no reason the cast should fail.
```
TypeError: solve(): incompatible function arguments. The following argument types are supported:
    1. solve(self, b: tensor[dtype=float32, order='C'], x: tensor[dtype=float32, order='C']) -> None

Invoked with types: CholeskySolverF, Parameter, Tensor
```
It's quite hard to work around this from the outside. For example, `from_differential(M, x.data)` doesn't work, because the gradient will then be written to `x.data.grad`, whereas the optimizer expects it in `x.grad`.
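A minimal sketch of the two points above, with no dependency on this library (the large-steps/cholespy calls themselves are not reproduced here): `Parameter` is a direct subclass of `Tensor`, so a cast that accepts `Tensor` should also accept it, and `x.data` is a plain-`Tensor` view that shares storage with `x` but has a separate autograd identity:

```python
import torch

# Parameter IS a Tensor, so an isinstance-style cast should accept it;
# only a check on the exact type would reject it.
x = torch.nn.Parameter(torch.zeros(3))
assert isinstance(x, torch.Tensor)   # subclass relationship holds
assert type(x) is not torch.Tensor   # but the exact type differs

# The x.data workaround: same storage, but a distinct object that is
# detached from x as far as autograd is concerned, so any gradient
# attached to it never lands in x.grad.
d = x.data
assert type(d) is torch.Tensor       # plain Tensor, so the cast succeeds
assert d.data_ptr() == x.data_ptr()  # shares the underlying storage
assert d is not x and d.grad is None # separate object, separate .grad
```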