Unnecessary typeassert for ADAGrad #1410
The asserts were ostensibly added in the name of stability (see #1388), but it seems like they're not general enough. I'm not sure what a good solution would be here given the limitations of the current optimizer interface. @simeonschaub, do you have any thoughts?
Let's just remove the type assertions; the performance impact should be negligible.
I don't know how badly it would impact inference, but a possible fix would be to replace
Ah, I see the problem.

Generally, inferring something to a non-concrete type is not that useful for inference if it's not a small union, so I would be surprised if that made much of a difference over just deleting the type annotation. It might also not necessarily be correct, since there's no guarantee that

A possible solution, one that could be beneficial over removing this annotation altogether, would be to move all broadcasted calls after the call to
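The function-barrier idea above could be sketched as follows. This is only a sketch, not Flux's actual implementation: the update formula is the textbook ADAGrad rule, and the cache and hyperparameter names (`state`, `η`, `ϵ`) are assumptions.

```julia
# ADAGrad-style update split across a function barrier (sketch; η and ϵ are
# the usual ADAGrad hyperparameters, `state` is a per-parameter IdDict cache).
function apply_sketch!(η, ϵ, state::IdDict, x, Δ)
    # get! from an IdDict is not inferable to a concrete element type...
    acc = get!(() -> zero(x), state, x)
    # ...so do the broadcasts behind a barrier, where `acc`'s runtime type
    # is concrete and the broadcast kernels can specialize on it.
    return _update!(η, ϵ, acc, Δ)
end

function _update!(η, ϵ, acc, Δ)
    @. acc += abs2(Δ)              # accumulate squared gradients
    @. Δ *= η / (sqrt(acc) + ϵ)    # rescale the gradient in place
    return Δ
end
```

The call into `_update!` pays one dynamic dispatch on `acc`'s runtime type; past the barrier, the broadcast loops are fully specialized regardless of what the cache lookup was inferred as.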
Can we check if the inference issue persists in Optimisers.jl? If not, maybe we prioritize moving to it. There was a 20% hit in my testing, but it might be worth it in the long run.
That's what https://github.com/FluxML/Optimisers.jl is trying to achieve, hence my comment on "the current optimizer interface" :) Edit: Dhairya beat me to the punch!
Unless someone is actively working on the function-barrier approach, I suggest we fix this issue by removing the type assert for the time being, since it can be quite invalidating, as testified by the fact that there are two similar issues. (In fact, we should add some tests when closing this.)
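A regression test along those lines might look like the sketch below. It exercises only the cache-lookup pattern using Base, since how Flux's own `apply!` wires up its state is an implementation detail; the `lookup` helper here is hypothetical.

```julia
using Test

# Hypothetical stand-in for the optimiser's per-parameter cache lookup.
state = IdDict()
lookup(x) = get!(() -> zero(x), state, x)

@testset "optimiser state lookup accepts views" begin
    x = view(rand(4, 4), :, 1)   # a SubArray parameter
    acc = lookup(x)              # zero(x) allocates a plain Vector here
    @test size(acc) == size(x)
    @test !(acc isa typeof(x))   # an `::typeof(x)` assert would therefore throw
end
```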
So this is probably a copy of #832 and #816, but I don't think the arguments apply in my case.

When using `apply!` with ADAGrad, there is the line at Flux.jl/src/optimise/optimisers.jl, line 348 in 075f42b. The type assertion assumes that the cached value should be of the same type as `x`, and that's an issue. I am working with views, so the type of `x` is `SubArray{...}`; however, `zero(x)` will of course return an `Array{...}`, and the resulting output will not pass the type assertion.
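The mismatch reproduces without Flux at all. The asserted lookup below is a hedged reconstruction of the pattern at the referenced line, not a verbatim quote:

```julia
state = IdDict()
x = view(rand(4, 4), :, 1)   # x isa SubArray

# zero(x) allocates a plain Array, so the cached value can never
# match typeof(x) when x is a view:
acc = get!(() -> zero(x), state, x)
println(typeof(x))    # SubArray{Float64, 1, ...}
println(typeof(acc))  # Vector{Float64}

# An assert in the style of the referenced line then throws:
try
    get!(() -> zero(x), state, x)::typeof(x)
catch err
    println(err isa TypeError)  # true
end
```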