use vmap for all activations #221
base: master
Conversation
could you update the OP with the benchmark result for your system?
we are missing
@@ -118,7 +119,7 @@ elu(x::RealOrFloatType, α = one(x)) = vifelse(x ≥ 0, x / one(x), α * (exp(x)
     activation function.
     """
 function gelu(x::RealOrFloatType)
-    p = oftype(x / 1, π)
+    p = oftype(x / 1, Float64(π))
Why is this hardcoding the type here?
Zygote will fail for `oftype(..., ::Irrational)`.
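For context, a minimal sketch of the type behaviour under discussion (illustrative, not code from the PR): `x / 1` promotes `x` to a floating-point type, and `oftype` converts its second argument to that type. In the original form the conversion receives an `Irrational`; with the change it receives a plain `Float64`, which is the case the reviewer says Zygote handles.

```julia
# Illustrative only: what oftype receives in each form.
x = 1.0f0                     # example Float32 input
oftype(x / 1, π)              # the conversion is called with π::Irrational{:π}
oftype(x / 1, Float64(π))     # π is materialized as a Float64 first, then narrowed
```

Both expressions evaluate to `Float32(π)`; the difference is only in the argument type that Zygote has to differentiate through.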
Needs testing against Zygote and CUDA to make sure we don't break any dispatch that we are relying on.
Zygote is already tested against here. But testing against CUDA may be out of scope for NNlib?
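A minimal sketch of the kind of Zygote check being discussed (assumes the current NNlib exports; the input values are arbitrary):

```julia
using Zygote
using NNlib: gelu

# Differentiate a reduction over the activation to exercise its pullback.
x = randn(Float32, 8)
g, = gradient(v -> sum(gelu.(v)), x)
@assert eltype(g) == Float32   # element-type stability is what the π conversion affects
```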
Do we need to update the lower bound for the LoopVectorization version in Project.toml?
Given #224 and the discussion in FluxML/Flux.jl#1272, I no longer think this is the correct way forward. The whole vectorization logic should live in Flux's layer definitions, and we should revert NNlib to its pre-LoopVectorization state.
We should revert the vectorisation stuff and release a patch that drops the packages from dependencies.
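A hypothetical sketch of what moving the vectorization logic into Flux's layer definitions could look like (the function names are made up for illustration; this is not code from either repository):

```julia
using NNlib: gelu
using LoopVectorization: vmap

# NNlib keeps plain scalar activations; the layer decides how to map them.
dense_broadcast(W, b, x) = gelu.(W * x .+ b)       # default: simple broadcast
dense_vmap(W, b, x)      = vmap(gelu, W * x .+ b)  # opt-in vectorized fast path
```

This would keep LoopVectorization out of NNlib's dependencies while leaving the fast path available to downstream packages.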
Can you also add the same benchmarks using a simple
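A minimal benchmark sketch in the spirit of that request, assuming the truncated comparison refers to a plain broadcast baseline (array size is arbitrary):

```julia
using BenchmarkTools
using NNlib: gelu
using LoopVectorization: vmap

x = randn(Float32, 4096)
@btime vmap(gelu, $x)   # vectorized map over the activation
@btime gelu.($x)        # simple broadcast baseline
```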
fix #220