
Quadrature.jl doesn't support GPU with batch != 0 #61

Open
KirillZubov opened this issue Mar 18, 2021 · 1 comment

Comments

@KirillZubov (Member)

Currently there is no algorithm that supports vectorization (batch != 0) and also works on the GPU:

using CUDA, Flux, DiffEqFlux, Quadrature, Cubature, Adapt

CUDA.allowscalar(false)

inner = 12  # hidden-layer width; not specified in the original snippet
chain = FastChain(FastDense(2, inner, Flux.σ),
                  FastDense(inner, inner, Flux.σ),
                  FastDense(inner, inner, Flux.σ),
                  FastDense(inner, inner, Flux.σ),
                  FastDense(inner, 1))
initθ = initial_params(chain) |> gpu
x_ = rand(2, 10) |> gpu
chain(x_, initθ)

# Integrand: adapt the quadrature nodes to the same array type as the parameters.
function g(x, p)
    x = adapt(typeof(p), x)
    sum(abs2, chain(x, p), dims=1)
end

lb = [1.0, 1.0]
ub = [3.0, 3.0]
p = [2.0, 3.0, 4.0] |> gpu
prob = QuadratureProblem(g, lb, ub, p)

function testf3(p, p_)
    prob = QuadratureProblem(g, lb, ub, p, batch=100, nout=1)
    solve(prob, CubatureJLp(); reltol=1e-3, abstol=1e-3)[1]
end

testf3(p, nothing)

Running this fails with a scalar indexing error:

scalar getindex is disallowed

Stacktrace:
 [1] #17 at /root/.julia/packages/Cubature/5zwuu/src/Cubature.jl:215 [inlined]
 [2] disable_sigint(::Cubature.var"#17#18"{Bool,Bool,Int64,Float64,Float64,Int64,Int32,Int64,Array{Float64,1},Array{Float64,1},Array{Float64,1},Array{Float64,1},Cubature.IntegrandData{Quadrature.var"#79#91"{CuArray{Float32,1}}},Ptr{Nothing}}) at ./c.jl:446
 [3] cubature(::Bool, ::Bool, ::Bool, ::Bool, ::Int64, ::Quadrature.var"#79#91"{CuArray{Float32,1}}, ::Array{Float64,1}, ::Array{Float64,1}, ::Float64, ::Float64, ::Int64, ::Int32) at /root/.julia/packages/Cubature/5zwuu/src/Cubature.jl:169
 [4] #pcubature_v#24 at /root/.julia/packages/Cubature/5zwuu/src/Cubature.jl:230 [inlined]
 [5] __solvebp_call(::QuadratureProblem{true,CuArray{Float32,1},typeof(g),Array{Float64,1},Array{Float64,1},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}}, ::CubatureJLp, ::Quadrature.ReCallVJP{Quadrature.ZygoteVJP}, ::Array{Float64,1}, ::Array{Float64,1}, ::CuArray{Float32,1}; reltol::Float64, abstol::Float64, maxiters::Int64, kwargs::Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}) at /root/.julia/packages/Quadrature/NPUfc/src/Quadrature.jl:307
 [6] #__solvebp#11 at /root/.julia/packages/Quadrature/NPUfc/src/Quadrature.jl:153 [inlined]
 [7] solve(::QuadratureProblem{true,CuArray{Float32,1},typeof(g),Array{Float64,1},Array{Float64,1},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}}, ::CubatureJLp; sensealg::Quadrature.ReCallVJP{Quadrature.ZygoteVJP}, kwargs::Base.Iterators.Pairs{Symbol,Float64,Tuple{Symbol,Symbol},NamedTuple{(:reltol, :abstol),Tuple{Float64,Float64}}}) at /root/.julia/packages/Quadrature/NPUfc/src/Quadrature.jl:149
 [8] testf3(::CuArray{Float32,1}, ::Nothing) at ./In[58]:24
 [9] top-level scope at In[58]:27
 [10] include_string(::Function, ::Module, ::String, ::String) at ./loading.jl:1091
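
A possible workaround (not from the original report, and untested here) is to keep Cubature's buffers on the CPU and move only the batch of quadrature nodes to the GPU inside the integrand, copying the result back into the output buffer. A minimal sketch, assuming Quadrature.jl's in-place f(dx, x, p) integrand form; g! is a hypothetical name:

# Sketch: evaluate on the GPU, write back into Cubature's CPU output buffer.
function g!(dx, x, p)
    x_gpu = adapt(typeof(p), x)              # move the (dim, nbatch) node matrix to the GPU
    y = sum(abs2, chain(x_gpu, p), dims=1)   # network evaluation on the GPU
    dx .= vec(Array(y))                      # copy the 1×nbatch result back to the CPU buffer
    nothing
end

prob = QuadratureProblem(g!, lb, ub, p, batch=100, nout=1)
solve(prob, CubatureJLp(); reltol=1e-3, abstol=1e-3)[1]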

The same applies to automatic differentiation; it doesn't work with the GPU either:

dp1 = ForwardDiff.gradient(p->testf3(p, nothing),p)
MethodError: no method matching ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3}(::Float64, ::ForwardDiff.Partials{3,Float32})
Closest candidates are:
  ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3}(::Number) where {T, V, N} at /root/.julia/packages/ForwardDiff/sqhTO/src/dual.jl:73
  ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3}(::T) where T<:Number at boot.jl:716
  ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3}(::V, ::ForwardDiff.Partials{N,V}) where {T, V, N} at /root/.julia/packages/ForwardDiff/sqhTO/src/dual.jl:17
  ...

Stacktrace:
 [1] __solvebp(::QuadratureProblem{true,CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1},typeof(g),Array{Float64,1},Array{Float64,1},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}}, ::CubatureJLp, ::Quadrature.ReCallVJP{Quadrature.ZygoteVJP}, ::Array{Float64,1}, ::Array{Float64,1}, ::CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1}; kwargs::Base.Iterators.Pairs{Symbol,Float64,Tuple{Symbol,Symbol},NamedTuple{(:reltol, :abstol),Tuple{Float64,Float64}}}) at /root/.julia/packages/Quadrature/NPUfc/src/Quadrature.jl:629
 [2] solve(::QuadratureProblem{true,CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1},typeof(g),Array{Float64,1},Array{Float64,1},Base.Iterators.Pairs{Union{},Union{},Tuple{},NamedTuple{(),Tuple{}}}}, ::CubatureJLp; sensealg::Quadrature.ReCallVJP{Quadrature.ZygoteVJP}, kwargs::Base.Iterators.Pairs{Symbol,Float64,Tuple{Symbol,Symbol},NamedTuple{(:reltol, :abstol),Tuple{Float64,Float64}}}) at /root/.julia/packages/Quadrature/NPUfc/src/Quadrature.jl:149
 [3] testf3(::CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1}, ::Nothing) at ./In[52]:26
 [4] #37 at ./In[54]:1 [inlined]
 [5] vector_mode_dual_eval at /root/.julia/packages/ForwardDiff/sqhTO/src/apiutils.jl:37 [inlined]
 [6] vector_mode_gradient(::var"#37#38", ::CuArray{Float32,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#37#38",Float32},Float32,3,CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1}}) at /root/.julia/packages/ForwardDiff/sqhTO/src/gradient.jl:106
 [7] gradient(::Function, ::CuArray{Float32,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#37#38",Float32},Float32,3,CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1}}, ::Val{true}) at /root/.julia/packages/ForwardDiff/sqhTO/src/gradient.jl:19
 [8] gradient(::Function, ::CuArray{Float32,1}, ::ForwardDiff.GradientConfig{ForwardDiff.Tag{var"#37#38",Float32},Float32,3,CuArray{ForwardDiff.Dual{ForwardDiff.Tag{var"#37#38",Float32},Float32,3},1}}) at /root/.julia/packages/ForwardDiff/sqhTO/src/gradient.jl:17 (repeats 2 times)
 [9] top-level scope at In[54]:1
 [10] include_string(::Function, ::Module, ::String, ::String) at ./loading.jl:1091
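
The MethodError above is an eltype mismatch rather than a GPU limitation per se: the Float64 bounds and integrand values meet the Float32 GPU parameters, so ForwardDiff tries to construct a Dual{…,Float32,3} from a Float64. A hedged sketch of the eltype fix (it does not address the batch-on-GPU problem itself) is to keep everything in Float32:

# Keep all eltypes at Float32 so Dual construction doesn't mix Float64 values
# with Float32 partials; the batch/GPU limitation above still applies.
lb = Float32[1.0, 1.0]
ub = Float32[3.0, 3.0]
p  = Float32[2.0, 3.0, 4.0] |> gpu
dp1 = ForwardDiff.gradient(p -> testf3(p, nothing), p)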
@lxvm (Collaborator)

lxvm commented Nov 3, 2023

@KirillZubov Would you be able to test this code again or provide an MWE? I think Integrals@4 should be able to handle this case.
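
For reference, a hedged sketch of what such an MWE might look like against the Integrals.jl v4 interface (CPU-only here; BatchIntegralFunction and the (lb, ub) domain tuple are the v4 names, and a GPU variant would additionally move p to the device):

using Integrals, Cubature

# Batched integrand: x is a (dim, nbatch) matrix, one output value per column.
f(x, p) = vec(sum(abs2, p .* x; dims=1))
prob = IntegralProblem(BatchIntegralFunction(f), ([1.0, 1.0], [3.0, 3.0]), [2.0, 3.0])
sol = solve(prob, CubatureJLp(); reltol=1e-3, abstol=1e-3)
sol.u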
