
Integrate with POI to improve UX #262

Merged
merged 28 commits into from
Jan 31, 2025
Conversation

joaquimg
Member

@joaquimg joaquimg commented Dec 27, 2024

This PR adds multiple improvements to DiffOpt, leading to a major UX improvement.

About the integration with POI:

  • with_parametric_opt_interface is false by default, for now. I want to make it true.
  • when using POI, only (Forward/Reverse)ConstraintSet for MOI.Parameter is supported (ObjectiveFunction and ConstraintFunction are not supported)
  • for now, ConstraintSet only works for MOI.Parameter. We can add EqualTo etc. in a future PR.
  • the pure usage of sensitivities plays well with 🚀 Add NonLinearProgram Support to DiffOpt.jl #260
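
As a rough sketch of the resulting UX (illustrative only, adapted from the MWEs further down this thread; the HiGHS inner solver and the MOI.Parameter-wrapped perturbation value are assumptions, not necessarily the final API):

```julia
# Sketch of the POI-integrated workflow described above.
using JuMP, DiffOpt, HiGHS

model = Model(
    () -> DiffOpt.diff_optimizer(
        HiGHS.Optimizer;
        with_parametric_opt_interface = true,  # opt in; false by default for now
    ),
)
@variable(model, x >= 0)
@variable(model, p in MOI.Parameter(2.0))
@constraint(model, con, x <= p)
@objective(model, Max, x)
optimize!(model)

# With POI, only (Forward/Reverse)ConstraintSet on MOI.Parameter is supported.
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), MOI.Parameter(1.0))
DiffOpt.forward_differentiate!(model)
dx_dp = MOI.get(model, DiffOpt.ForwardVariablePrimal(), x)  # sensitivity of x w.r.t. p
```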

TODO:

@joaquimg joaquimg requested a review from blegat December 27, 2024 23:45
@joaquimg
Member Author

The last commit is an attempt to fix the error raised by:

    using JuMP, DiffOpt, HiGHS

    b = [1.0, 2.0]

    m = Model(
        () -> DiffOpt.diff_optimizer(
            HiGHS.Optimizer;
            with_parametric_opt_interface = true,
        ),
    )
    @variable(m, x[1:2] >= 0)
    @variable(m, c[1:2] in MOI.Parameter.(b))
    @constraint(m, con, x <= c)
    @objective(m, Max, sum(x))
    optimize!(m)

    MOI.set(m, DiffOpt.ReverseVariablePrimal(), m[:x][1], 1.0)
    DiffOpt.reverse_differentiate!(m)

which is:

ERROR: ArgumentError: Bridge of type `ScalarizeBridge` does not support accessing the attribute `DiffOpt.ReverseConstraintFunction()`. If you encountered this error unexpectedly, it probably means your model has been reformulated using the bridge, and you are attempting to query an attribute that we haven't implemented yet for this bridge. Please open an issue at https://github.com/jump-dev/MathOptInterface.jl/issues/new and provide a reproducible example explaining what you were trying to do.
Stacktrace:
  [1] get(::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ReverseConstraintFunction, bridge::MathOptInterface.Bridges.Constraint.ScalarizeBridge{…})
    @ MathOptInterface.Bridges C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge.jl:149
  [2] (::MathOptInterface.Bridges.var"#3#4"{…})(bridge::MathOptInterface.Bridges.Constraint.ScalarizeBridge{…})
    @ MathOptInterface.Bridges C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge_optimizer.jl:324
  [3] (::MathOptInterface.Bridges.var"#1#2"{…})()
    @ MathOptInterface.Bridges C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge_optimizer.jl:309
  [4] call_in_context(map::MathOptInterface.Bridges.Variable.Map, bridge_index::Int64, f::MathOptInterface.Bridges.var"#1#2"{…})
    @ MathOptInterface.Bridges.Variable C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\Variable\map.jl:620
  [5] call_in_context
    @ C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\Variable\map.jl:651 [inlined]
  [6] call_in_context
    @ C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge_optimizer.jl:306 [inlined]
  [7] call_in_context
    @ C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge_optimizer.jl:321 [inlined]
  [8] get(b::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ReverseConstraintFunction, ci::MathOptInterface.ConstraintIndex{…})
    @ MathOptInterface.Bridges C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\bridge_optimizer.jl:1497
  [9] get(model::DiffOpt.Optimizer{…}, attr::DiffOpt.ReverseConstraintFunction, ci::MathOptInterface.ConstraintIndex{…})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\moi_wrapper.jl:715
 [10] _constraint_get_reverse!(model::ParametricOptInterface.Optimizer{…}, vector_affine_constraint_cache_dict::MathOptInterface.Utilities.DoubleDicts.DoubleDictInner{…}, ::Type{…})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\parameters.jl:384
 [11] reverse_differentiate!(model::ParametricOptInterface.Optimizer{Float64, DiffOpt.Optimizer{…}})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\parameters.jl:533
 [12] reverse_differentiate!
    @ C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:333 [inlined]
 [13] reverse_differentiate!(model::MathOptInterface.Utilities.CachingOptimizer{…})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:318
 [14] reverse_differentiate!(model::Model)
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:303
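
One possible direction for a fix (a hedged sketch only, not necessarily the PR's actual change: the bridge field name `scalar_constraints` and the use of MOI.Utilities.vectorize are assumptions about MOI internals) would be to teach ScalarizeBridge to forward the attribute to its scalarized constraints and re-vectorize the results:

```julia
# Hypothetical: forward ReverseConstraintFunction through ScalarizeBridge by
# querying each scalarized constraint and re-vectorizing the results.
function MOI.get(
    model::MOI.ModelLike,
    attr::DiffOpt.ReverseConstraintFunction,
    bridge::MOI.Bridges.Constraint.ScalarizeBridge,
)
    # assumption: the bridge stores its scalar constraints in this field
    scalars = [MOI.get(model, attr, ci) for ci in bridge.scalar_constraints]
    return MOI.Utilities.vectorize(scalars)
end
```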

@joaquimg
Member Author

joaquimg commented Dec 28, 2024

This still fails:

    using JuMP, DiffOpt, SCS, LinearAlgebra

    num_A = 2
    ##### SecondOrderCone #####
    _x_hat = rand(num_A)
    μ = rand(num_A) * 10
    Σ_12 = rand(num_A, num_A)
    Σ = Σ_12 * Σ_12' + 0.1 * I
    γ = 1.0
    model = direct_model(
        DiffOpt.diff_optimizer(
            SCS.Optimizer;
            with_parametric_opt_interface = true,
        ),
    )
    # model = direct_model(POI.Optimizer(DiffOpt.diff_optimizer(SCS.Optimizer)))
    set_silent(model)
    @variable(model, x[1:num_A])
    @variable(model, x_hat[1:num_A] in MOI.Parameter.(_x_hat))
    @variable(model, norm_2)
    # (x - x_hat)^T Σ^-1 (x - x_hat) <= γ
    @constraint(
        model,
        (x - μ)' * inv(Σ) * (x - μ) <= γ,
    )
    # norm_2 >= ||x - x_hat||_2
    @constraint(model, [norm_2; x - x_hat] in SecondOrderCone())
    @objective(model, Min, norm_2)
    optimize!(model)
    # MOI.set.(model, POI.ForwardParameter(), x_hat, ones(num_A))
    MOI.set.(model, DiffOpt.ForwardConstraintSet(), ParameterRef.(x_hat), 1)
    DiffOpt.forward_differentiate!(model) # ERROR

with:

ERROR: MethodError: no method matching throw_set_error_fallback(::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, ::DiffOpt.ObjectiveFunctionAttribute{…}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{…}, ::Float64)

Closest candidates are:
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::Union{MathOptInterface.AbstractModelAttribute, MathOptInterface.AbstractOptimizerAttribute}, ::Any; error_if_supported)
   @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\attributes.jl:580
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::MathOptInterface.Bridges.Objective.SlackBridgePrimalDualStart, ::MathOptInterface.Bridges.Objective.AbstractBridge, ::Nothing)
   @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\Objective\bridges\slack.jl:202
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::DiffOpt.ObjectiveSlackGapPrimalStart, ::Any)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:45
  ...

Stacktrace:
 [1] set(::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, ::DiffOpt.ObjectiveFunctionAttribute{…}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{…}, ::Float64)
   @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\attributes.jl:553
 [2] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ObjectiveFunctionAttribute{…}, value::Float64)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:90
 [3] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, attr::DiffOpt.ObjectiveDualStart, value::Float64)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:114
 [4] _copy_dual(dest::MathOptInterface.Bridges.LazyBridgeOptimizer{…}, src::MathOptInterface.Utilities.CachingOptimizer{…}, index_map::MathOptInterface.Utilities.IndexMap)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:176
 [5] _diff(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}})
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\moi_wrapper.jl:627
 [6] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}})
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\moi_wrapper.jl:547
 [7] forward_differentiate!(model::ParametricOptInterface.Optimizer{Float64, DiffOpt.Optimizer{…}})
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\parameters.jl:302
 [8] forward_differentiate!(model::Model)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:307

If we switch to

    @objective(model, Min, 3 * norm_2)

it works, so it looks like the problem is a VariableIndex-typed objective,
but it only happens if there is an SOC constraint.

Found a MWE:

    model = Model(
        () -> DiffOpt.diff_optimizer(
            SCS.Optimizer;
            with_parametric_opt_interface = true,
        ),
    )
    MOI.set(model, DiffOpt.ModelConstructor(), DiffOpt.ConicProgram.Model)
    set_silent(model)
    @variable(model, x)
    @variable(model, p in Parameter(3.0))
    @constraint(model, cons, x >= 3 * p)

    @objective(model, Min, x)
    optimize!(model)
    DiffOpt.set_forward_parameter(model, p, 1)
    DiffOpt.forward_differentiate!(model)

The issue is the conic backend + a VariableIndex objective,
which leads to the full error:

julia> show(err)
1-element ExceptionStack:
MethodError: no method matching throw_set_error_fallback(::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, ::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{Float64, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.VariableIndex}, ::Float64)

Closest candidates are:
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::Union{MathOptInterface.AbstractModelAttribute, MathOptInterface.AbstractOptimizerAttribute}, ::Any; error_if_supported)
   @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\attributes.jl:580
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::MathOptInterface.Bridges.Objective.SlackBridgePrimalDualStart, ::MathOptInterface.Bridges.Objective.AbstractBridge, ::Nothing)
   @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\Bridges\Objective\bridges\slack.jl:202
  throw_set_error_fallback(::MathOptInterface.ModelLike, ::DiffOpt.ObjectiveSlackGapPrimalStart, ::Any)
   @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:45
  ...

Stacktrace:
  [1] set(::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, ::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, ::MathOptInterface.Bridges.Objective.FunctionConversionBridge{Float64, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.VariableIndex}, ::Float64)
    @ MathOptInterface C:\JG\Julia\packages\MathOptInterface\gLl4d\src\attributes.jl:553
  [2] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, attr::DiffOpt.ObjectiveFunctionAttribute{DiffOpt.ObjectiveDualStart, MathOptInterface.VariableIndex}, value::Float64)
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:90
  [3] set(b::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, attr::DiffOpt.ObjectiveDualStart,
 value::Float64)
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:114
  [4] _copy_dual(dest::MathOptInterface.Bridges.LazyBridgeOptimizer{DiffOpt.ConicProgram.Model}, src::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}, index_map::MathOptInterface.Utilities.IndexMap)
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\copy_dual.jl:176
  [5] _diff(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\moi_wrapper.jl:627
  [6] forward_differentiate!(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\moi_wrapper.jl:547
  [7] forward_differentiate!(model::ParametricOptInterface.Optimizer{Float64, DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\parameters.jl:302
  [8] forward_differentiate!
    @ C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:337 [inlined]
  [9] forward_differentiate!(model::MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{ParametricOptInterface.Optimizer{Float64, DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{MathOptInterface.Utilities.CachingOptimizer{SCS.Optimizer, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}}}}}, MathOptInterface.Utilities.UniversalFallback{MathOptInterface.Utilities.Model{Float64}}})
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:322
 [10] forward_differentiate!(model::Model)
    @ DiffOpt C:\JG\Julia\dev\DiffOpt\src\jump_moi_overloads.jl:307
 [11] top-level scope
    @ REPL[226]:1
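
Until the bridge path is fixed, a workaround consistent with the observation above (3 * norm_2 works) is to force the objective to be a ScalarAffineFunction so that no FunctionConversionBridge is needed. An illustrative sketch for the MWE:

```julia
# Workaround sketch: multiplying by 1.0 turns the bare VariableIndex objective
# into a ScalarAffineFunction, sidestepping the FunctionConversionBridge.
@objective(model, Min, 1.0 * x)
```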

@joaquimg
Member Author

@blegat,
in the above,

looking at MathOptInterface.Bridges.Objective.FunctionConversionBridge{Float64, MathOptInterface.ScalarAffineFunction{Float64}, MathOptInterface.VariableIndex},
it seems the bridge is converting VariableIndex to ScalarAffineFunction, which is not a conversion mentioned in the docs of that bridge.

So the error might be in where the bridge is being triggered?

If I add:

function MOI.set(
    model::MOI.ModelLike,
    ::ObjectiveFunctionAttribute{ObjectiveDualStart},
    b::MOI.Bridges.Objective.FunctionConversionBridge,
    value,
)
    # no-op: silently drop the dual start instead of erroring
    return
end

function MOI.set(
    model::MOI.ModelLike,
    ::ObjectiveFunctionAttribute{ObjectiveSlackGapPrimalStart},
    b::MOI.Bridges.Objective.FunctionConversionBridge,
    value,
)
    # no-op: silently drop the primal start instead of erroring
    return
end

the error goes away, but I am not sure whether this is correct.

@blegat
Member

blegat commented Jan 7, 2025

Maybe we need to start reviving #253; otherwise we'll spend time on a fix that will need to be updated once we merge it.

@joaquimg
Member Author

joaquimg commented Jan 7, 2025

Agreed!

@blegat
Member

blegat commented Jan 8, 2025

Let me know if rebasing on master fixed the error

@joaquimg
Member Author

The bridging errors were fixed by #253.


codecov bot commented Jan 12, 2025

Codecov Report

Attention: Patch coverage is 88.28829% with 39 lines in your changes missing coverage. Please review.

Project coverage is 86.77%. Comparing base (07478f8) to head (35cd234).
Report is 4 commits behind head on master.

Files with missing lines    Patch %   Lines
src/parameters.jl           94.48%    16 Missing ⚠️
src/jump_moi_overloads.jl   50.00%     9 Missing ⚠️
src/utils.jl                 0.00%     9 Missing ⚠️
src/bridges.jl               0.00%     5 Missing ⚠️
Additional details and impacted files
@@            Coverage Diff             @@
##           master     #262      +/-   ##
==========================================
+ Coverage   85.38%   86.77%   +1.39%     
==========================================
  Files          12       13       +1     
  Lines        1156     1474     +318     
==========================================
+ Hits          987     1279     +292     
- Misses        169      195      +26     

☔ View full report in Codecov by Sentry.
📢 Have feedback on the report? Share it here.

@joaquimg joaquimg changed the title [WIP] Integrate with POI to improve UX Integrate with POI to improve UX Jan 12, 2025
@joaquimg
Member Author

Not a WIP anymore, this is ready for review.

@blegat
Member

blegat commented Jan 13, 2025

I'd say set_forward_parameter is more explicit than set_forward. We wouldn't need the _parameter suffix if the fact that the variable is a parameter were part of the method's signature, but here the type of the variable is VariableRef, and the fact that it's a parameter is only due to the state of the optimizer, so set_forward_parameter seems appropriate. On the other hand, we don't support this for anything other than parameters, so why be so explicit? set_forward may be enough.
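
For reference, either spelling would presumably be a thin wrapper over the attribute API already used in the MWEs above. An illustrative sketch only (the function name is hypothetical, and whether the raw number or a MOI.Parameter-wrapped value is expected changed during this PR; see the "use Parameter set" commit):

```julia
# Illustrative sketch: how a convenience setter could map onto the
# attribute-based API used elsewhere in this thread.
function set_forward_parameter_sketch(model, variable, value)
    return MOI.set(
        model,
        DiffOpt.ForwardConstraintSet(),
        ParameterRef(variable),   # the MOI.Parameter constraint on `variable`
        MOI.Parameter(value),     # assumption: value wrapped in the Parameter set
    )
end
```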

joaquimg and others added 2 commits January 13, 2025 09:49
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
@joaquimg joaquimg requested a review from blegat January 14, 2025 06:37
@joaquimg joaquimg merged commit 6dd82d6 into master Jan 31, 2025
10 checks passed
@joaquimg joaquimg deleted the jg/parameters branch January 31, 2025 23:58
andrewrosemberg pushed a commit to andrewrosemberg/DiffOpt.jl that referenced this pull request Feb 5, 2025
* [WIP] Integrate with POI to improve UX

* add missing import

* temp change to proj toml

* format

* simplify method setting to sue model constructor

* add possible fix to scalarize bridge error

* add pkg to project

* format

* improvements

* remove jump wrapper

* clean tests

* fix readme

* use intermediary API

* format

* Apply suggestions from code review

Co-authored-by: Benoît Legat <benoit.legat@gmail.com>

* add suggestion

* use Parameter set

* todo was fixed

* format

* update docs for newer Flux

* format

* kwargs

* remove diff model

* suggestions

* format

* fix examples

---------

Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
joaquimg added a commit that referenced this pull request Feb 21, 2025
* NonLinearProgram

* index by MOI index

* only cache gradient

* update API

* start reverse mode

* add overloads

* update MOI wrapper

* update code for DiffOpt API

* working code

* usage example

* add reverse diff

* update code

* update tests

* update tests

* add forward_differentiate! tests

* add reverse_differentiate! tests

* update docs

* format

* update API reference

* fix typos

* update reference

* update spdiagm

* Typo "acutal" to "actual" (#258)

Correcting typo "acutal" to "actual"

* Fix GitHub actions badge in README (#263)

* Implement MOI.Utilities.scalar_type for (Matrix|Sparse)VectorAffineFunction (#264)

* Use SlackBridgePrimalDualStart (#253)

* Use SlackBridgePrimalDualStart

* Update src/copy_dual.jl

* Remove test_broken

* Add supports

* Add comment

* Move to AbstractModel

* Integrate with POI to improve UX (#262)


* Add error for missing starting value (#269)

* update API

* expose kwargs

* restrict hessian type

* reverse wrong change

* update usage

* fix mad merge

* fix typo

* fix typo

* fix wrong index

* reverse index

* allow user to just set relevat sensitivities

* fix copy reverse sensitivity dual

* format

* update tests

* format

* update docs

* extend parameter @test_throws tests for NLP

* update comments

* update private api: _add_leq_geq

* fix typo

* continue fix typo check asserts

* expose factorization through as MOI.AbstractModelAttribute

* add tests factorization

* add comment

* rm rm kwargs

* use correct underscore signature for private funcs

* format

* change github actions to v3

* reverse checkout version

* add reference sipopt paper

* update factorization routine API

* format

* Update ci.yml

* improve coverage

* add test inertia correction

* add test ReverseConstraintDual

* format

* rm useless checks

* add test get ReverseConstraintDual

* format

* rm unecessary funcs

* rm kwargs

* format

* rename factorization attributte

* add supports

* Apply suggestions from code review

---------

Co-authored-by: mzagorowska <7868389+mzagorowska@users.noreply.github.com>
Co-authored-by: Oscar Dowson <odow@users.noreply.github.com>
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
Co-authored-by: Joaquim <joaquimdgarcia@gmail.com>
Successfully merging this pull request may close these issues.

Resetting differentiation input in-between differentiations