Integrate with POI to improve UX #262
Conversation
Last commit is an attempt to solve the error raised by:

```julia
using JuMP, DiffOpt, HiGHS

b = [1.0, 2.0]
m = Model(
    () -> DiffOpt.diff_optimizer(
        HiGHS.Optimizer;
        with_parametric_opt_interface = true,
    ),
)
@variable(m, x[1:2] >= 0)
@variable(m, c[1:2] in MOI.Parameter.(b))
@constraint(m, con, x <= c)
@objective(m, Max, sum(x))
optimize!(m)
MOI.set(m, DiffOpt.ReverseVariablePrimal(), m[:x][1], 1.0)
DiffOpt.reverse_differentiate!(m)
```
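For context, once `reverse_differentiate!` succeeds, the parameter sensitivities should be retrievable through the `ReverseConstraintSet` attribute on the parameters. A sketch of what that query could look like (an assumption based on this PR's `(Forward/Reverse)ConstraintSet` support for `MOI.Parameter`s, not output from a run):

```julia
# Sketch: read back the sensitivity with respect to each parameter
# after reverse_differentiate!. Assumes `m` and `c` from the snippet above.
for ci in c
    # the returned value is an MOI.Parameter wrapping the sensitivity
    sens = MOI.get(m, DiffOpt.ReverseConstraintSet(), ParameterRef(ci))
    println(ci, " => ", sens.value)
end
```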
This still fails:

```julia
using JuMP, DiffOpt, SCS, LinearAlgebra

num_A = 2
##### SecondOrderCone #####
_x_hat = rand(num_A)
μ = rand(num_A) * 10
Σ_12 = rand(num_A, num_A)
Σ = Σ_12 * Σ_12' + 0.1 * I
γ = 1.0
model = direct_model(
    DiffOpt.diff_optimizer(
        SCS.Optimizer;
        with_parametric_opt_interface = true,
    ),
)
# model = direct_model(POI.Optimizer(DiffOpt.diff_optimizer(SCS.Optimizer)))
set_silent(model)
@variable(model, x[1:num_A])
@variable(model, x_hat[1:num_A] in MOI.Parameter.(_x_hat))
@variable(model, norm_2)
# (x - x_hat)^T Σ^-1 (x - x_hat) <= γ
@constraint(
    model,
    (x - μ)' * inv(Σ) * (x - μ) <= γ,
)
# norm_2 >= ||x - x_hat||_2
@constraint(model, [norm_2; x - x_hat] in SecondOrderCone())
@objective(model, Min, norm_2)
optimize!(model)
# MOI.set.(model, POI.ForwardParameter(), x_hat, ones(num_A))
MOI.set.(model, DiffOpt.ForwardConstraintSet(), ParameterRef.(x_hat), 1)
DiffOpt.forward_differentiate!(model)  # ERROR
```
If we switch to `@objective(model, Min, 3 * norm_2)` it works, so it looks like a `VariableIndex`-typed objective problem. Found a MWE:

```julia
model = Model(
    () -> DiffOpt.diff_optimizer(
        SCS.Optimizer;
        with_parametric_opt_interface = true,
    ),
)
MOI.set(model, DiffOpt.ModelConstructor(), DiffOpt.ConicProgram.Model)
set_silent(model)
@variable(model, x)
@variable(model, p in Parameter(3.0))
@constraint(model, cons, x >= 3 * p)
@objective(model, Min, x)
optimize!(model)
DiffOpt.set_forward_parameter(model, p, 1)
DiffOpt.forward_differentiate!(model)
```

The issue is the conic backend + a `VariableIndex` objective.
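Since scaling the objective avoids the error, one possible workaround (an assumption based on the `3 * norm_2` observation above, not a confirmed fix) is to force the objective into an affine function so the conic backend never receives a bare `VariableIndex` objective:

```julia
# Workaround sketch: multiplying by 1.0 makes JuMP build a
# ScalarAffineFunction objective instead of a VariableIndex one,
# mirroring why `3 * norm_2` worked in the example above.
@objective(model, Min, 1.0 * x)
```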
@blegat, looking at the code, the error might be where the bridge is being triggered? If I add the change, the error goes away, but I am not sure if this is correct.
Maybe we need to start reviving #253, otherwise we'll spend time on a fix that will need to be updated once we merge it.

Agreed!

Let me know if rebasing on master fixed the error.

Bridging errors were fixed by #253.
Codecov Report. Attention: patch coverage is …. Additional details and impacted files:

```
@@            Coverage Diff             @@
##           master     #262      +/-   ##
==========================================
+ Coverage   85.38%   86.77%   +1.39%
==========================================
  Files          12       13       +1
  Lines        1156     1474     +318
==========================================
+ Hits          987     1279     +292
- Misses        169      195      +26
```
Not a WIP anymore, this is ready for review.

I'd say …
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
Squashed commits:

* [WIP] Integrate with POI to improve UX
* add missing import
* temp change to proj toml
* format
* simplify method setting to use model constructor
* add possible fix to scalarize bridge error
* add pkg to project
* format
* improvements
* remove jump wrapper
* clean tests
* fix readme
* use intermediary API
* format
* Apply suggestions from code review
* add suggestion
* use Parameter set
* todo was fixed
* format
* update docs for newer Flux
* format
* kwargs
* remove diff model
* suggestions
* format
* fix examples

Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
Related commits merged on master alongside this PR:

* NonLinearProgram: forward and reverse differentiation, MOI wrapper, tests, and docs
* Typo "acutal" to "actual" (#258)
* Fix GitHub actions badge in README (#263)
* Implement MOI.Utilities.scalar_type for (Matrix|Sparse)VectorAffineFunction (#264)
* Use SlackBridgePrimalDualStart (#253)
* Integrate with POI to improve UX (#262), squashed commits as listed above
* Add error for missing starting value (#269)
* expose factorization through a MOI.AbstractModelAttribute, with tests and an updated factorization routine API
* add tests for inertia correction and ReverseConstraintDual
* add reference to the sIpopt paper

Co-authored-by: mzagorowska <7868389+mzagorowska@users.noreply.github.com>
Co-authored-by: Oscar Dowson <odow@users.noreply.github.com>
Co-authored-by: Benoît Legat <benoit.legat@gmail.com>
Co-authored-by: Joaquim <joaquimdgarcia@gmail.com>
Add multiple improvements to DiffOpt leading to a massive UX improvement:

* `with_parametric_opt_interface = true`, using the method from "Prepare integration with DiffOpt" (ParametricOptInterface.jl#161) (maybe switch to `model_constructor`)
* `empty_input_sensitivities!` (closes "Resetting differentiation input in-between differentiations" #259)

About the integration with POI:

* `with_parametric_opt_interface` is false by default, for now. I want to make it true.
* `(Forward/Reverse)ConstraintSet` for `MOI.Parameter`s is supported (`ObjectiveFunction` and `ConstraintFunction` are not supported).
* `ConstraintSet` only works for `MOI.Parameter`s. We can add `EqualTo` etc. in a future PR.

TODO:
* Decide on `set_forward_parameter`, `set_forward`, or only `MOI. ...` (now tracked in "Add a JuMP-like syntax to diff wrt parameters" #271). The first two make total sense for JuMP users; the MOI method will remain in the code for MOI users and extensions. I am strongly in favor of a JuMP-like method.
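To make that TODO concrete, the candidate spellings for seeding a forward sensitivity on a parameter `p` would look roughly like this (a sketch: `set_forward_parameter` and the MOI attribute both appear earlier in this thread, while `set_forward` is only a proposed, hypothetical name):

```julia
# Candidate 1: JuMP-like helper, as used in the conic MWE above
DiffOpt.set_forward_parameter(model, p, 1.0)

# Candidate 2: hypothetical shorter name, not implemented
# DiffOpt.set_forward(model, p, 1.0)

# Candidate 3: raw MOI attribute, which would stay available for
# MOI users and extensions (pattern from the SCS snippet above)
MOI.set(model, DiffOpt.ForwardConstraintSet(), ParameterRef(p), 1.0)
```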