ForwardObjectiveFunction #222

Merged (1 commit), Apr 28, 2022
2 changes: 1 addition & 1 deletion docs/src/examples/autotuning-ridge.jl
@@ -109,7 +109,7 @@ function compute_dw_dα(model, w)
dw_dα = zeros(D)
MOI.set(
model,
- DiffOpt.ForwardObjective(),
+ DiffOpt.ForwardObjectiveFunction(),
dot(w, w) / (2 * D),
)
DiffOpt.forward_differentiate!(model)
2 changes: 1 addition & 1 deletion docs/src/examples/chainrules_unit.jl
@@ -167,7 +167,7 @@ function ChainRulesCore.frule(

## setting the perturbation of the linear objective
Δobj = sum(Δgen_costs ⋅ p[:,t] + Δnoload_costs ⋅ u[:,t] for t in size(p, 2))
- MOI.set(model, DiffOpt.ForwardObjective(), Δobj)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), Δobj)
DiffOpt.forward_differentiate!(JuMP.backend(model))
## querying the corresponding perturbation of the decision
Δp = MOI.get.(model, DiffOpt.ForwardVariablePrimal(), p)
10 changes: 5 additions & 5 deletions docs/src/examples/sensitivity-analysis-ridge.jl
@@ -106,12 +106,12 @@ b̂ = value(b)
# variable `x`.

# Recalling that the points $(x_i, y_i)$ appear in the objective function as:
- # `(yi - b - w*xi)^2`, the `DiffOpt.ForwardObjective` attribute must be set accordingly,
+ # `(yi - b - w*xi)^2`, the `DiffOpt.ForwardObjectiveFunction` attribute must be set accordingly,
# with the terms multiplying the parameter in the objective.
- # When considering the perturbation of a parameter θ, `DiffOpt.ForwardObjective()` takes in the expression in the
+ # When considering the perturbation of a parameter θ, `DiffOpt.ForwardObjectiveFunction()` takes in the expression in the
# objective that multiplies θ.
# If θ appears with a quadratic and a linear form: `θ^2 a x + θ b y`, then the expression to pass to
- # `ForwardObjective` is `2θ a x + b y`.
+ # `ForwardObjectiveFunction` is `2θ a x + b y`.

# Sensitivity with respect to x and y

@@ -120,7 +120,7 @@ b̂ = value(b)
for i in 1:N
MOI.set(
model,
- DiffOpt.ForwardObjective(),
+ DiffOpt.ForwardObjectiveFunction(),
2w^2 * X[i] + 2b * w - 2 * w * Y[i]
)
DiffOpt.forward_differentiate!(model)
@@ -131,7 +131,7 @@ for i in 1:N
)
MOI.set(
model,
- DiffOpt.ForwardObjective(),
+ DiffOpt.ForwardObjectiveFunction(),
(2Y[i] - 2b - 2w * X[i]),
)
DiffOpt.forward_differentiate!(model)
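The rule stated in this example (for a parameter θ appearing as `θ^2 a x + θ b y`, the expression to set is `2θ a x + b y`, i.e. the partial derivative with respect to θ) can be sanity-checked by finite differences in plain Julia. The numeric values below are hypothetical, chosen only for the check:

```julia
# Sanity check (hypothetical values): for f(θ) = θ^2*a*x + θ*b*y, the
# expression multiplying the perturbation of θ is ∂f/∂θ = 2θ*a*x + b*y.
a, b, x, y, θ = 3.0, 2.0, 1.5, 0.5, 4.0
f(t) = t^2 * a * x + t * b * y
dfdθ = 2θ * a * x + b * y
h = 1e-6
fd = (f(θ + h) - f(θ - h)) / (2h)  # central finite difference
isapprox(fd, dfdθ; atol = 1e-5)
```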
2 changes: 1 addition & 1 deletion docs/src/usage.md
@@ -52,7 +52,7 @@ grad_con = MOI.get.(model, DiffOpt.ReverseConstraintFunction(), c)
we can use the `forward_differentiate!` method with perturbations in matrices `A`, `b`, `c`:
```julia
import LinearAlgebra: ⋅
- MOI.set(model, DiffOpt.ForwardObjective(), ones(2) ⋅ x)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), ones(2) ⋅ x)
DiffOpt.forward_differentiate!(model)
grad_x = MOI.get.(model, DiffOpt.ForwardVariablePrimal(), x)
```
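The `⋅` imported in the snippet above is `LinearAlgebra.dot`; applied to `ones(2)` and the variable vector `x` it builds the affine expression `x[1] + x[2]`, i.e. a unit perturbation of each objective coefficient. A minimal sketch of the same operation on plain numeric vectors:

```julia
using LinearAlgebra: ⋅  # ⋅ is LinearAlgebra.dot

# On numeric vectors, ones(2) ⋅ v sums the entries of v; on decision
# variables the analogous call builds the affine expression x[1] + x[2].
v = [3.0, 4.0]
s = ones(2) ⋅ v
```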
2 changes: 1 addition & 1 deletion src/deprecated.jl
@@ -7,7 +7,7 @@

@deprecate ForwardOutVariablePrimal() ForwardVariablePrimal() false
@deprecate ForwardInConstraint() ForwardConstraintFunction() false
- @deprecate ForwardInObjective() ForwardObjective() false
+ @deprecate ForwardInObjective() ForwardObjectiveFunction() false

@deprecate QPForwBackCache(args...) QuadraticForwardReverseCache(args...) false
@deprecate ConicBackCache(args...) ConicReverseCache(args...) false
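The `@deprecate old new false` pattern used in this file forwards calls from the old name to the new one; the trailing `false` keeps the old binding unexported. A standalone sketch with hypothetical names:

```julia
struct NewAttr end

# Forward OldAttr() to NewAttr(); the trailing `false` means the
# deprecated name OldAttr is not exported.
@deprecate OldAttr() NewAttr() false

old = OldAttr()  # warns only when Julia runs with --depwarn=yes
old isa NewAttr
```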
10 changes: 5 additions & 5 deletions src/diff_opt.jl
@@ -57,7 +57,7 @@ function Base.empty!(cache::DiffInputCache)
end

"""
- ForwardObjective <: MOI.AbstractModelAttribute
+ ForwardObjectiveFunction <: MOI.AbstractModelAttribute

A `MOI.AbstractModelAttribute` to set input data to forward differentiation, that
is, problem input data.
@@ -68,11 +68,11 @@ quadratic models.
For instance, if the objective contains `θ * (x + 2y)`, for the purpose of
computing the derivative with respect to `θ`, the following should be set:
```julia
- MOI.set(model, DiffOpt.ForwardObjective(), 1.0 * x + 2.0 * y)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), 1.0 * x + 2.0 * y)
```
where `x` and `y` are the relevant `MOI.VariableIndex`.
"""
- struct ForwardObjective <: MOI.AbstractModelAttribute end
+ struct ForwardObjectiveFunction <: MOI.AbstractModelAttribute end

"""
ForwardConstraintFunction <: MOI.AbstractConstraintAttribute
@@ -99,7 +99,7 @@ A `MOI.AbstractVariableAttribute` to get output data from forward
differentiation, that is, problem solution.

For instance, to get the tangent of the variable of index `vi` corresponding to
- the tangents given to `ForwardObjective` and `ForwardConstraintFunction`, do the
+ the tangents given to `ForwardObjectiveFunction` and `ForwardConstraintFunction`, do the
following:
```julia
MOI.get(model, DiffOpt.ForwardVariablePrimal(), vi)
@@ -239,7 +239,7 @@ function MOI.set(model::DiffModel, ::MOI.VariablePrimalStart, vi::MOI.VariableIn
_enlarge_set(model.x, vi.value, value)
end

- function MOI.set(model::DiffModel, ::ForwardObjective, objective)
+ function MOI.set(model::DiffModel, ::ForwardObjectiveFunction, objective)
model.input_cache.objective = objective
return
end
4 changes: 2 additions & 2 deletions src/jump_moi_overloads.jl
@@ -1,8 +1,8 @@
- function MOI.set(model::JuMP.Model, attr::ForwardObjective, func::JuMP.AbstractJuMPScalar)
+ function MOI.set(model::JuMP.Model, attr::ForwardObjectiveFunction, func::JuMP.AbstractJuMPScalar)
JuMP.check_belongs_to_model(func, model)
return MOI.set(model, attr, JuMP.moi_function(func))
end
- MOI.set(model::JuMP.Model, attr::ForwardObjective, func::Number) = MOI.set(model, attr, JuMP.AffExpr(func))
+ MOI.set(model::JuMP.Model, attr::ForwardObjectiveFunction, func::Number) = MOI.set(model, attr, JuMP.AffExpr(func))

function MOI.set(model::JuMP.Model, attr::ForwardConstraintFunction, con_ref::JuMP.ConstraintRef, func::JuMP.AbstractJuMPScalar)
JuMP.check_belongs_to_model(func, model)
8 changes: 4 additions & 4 deletions src/moi_wrapper.jl
@@ -497,7 +497,7 @@ end
Wrapper method for the forward pass.
This method will consider as input a currently solved problem and
differentials with respect to problem data set with
- the [`ForwardObjective`](@ref) and [`ForwardConstraintFunction`](@ref) attributes.
+ the [`ForwardObjectiveFunction`](@ref) and [`ForwardConstraintFunction`](@ref) attributes.
The output solution differentials can be queried with the attribute
[`ForwardVariablePrimal`](@ref).
"""
@@ -508,7 +508,7 @@ function forward_differentiate!(model::Optimizer)
end
diff = _diff(model)
if model.input_cache.objective !== nothing
- MOI.set(diff, ForwardObjective(), MOI.Utilities.map_indices(model.index_map, model.input_cache.objective))
+ MOI.set(diff, ForwardObjectiveFunction(), MOI.Utilities.map_indices(model.index_map, model.input_cache.objective))
end
for (F, S) in keys(model.input_cache.scalar_constraints.dict)
_copy_forward_in_constraint(diff, model.index_map, model.index_map.con_map[F, S], model.input_cache.scalar_constraints[F, S])
@@ -578,10 +578,10 @@ function MOI.get(model::Optimizer, attr::ReverseObjectiveFunction)
model.index_map,
)
end
- function MOI.get(model::Optimizer, ::ForwardObjective)
+ function MOI.get(model::Optimizer, ::ForwardObjectiveFunction)
return model.input_cache.objective
end
- function MOI.set(model::Optimizer, ::ForwardObjective, objective)
+ function MOI.set(model::Optimizer, ::ForwardObjectiveFunction, objective)
model.input_cache.objective = objective
return
end
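The get/set pair in this file implements a simple input-cache pattern: the attribute value is stored on the wrapper and later mapped into the inner differentiable model. A stripped-down sketch of that pattern, with hypothetical stand-in types:

```julia
# Hypothetical minimal version of the input-cache pattern used by the wrapper.
mutable struct InputCache
    objective::Union{Nothing,String}
end

struct TinyOptimizer
    input_cache::InputCache
end

struct ForwardObjectiveFunction end  # stand-in attribute type

# Setting the attribute only records the value; the differentiation pass
# reads it from the cache later.
set_attr(m::TinyOptimizer, ::ForwardObjectiveFunction, obj) =
    (m.input_cache.objective = obj; nothing)
get_attr(m::TinyOptimizer, ::ForwardObjectiveFunction) = m.input_cache.objective

m = TinyOptimizer(InputCache(nothing))
set_attr(m, ForwardObjectiveFunction(), "x + 2y")
get_attr(m, ForwardObjectiveFunction())  # "x + 2y"
```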
8 changes: 4 additions & 4 deletions test/moi_wrapper.jl
@@ -661,7 +661,7 @@ function simple_psd(solver)

# test2: changing X[1], X[3] but keeping the objective (their sum) same
MOI.set(model, DiffOpt.ForwardConstraintFunction(), c, MOIU.zero_with_output_dimension(MOI.VectorAffineFunction{Float64}, 1))
- MOI.set(model, DiffOpt.ForwardObjective(), -1.0X[1] + 1.0X[3])
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), -1.0X[1] + 1.0X[3])

DiffOpt.forward_differentiate!(model)

@@ -875,7 +875,7 @@ end
MOI.optimize!(model)

# dc = ones(7)
- MOI.set(model, DiffOpt.ForwardObjective(), MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(ones(7), x), 0.0))
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), MOI.ScalarAffineFunction(MOI.ScalarAffineTerm.(ones(7), x), 0.0))
# db = ones(11)
# dA = ones(11, 7)
MOI.set(model,
Expand Down Expand Up @@ -993,7 +993,7 @@ end
dA = zeros(6, 1)
db = zeros(6)
MOI.set(model, DiffOpt.ForwardConstraintFunction(), c, MOIU.zero_with_output_dimension(VAF, 6))
- MOI.set(model, DiffOpt.ForwardObjective(), 1.0 * x)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), 1.0 * x)

DiffOpt.forward_differentiate!(model)

@@ -1146,7 +1146,7 @@ end

# test 2
MOI.set(model, DiffOpt.ForwardConstraintFunction(), c, _vaf(zeros(6)))
- MOI.set(model, DiffOpt.ForwardObjective(), 1.0 * x)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), 1.0 * x)

DiffOpt.forward_differentiate!(model)

2 changes: 1 addition & 1 deletion test/utils.jl
@@ -208,7 +208,7 @@ function qp_test(
deqf = dAf * v .- dbf

@testset "Forward pass" begin
- MOI.set(model, DiffOpt.ForwardObjective(), dobjf)
+ MOI.set(model, DiffOpt.ForwardObjectiveFunction(), dobjf)
for (j, jc) in enumerate(cle)
func = dlef[j]
canonicalize && MOI.Utilities.canonicalize!(func)