
Remove unnecessary calls to Base.eval #243

Status: Closed (3 commits)
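Editor's note on the recurring change in this PR: a stored comparison `Symbol` that was previously turned into a function with `eval` at each call site is replaced by a comparison function chosen up front. A minimal sketch of the before/after pattern (illustrative only; `sense`, `convertor`, and `is_tighter` mirror the names in the diff below, but the snippet itself is not taken verbatim from the source):

```julia
using MathOptInterface
const MOI = MathOptInterface

sense = MOI.MAX_SENSE

# Before: store a Symbol and `eval` it into a function at every comparison.
convertor = Dict(MOI.MAX_SENSE => :<, MOI.MIN_SENSE => :>)
eval(convertor[sense])(1.0, 2.0)          # true, but resolves the symbol at runtime

# After: pick the comparison function once; no call to Base.eval is needed.
is_tighter = ifelse(sense == MOI.MAX_SENSE, <, >)
is_tighter(1.0, 2.0)                      # true, same result
```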
22 changes: 12 additions & 10 deletions src/bounding_model.jl
@@ -3,10 +3,10 @@

Set up a MILP bounding model based on the variable domain partitioning information stored in `use_disc`.
By default, if `use_disc` is not provided, it uses `m.discretizations` stored in the Alpine model.
The basic idea of this MILP bounding model is to use piecewise polyhedral/convex relaxations to tighten the
basic relaxations of the original non-convex region. Among all presented partitions, the bounding model
will choose one specific partition as the lower bound solution. The more partitions there are, the tighter
the bounding model relaxes the original MINLP, but the more effort is required to solve this MILP.
"""
function create_bounding_mip(m::Optimizer; use_disc = nothing)
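To make the "more partitions give a tighter relaxation" claim in the docstring concrete, here is a small editor's sketch (plain Julia, not Alpine code) using the simplest case, the secant overestimator of y = x^2: splitting the domain shrinks the worst-case relaxation gap.

```julia
# For y = x^2 on [l, u], the tightest linear overestimator is the secant through
# (l, l^2) and (u, u^2); its worst-case gap over the interval is (u - l)^2 / 4.
secant_gap(l, u) = (u - l)^2 / 4

secant_gap(0.0, 4.0)                              # 4.0 with a single partition
max(secant_gap(0.0, 2.0), secant_gap(2.0, 4.0))   # 1.0 after splitting at x = 2
```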
@@ -196,7 +196,11 @@ function amp_post_lifted_objective(m::Optimizer)

#if isa(m.obj_expr_orig, Number)
if expr_isconst(m.obj_expr_orig)
JuMP.@objective(m.model_mip, m.sense_orig, eval(m.obj_expr_orig))
if m.obj_expr_orig == :(+())
JuMP.@objective(m.model_mip, m.sense_orig, 0.0)
else
JuMP.@objective(m.model_mip, m.sense_orig, eval(m.obj_expr_orig))
end
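# Editor's note (assumption from standard Julia semantics, not verified against
# this repo): `:(+())` is the empty-sum expression a constant objective can
# parse to, and `+` has no zero-argument method, so `eval(:(+()))` would raise
# a MethodError; the added guard returns an explicit 0.0 objective instead.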
elseif m.obj_structure == :affine
JuMP.@objective(
m.model_mip,
@@ -235,7 +239,6 @@ function add_partition(m::Optimizer; kwargs...)
point_vec = m.best_bound_sol

if isa(Alp.get_option(m, :disc_add_partition_method), Function)
# m.discretization = eval(Alp.get_option(m, :disc_add_partition_method))(m, use_disc=discretization, use_solution=point_vec)
m.discretization = Alp.get_option(m, :disc_add_partition_method)(
m,
use_disc = discretization,
@@ -263,7 +266,7 @@ end

A built-in method used to add a new partition on feasible domains of variables chosen for partitioning.

This can be illustrated by the following example. Let the previous iteration's partition vector on
variable "x" be given by [0, 3, 7, 9], and suppose the lower bounding solution has a value of 4 for variable "x".
In the case when `partition_scaling_factor = 4`, this function creates the new partition vector as follows: [0, 3, 3.5, 4, 4.5, 7, 9]
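An editor's sketch of the refinement described above (illustrative only; `add_partition_sketch` is a hypothetical helper, not the Alpine implementation, and assumes the inserted sub-interval has half-width `(hi - lo) / (2 * partition_scaling_factor)`):

```julia
function add_partition_sketch(partition::Vector{Float64}, x_sol::Float64, scaling_factor::Real)
    idx = findlast(p -> p <= x_sol, partition)        # active interval containing x_sol
    lo, hi = partition[idx], partition[idx+1]
    radius = (hi - lo) / (2 * scaling_factor)         # half-width of the new sub-interval
    new_points = [max(lo, x_sol - radius), x_sol, min(hi, x_sol + radius)]
    return sort(unique(vcat(partition, new_points)))
end

add_partition_sketch([0.0, 3.0, 7.0, 9.0], 4.0, 4)    # [0.0, 3.0, 3.5, 4.0, 4.5, 7.0, 9.0]
```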

@@ -444,8 +447,7 @@ function update_partition_scaling_factor(m::Optimizer, presolve = false)
m.logs[:n_iter] > 2 && return Alp.get_option(m, :partition_scaling_factor) # Stop branching after the second iteration

ratio_pool = [8:2:20;] # Built-in try range
convertor = Dict(MOI.MAX_SENSE => :<, MOI.MIN_SENSE => :>)
# revconvertor = Dict(MOI.MAX_SENSE => :>, MOI.MIN_SENSE => :<)
is_tighter = ifelse(m.sense_orig == MOI.MAX_SENSE, <, >)

incumb_ratio = ratio_pool[1]
Alp.is_min_sense(m) ? incumb_res = -Inf : incumb_res = Inf
@@ -472,7 +474,7 @@ function update_partition_scaling_factor(m::Optimizer, presolve = false)
Alp.create_bounding_mip(m, use_disc = branch_disc)
res = Alp.disc_branch_solve(m)
push!(res_collector, res)
if eval(convertor[m.sense_orig])(res, incumb_res) # && abs(abs(collector[end]-res)/collector[end]) > 1e-1 # %1 of difference
if is_tighter(res, incumb_res) # && abs(abs(collector[end]-res)/collector[end]) > 1e-1 # %1 of difference
incumb_res = res
incumb_ratio = r
end
18 changes: 11 additions & 7 deletions src/log.jl
@@ -123,16 +123,16 @@ function logging_summary(m::Optimizer)
end

function logging_head(m::Optimizer)
if Alp.is_min_sense(m)
printstyled("LOWER-BOUNDING ITERATIONS", color = :cyan, bold = true)
UB_iter = "Incumbent"
UB = "Best Incumbent"
LB = "Lower Bound"
elseif Alp.is_max_sense(m)
if Alp.is_max_sense(m)
printstyled("UPPER-BOUNDING ITERATIONS", color = :cyan, bold = true)
UB_iter = "Incumbent"
UB = "Best Incumbent"
LB = "Upper Bound"
else
printstyled("LOWER-BOUNDING ITERATIONS", color = :cyan, bold = true)
UB_iter = "Incumbent"
UB = "Best Incumbent"
LB = "Lower Bound"
end
println(
"\n====================================================================================================",
@@ -159,7 +159,11 @@ function logging_row_entry(m::Optimizer; kwargs...)
UB_block = string(" ", objstr, " "^spc)

if expr_isconst(m.obj_expr_orig)
bdstr = eval(m.obj_expr_orig)
bdstr = if m.obj_expr_orig == :(+())
0.0
else
eval(m.obj_expr_orig)
end
spc = b_len - length(bdstr)
elseif isa(m.logs[:bound][end], Float64)
bdstr = string(round(m.logs[:bound][end]; digits = 4))
6 changes: 2 additions & 4 deletions src/main_algorithm.jl
@@ -308,7 +308,6 @@ function check_exit(m::Optimizer)
# constant objective with feasible local solve check
if Alp.expr_isconst(m.obj_expr_orig) &&
(m.status[:local_solve] == MOI.OPTIMAL || m.status == MOI.LOCALLY_SOLVED)
# m.best_bound = eval(m.obj_expr_orig)
m.best_bound = m.obj_expr_orig
m.best_rel_gap = 0.0
m.best_abs_gap = 0.0
@@ -522,7 +521,7 @@ It solves the problem built upon a piecewise convexification based on the discretization.
See `create_bounding_mip` for more details of the problem solved here.
"""
function bounding_solve(m::Optimizer)
convertor = Dict(MOI.MAX_SENSE => :<, MOI.MIN_SENSE => :>)
is_tighter = ifelse(m.sense_orig == MOI.MAX_SENSE, <, >)

# Updates time metric and the termination bounds
Alp.set_mip_time_limit(m)
@@ -554,7 +553,7 @@ function bounding_solve(m::Optimizer)
)+1] = copy(candidate_bound_sol) # Requires proper offsetting
end
push!(m.logs[:bound], candidate_bound)
if eval(convertor[m.sense_orig])(candidate_bound, m.best_bound)
if is_tighter(candidate_bound, m.best_bound)
m.best_bound = candidate_bound
m.best_bound_sol = copy(candidate_bound_sol)
m.status[:bounding_solve] = status
@@ -600,7 +599,6 @@ function pick_disc_vars(m::Optimizer)
disc_var_pick = Alp.get_option(m, :disc_var_pick)

if isa(disc_var_pick, Function)
# eval(Alp.get_option(m, :disc_var_pick))(m)
disc_var_pick(m)
length(m.disc_vars) == 0 &&
length(m.nonconvex_terms) > 0 &&
23 changes: 11 additions & 12 deletions src/multilinear.jl
@@ -323,18 +323,17 @@ end

function amp_warmstart_α(m::Optimizer, α::Dict)
d = m.discretization

is_better = ifelse(m.sense_orig == MOI.MIN_SENSE, <, >)
if m.bound_sol_pool[:cnt] >= 2 # can only warm-start the problem when pool is large enough
ws_idx = -1
Alp.is_min_sense(m) ? ws_obj = Inf : ws_obj = -Inf
comp_opr = Dict(MOI.MIN_SENSE => :<, MOI.MAX_SENSE => :>)

# Search for the pool for incumbent warm starter
for i in 1:m.bound_sol_pool[:cnt]
m.bound_sol_pool[:stat][i] == :Warmstarter &&
(m.bound_sol_pool[:stat][i] = :Alive) # reset the status if not dead
if m.bound_sol_pool[:stat][i] != :Dead &&
eval(comp_opr[m.sense_orig])(m.bound_sol_pool[:obj][i], ws_obj)
is_better(m.bound_sol_pool[:obj][i], ws_obj)
ws_idx = i
ws_obj = m.bound_sol_pool[:obj][i]
end
@@ -625,21 +624,21 @@ end
_add_multilinear_linking_constraints(m::Optimizer, λ::Dict)

This internal function adds linking constraints between λ multipliers corresponding to multilinear terms
that share more than two variables and are partitioned. For example, suppose we have λ[i], λ[j], and
λ[k] where i=(1,2,3), j=(1,2,4), and k=(1,2,5). λ[i] contains all multipliers for the extreme points
in the space of (x1,x2,x3). λ[j] contains all multipliers for the extreme points in the space of (x1,x2,x4).
λ[k] contains all multipliers for the extreme points in the space of (x1,x2,x5).

Using λ[i], λ[j], or λ[k], we can express multilinear function x1*x2.
We define a linking variable μ(1,2) that represents the value of x1*x2.
Linking constraints are
μ(1,2) == convex combination expr for x1*x2 using λ[i],
μ(1,2) == convex combination expr for x1*x2 using λ[j], and
μ(1,2) == convex combination expr for x1*x2 using λ[k].

Thus, these constraints link between λ[i], λ[j], and λ[k] variables.

Reference: J. Kim, J.P. Richard, M. Tawarmalani, Piecewise Polyhedral Relaxations of Multilinear Optimization,
http://www.optimization-online.org/DB_HTML/2022/07/8974.html
"""
function _add_multilinear_linking_constraints(m::Optimizer, λ::Dict)
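As a concrete illustration of the linking idea in the docstring above, here is an editor's sketch in plain JuMP (the variable and data layout is invented for the example and does not match Alpine's internal `λ` dictionary): two λ blocks over different variable boxes are forced to agree on the value of the shared product x1*x2 through a common variable μ12.

```julia
using JuMP

corners(bounds) = vec(collect(Iterators.product(bounds...)))  # extreme points of a box

bounds_i = ([0.0, 1.0], [0.0, 2.0], [0.0, 3.0])   # box for (x1, x2, x3)
bounds_j = ([0.0, 1.0], [0.0, 2.0], [1.0, 4.0])   # box for (x1, x2, x4)
Vi, Vj = corners(bounds_i), corners(bounds_j)

model = Model()
@variable(model, λi[1:length(Vi)] >= 0)
@variable(model, λj[1:length(Vj)] >= 0)
@variable(model, μ12)
@constraint(model, sum(λi) == 1)
@constraint(model, sum(λj) == 1)
# Linking constraints: both convex combinations must express the same x1*x2 value.
@constraint(model, μ12 == sum(λi[k] * Vi[k][1] * Vi[k][2] for k in eachindex(Vi)))
@constraint(model, μ12 == sum(λj[k] * Vj[k][1] * Vj[k][2] for k in eachindex(Vj)))
```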
@@ -684,8 +683,8 @@ end
"""
_get_shared_multilinear_terms_info(λ, linking_constraints_degree_limit)

This function checks whether linking constraints are
necessary for a given vector of multilinear terms and returns the appropriate
linking constraints information.
"""
function _get_shared_multilinear_terms_info(
@@ -700,7 +699,7 @@
return (linking_constraints_info = nothing)
end

# Limit the linking constraints to a prescribed multilinear degree
if !isnothing(linking_constraints_degree_limit) &&
(linking_constraints_degree_limit < max_degree)
max_degree = linking_constraints_degree_limit
47 changes: 24 additions & 23 deletions src/presolve.jl
@@ -18,7 +18,6 @@ function bound_tightening_wrapper(m::Optimizer; use_bound = true, kwargs...)
elseif Alp.get_option(m, :presolve_bt_algo) == 2
Alp.optimization_based_bound_tightening(m, use_bound = use_bound, use_tmc = true)
elseif isa(Alp.get_option(m, :presolve_bt_algo), Function)
# eval(Alp.get_option(m, :presolve_bt_algo))(m)
Alp.get_option(m, :presolve_bt_algo)(m)
else
error("Unrecognized bound tightening algorithm")
@@ -219,7 +218,7 @@ function optimization_based_bound_tightening(
bound_max_reduction = (max_reduction > improv_tol)
bound_max_width = (max_width > width_tol)

# Deactivate this termination criterion if it slows down the OBBT convergence
stats = Alp.relaxation_model_obbt(m, discretization, bound)
if Alp.is_min_sense(m)
current_rel_gap = Alp.eval_opt_gap(m, stats["relaxed_obj"], bound)
@@ -307,19 +306,16 @@ end

function relaxation_model_obbt(m::Optimizer, discretization, bound::Number)
Alp.create_obbt_model(m, discretization, bound)

obj_expr = sum(
m.bounding_obj_mip[:coefs][j] *
_index_to_variable_ref(m.model_mip, m.bounding_obj_mip[:vars][j].args[2]) for
j in 1:m.bounding_obj_mip[:cnt]
sense = Alp.is_max_sense(m) ? MOI.MAX_SENSE : MOI.MIN_SENSE
JuMP.@objective(
m.model_mip,
sense,
sum(
m.bounding_obj_mip[:coefs][j] *
_index_to_variable_ref(m.model_mip, m.bounding_obj_mip[:vars][j].args[2]) for
j in 1:m.bounding_obj_mip[:cnt]
),
)

if Alp.is_min_sense(m)
JuMP.@objective(m.model_mip, Min, obj_expr)
elseif Alp.is_max_sense(m)
JuMP.@objective(m.model_mip, Max, obj_expr)
end

return Alp.solve_obbt_model(m)
end

@@ -354,7 +350,11 @@ function solve_obbt_model(m::Optimizer; kwargs...)
status = MOI.get(m.model_mip, MOI.TerminationStatus())

stats["status"] = status
stats["relaxed_obj"] = JuMP.objective_value(m.model_mip)
stats["relaxed_obj"] = try
JuMP.objective_value(m.model_mip)
catch
NaN
end

cputime_solve = time() - start_solve
m.logs[:total_time] += cputime_solve
Expand All @@ -367,18 +367,19 @@ end
"""
post_objective_bound(m::Optimizer, bound::Float64; kwargs...)

This function adds the upper/lower bounding constraint on the objective function
for the optimization models solved within the OBBT algorithm.
"""
function post_objective_bound(m::Optimizer, bound::Number; kwargs...)
obj_expr = sum(
m.bounding_obj_mip[:coefs][j] *
_index_to_variable_ref(m.model_mip, m.bounding_obj_mip[:vars][j].args[2]) for
j in 1:m.bounding_obj_mip[:cnt]
obj_expr = JuMP.@expression(
m.model_mip,
sum(
m.bounding_obj_mip[:coefs][j] *
_index_to_variable_ref(m.model_mip, m.bounding_obj_mip[:vars][j].args[2]) for
j in 1:m.bounding_obj_mip[:cnt]
),
)

obj_bound_tol = Alp.get_option(m, :presolve_bt_obj_bound_tol)

if Alp.is_max_sense(m)
JuMP.@constraint(
m.model_mip,
6 changes: 3 additions & 3 deletions src/utility.jl
@@ -114,9 +114,9 @@ discretization_to_bounds(d::Dict, l::Int) = Alp.update_var_bounds(d, len = l)
Update the data structure with a feasible solution and its associated objective value (if better)
"""
function update_incumbent(m::Optimizer, objval::Float64, sol::Vector)
convertor = Dict(MOI.MAX_SENSE => :>, MOI.MIN_SENSE => :<)
push!(m.logs[:obj], objval)
if eval(convertor[m.sense_orig])(objval, m.best_obj) #&& !eval(convertor[m.sense_orig])(objval, m.best_bound)
is_better = ifelse(m.sense_orig == MOI.MAX_SENSE, >, <)
if is_better(objval, m.best_obj)
m.best_obj = objval
m.best_sol = sol
m.detected_incumbent = true
@@ -535,7 +535,7 @@ function resolve_lifted_var_value(m::Optimizer, sol_vec::Array)
return sol_vec
end

# Unused functions
# function amp_post_λ_upperbound(
# m::Optimizer,
# λ::Dict,
17 changes: 17 additions & 0 deletions test/test_solver.jl
@@ -342,3 +342,20 @@ end
alpine = JuMP.backend(m).optimizer.model
@test !(:Hess in Alpine.features_available(alpine))
end

@testset "FEASIBILITY_PROBLEM" begin
model = JuMP.Model(
JuMP.optimizer_with_attributes(
Alpine.Optimizer,
"nlp_solver" => IPOPT,
"mip_solver" => HIGHS,
"minlp_solver" => JUNIPER,
),
)
JuMP.@variable(model, x[1:3], Bin)
JuMP.@NLconstraint(model, prod(1 + x[i] for i in 1:3) <= 2)
JuMP.@constraint(model, sum(x[i] for i in 1:3) >= 1)
JuMP.optimize!(model)
# TODO(odow): why is this infeasible?
@test JuMP.termination_status(model) isa JuMP.MOI.TerminationStatusCode
Review comment from the PR author (Collaborator):
I think so, although I wonder if I should add an objective to this test case. The feasibility issue is probably related to the FEASIBILITY_SENSE issue.
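If an objective were added as suggested, the test could, for example, include a line like the following (editor's illustration, not part of the PR):

```julia
JuMP.@objective(model, Min, sum(x))  # gives the model MIN_SENSE instead of FEASIBILITY_SENSE
```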

end