Implement generating alternatives using metaheuristics #13
Changes from 8 commits
@@ -32,7 +32,7 @@ jobs:
      fail-fast: false
      matrix:
        version:
-          - "1.6"
+          - "1.7"
          - "1"
        os:
          - ubuntu-latest
@@ -4,12 +4,15 @@ authors = ["Matthijs Arnoldus <[email protected]> and contributors"]
version = "0.1.0"

[deps]
DataStructures = "864edb3b-99cc-5e75-8d2d-829cb0a9cfe8"
Distances = "b4f34e82-e78d-54a5-968a-f98e89d6e8f7"
JuMP = "4076af6c-e467-56ae-b986-b466b2749572"
MathOptInterface = "b8f27783-ece8-5eb3-8dc8-9495eed66fee"
Metaheuristics = "bcdb8e00-2c21-11e9-3065-2b553b22f898"

[compat]
JuMP = "1"
MathOptInterface = "1"
Distances = "0.10"
-julia = "1.6"
+julia = "1.7"
Metaheuristics = "3.3"
@@ -0,0 +1,326 @@
""" | ||||||
Structure representing a problem that can be solved by Metaheuristics.jl and the algorithm to solve it. | ||||||
""" | ||||||
mutable struct MetaheuristicProblem | ||||||
objective::Function | ||||||
bounds::Matrix{Float64} | ||||||
algorithm::Metaheuristics.Algorithm | ||||||
end | ||||||
|
||||||
""" | ||||||
constraint = extract_constraint( | ||||||
constraint::MOI.ConstraintFunction, | ||||||
x::Vector{Float64}, | ||||||
index_map::Dict{Int64, Int64}, | ||||||
fixed_variables::Dict{MOI.VariableIndex, Float64} | ||||||
) | ||||||
|
||||||
Convert a constraint from a MathOptInterface function into a julia function of x. Supports only ScalarAffineFunction and VariableIndex constraints. | ||||||
|
||||||
# Arguments | ||||||
- `constraint::MOI.ConstraintFunction`: constraint transform into a julia function. | ||||||
- `x::Vector{Float64}`: a vector representing an individual in the metaheuristic population. | ||||||
- `index_map::Dict{Int64, Int64}`: a dictionary mapping indices in the MathOptInterface model to indices of `x`. | ||||||
- `fixed_variables::Dict{MOI.VariableIndex, Float64}: a dictionary containing the values of the fixed variables.` | ||||||
""" | ||||||
function extract_constraint( | ||||||
constraint::MOI.ScalarAffineFunction, | ||||||
x::Vector{Float64}, | ||||||
index_map::Dict{Int64, Int64}, | ||||||
fixed_variables::Dict{MOI.VariableIndex, Float64}, | ||||||
) | ||||||
result = 0 | ||||||
for t in constraint.terms | ||||||
if haskey(index_map, t.variable.value) | ||||||
result += t.coefficient * x[index_map[t.variable.value]] | ||||||
else | ||||||
result += t.coefficient * fixed_variables[t.variable] | ||||||
end | ||||||
end | ||||||
return result | ||||||
end | ||||||
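As a small worked example of the evaluation this performs (the constraint x_1 + x_2 >= 1 and the point x = [1.0, 2.0] are illustrative, matching the example in the review thread at the end of this page; the package's MOI import is assumed to be in scope):

    # Affine part of x_1 + x_2 >= 1: terms 1.0*x_1 + 1.0*x_2, constant handled by the set.
    f = MOI.ScalarAffineFunction(
        [
            MOI.ScalarAffineTerm(1.0, MOI.VariableIndex(1)),
            MOI.ScalarAffineTerm(1.0, MOI.VariableIndex(2)),
        ],
        0.0,
    )
    index_map = Dict(1 => 1, 2 => 2)                      # model variable index => position in x
    fixed_variables = Dict{MOI.VariableIndex, Float64}()  # no fixed variables here
    extract_constraint(f, [1.0, 2.0], index_map, fixed_variables)  # returns 3.0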
"""
    objective = extract_objective(
        objective::JuMP.AffExpr,
        x::Vector{Float64},
        index_map::Dict{Int64, Int64},
        fixed_variables::Dict{MOI.VariableIndex, Float64}
    )

Convert the objective from a JuMP affine expression into a Julia function of `x`. Supports only linear single-objective functions.

# Arguments
- `objective::JuMP.AffExpr`: the objective function to transform into a Julia function.
- `x::Vector{Float64}`: a vector representing an individual in the metaheuristic population.
- `index_map::Dict{Int64, Int64}`: a dictionary mapping indices in the MathOptInterface model to indices of `x`.
- `fixed_variables::Dict{MOI.VariableIndex, Float64}`: a dictionary containing the values of the fixed variables.
"""
function extract_objective(
    objective::JuMP.AffExpr,
    x::Vector{Float64},
    index_map::Dict{Int64, Int64},
    fixed_variables::Dict{MOI.VariableIndex, Float64},
)
    result = 0
    # Add the constant in the objective function.
    result += objective.constant
    # Add all terms in the objective function with variables iteratively.
    for (var, coef) in objective.terms
        # If the variable is in `index_map`, use its value from `x`; otherwise it is fixed and its original solution value is used.
        if haskey(index_map, var.index.value)
            result += coef * x[index_map[var.index.value]]
        else
            result += coef * fixed_variables[var.index]
        end
    end
    return result
end
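And a matching sketch for the objective side (toy model and values, illustrative variable names; assumes the package's JuMP and MOI imports are in scope):

    model = JuMP.Model()
    JuMP.@variable(model, x)
    JuMP.@variable(model, y)
    obj = 2 * x + y + 5                               # a JuMP.AffExpr
    index_map = Dict(x.index.value => 1, y.index.value => 2)
    fixed_variables = Dict{MOI.VariableIndex, Float64}()
    extract_objective(obj, [1.0, 3.0], index_map, fixed_variables)  # 2*1 + 1*3 + 5 = 10.0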
"""
    objective = create_objective(
        model::JuMP.Model,
        solution::OrderedDict{JuMP.VariableRef, Float64},
        optimality_gap::Float64,
        metric::Distances.SemiMetric,
        index_map::Dict{Int64, Int64},
        fixed_variables::Dict{MOI.VariableIndex, Float64}
    )

Create an objective function supported by Metaheuristics.jl for the alternative-generating problem.

# Arguments
- `model::JuMP.Model`: solved JuMP model of the original LP problem.
- `solution::OrderedDict{JuMP.VariableRef, Float64}`: solution values of the original LP problem, excluding fixed variables.
- `optimality_gap::Float64`: maximum difference between the objective value of the optimal solution and that of alternative solutions.
- `metric::Distances.SemiMetric`: distance metric used to measure the distance between solutions.
- `index_map::Dict{Int64, Int64}`: dictionary mapping indices in the JuMP/MathOptInterface model to indices of `x`.
- `fixed_variables::Dict{MOI.VariableIndex, Float64}`: dictionary containing the values of the fixed variables.
"""
function create_objective(
    model::JuMP.Model,
    solution::OrderedDict{JuMP.VariableRef, Float64},
    optimality_gap::Float64,
    metric::Distances.SemiMetric,
    index_map::Dict{Int64, Int64},
    fixed_variables::Dict{MOI.VariableIndex, Float64},
)
    original_objective = JuMP.objective_function(model)
Review comment: Suggested change.
Reply: Will be resolved in a separate issue.
    solution_values = collect(Float64, values(solution))
    # Compute objective value of the original LP.
    original_objective_value =
        extract_objective(original_objective, solution_values, index_map, fixed_variables)
Review comment: Can you just use something like objective_value(model)? Not fully clear why this new function.
Reply: This can probably indeed be done in a simpler manner, I will investigate.
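For reference, the simpler query suggested in the thread would look roughly like this (assuming `model` has already been optimised):

    # Possible simplification: ask the solver for the objective value directly.
    original_objective_value = JuMP.objective_value(model)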
    # Obtain all constraints from model (including variable bounds).
    constraints = JuMP.all_constraints(model, include_variable_in_set_constraints = true)
    constraint_functions = map(c -> MOI.get(model, MOI.ConstraintFunction(), c), constraints)
    constraint_sets = map(c -> MOI.get(model, MOI.ConstraintSet(), c), constraints)
Review comment: What's a constraint function, what's a constraint set? Better comment on their definitions, e.g., with an example.
Reply: These are defined as such in the MOI documentation. What MOI.ConstraintFunction and MOI.ConstraintSet are is still not clear to me. Maybe you can add a comment explaining this, @marnoldus.
Reply: This link explains them best: https://jump.dev/MathOptInterface.jl/stable/manual/constraints/
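A brief illustration of the function-in-set split discussed above (toy constraint; note that JuMP moves the constant term into the set):

    model = JuMP.Model()
    JuMP.@variable(model, x)
    JuMP.@variable(model, y)
    con = JuMP.@constraint(model, x + y <= 1)
    MOI.get(model, MOI.ConstraintFunction(), con)  # affine function 1.0*x + 1.0*y + 0.0
    MOI.get(model, MOI.ConstraintSet(), con)       # MOI.LessThan{Float64}(1.0)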
    function f(x)
Review comment: Better name?
Reply: I agree, this should be a better name even if this is the name they use in the Metaheuristics package.
        # Objective function for the metaheuristic (= distance between individual x and the solution values of the original LP). `solution_values` does not contain the fixed variables; these are not needed in the objective, as the distance for those variables is zero.
Review comment: Is this (i.e., only using non-fixed variables) also true for optimization?
Reply: It is, it's done in line 31: fix.(fixed_variables, value.(fixed_variables), force = true)
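For context, a minimal sketch of the JuMP call referenced in that reply (`z` stands for an illustrative variable reference, not a name from this PR):

    # Fix a variable to its current solution value so that it is excluded from the search.
    JuMP.fix(z, JuMP.value(z); force = true)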
        # Metaheuristics.jl always minimises, so the distance to the original solution is negated in order to maximise it.
        fx = [-Distances.evaluate(metric, x, solution_values)]
Review comment: Better name?
Reply: Yes, Metaheuristics.jl always minimizes. Should add a comment to explain that.
        # Initialise set of inequality constraints and add objective gap constraint.
        gx = Vector{Float64}(undef, 0)
Review comment: Better name? I realized using names such as fx, gx, hx are norms in math (?) and in Metaheuristics.jl, but maybe in the code it is not a bad idea to make them more explicit.
Reply: I agree. We should refactor.
        # Add objective gap constraint depending on whether original LP is maximised or minimised.
        if JuMP.objective_sense(model) == JuMP.MAX_SENSE
Review comment: Can we pull this into a separate function, so that it can be reused in this file and in alternative-optimization.jl? @marnoldus?
Reply: Adding constraints works very differently for Metaheuristics (adding to array gx) or JuMP (@constraint), so I think it's not much cleaner when done in a separate function, as the bodies of the if and else are very different.
Reply: I suspected so, but the problem now is that if you want to change the formula, you need to change it in two places, although it is the same formula. We can think about how to unify these places in code in a separate issue.
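As a rough sketch of the unification suggested in the last reply (a hypothetical helper, not part of this PR), both branches below encode the same near-optimality constraint in the g(x) <= 0 form expected by Metaheuristics.jl:

    # Hypothetical helper: violation of the optimality-gap constraint for either objective sense.
    function objective_gap_violation(sense, objective_at_x, original_value, gap)
        if sense == JuMP.MAX_SENSE
            return original_value * (1 - gap * sign(original_value)) - objective_at_x
        else
            return objective_at_x - original_value * (1 + gap * sign(original_value))
        end
    end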
            push!(
                gx,
                original_objective_value * (1 - optimality_gap * sign(original_objective_value)) -
                extract_objective(original_objective, x, index_map, fixed_variables),
            )
        else
            push!(
                gx,
                extract_objective(original_objective, x, index_map, fixed_variables) -
                original_objective_value * (1 + optimality_gap * sign(original_objective_value)),
            )
        end
        # Initialise set of equality constraints.
        hx = Vector{Float64}(undef, 0)
        for i in eachindex(constraint_functions)
            c_fun = constraint_functions[i]
            c_set = constraint_sets[i]

            # Check if constraint involves multiple variables or if it is a bound.
            if isa(c_fun, MathOptInterface.ScalarAffineFunction)
                # Add constraint to gx or hx, depending on equality or inequality.
                if isa(c_set, MOI.LessThan)
                    resulting_constraint =
                        extract_constraint(c_fun, x, index_map, fixed_variables) - MOI.constant(c_set)
                    push!(gx, resulting_constraint)

Review comment: Maybe it's just stupid me who is not yet familiar with JuMP/MOI inners, but could you add some examples in the comments to make it more straightforward to see what you're trying to achieve?
Reply: This can use some explanations indeed.
                elseif isa(c_set, MOI.GreaterThan)
                    resulting_constraint =
                        MOI.constant(c_set) - extract_constraint(c_fun, x, index_map, fixed_variables)
                    push!(gx, resulting_constraint)
                elseif isa(c_set, MOI.EqualTo)
                    resulting_constraint =
                        extract_constraint(c_fun, x, index_map, fixed_variables) - MOI.constant(c_set)
                    push!(hx, resulting_constraint)
                elseif isa(c_set, MOI.Interval)
                    constraint = extract_constraint(c_fun, x, index_map, fixed_variables)
                    push!(gx, constraint - c_set.upper)
                    push!(gx, c_set.lower - constraint)
                end
            elseif isa(c_fun, MathOptInterface.VariableIndex)
                # Skip variable if it is fixed.
                if !haskey(index_map, c_fun.value)
                    continue
                end
                # Add bounds to gx or hx, depending on equality or inequality.
                if isa(c_set, MOI.LessThan)
                    push!(gx, x[index_map[c_fun.value]] - MOI.constant(c_set))
                elseif isa(c_set, MOI.GreaterThan)
                    push!(gx, MOI.constant(c_set) - x[index_map[c_fun.value]])
                elseif isa(c_set, MOI.EqualTo)
                    push!(hx, x[index_map[c_fun.value]] - MOI.constant(c_set))
                elseif isa(c_set, MOI.Interval)
                    push!(gx, x[index_map[c_fun.value]] - c_set.upper)
                    push!(gx, c_set.lower - x[index_map[c_fun.value]])
                end
            else
                throw(ArgumentError("Only linear non-vector constraints are supported."))
Review comment: No quadratic function?
Reply: If it is possible to do this on a higher level, maybe it can be abstracted away. I'll investigate if MOI has methods for doing this in a more automated way.
            end
        end
        # gx and hx must each contain 0.0 if there are no inequality or equality constraints, respectively.
Review comment: What about gx?
Reply: We can try a test with only equality constraints and see if this fails for fx. @marnoldus tagging you in case you know the answer.
Reply: That's a mistake on my side, gx should also be [0.0] if no constraints are there.
Reply: @marnoldus could you fix this in this PR?
Reply: I fixed it. Just @ me again if I am needed elsewhere, but I probably won't respond again until Wednesday or Thursday.
        if isempty(gx)
            push!(gx, 0.0)
        end
        if isempty(hx)
            push!(hx, 0.0)
        end
        return fx, gx, hx
    end

    return f
end
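For orientation, the closure returned above follows the (fx, gx, hx) convention of Metaheuristics.jl: a candidate `x` is feasible when every entry of gx is at most zero and every entry of hx is zero. A quick manual check could therefore look like this (illustrative names, a sketch rather than part of the package):

    fx, gx, hx = objective(x_candidate)   # objective = create_objective(...)
    is_feasible = all(gx .<= 0.0) && all(isapprox.(hx, 0.0; atol = 1e-8))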
"""
    bounds = extract_bounds(
        model::JuMP.Model,
        index_map::Dict{Int64, Int64}
    )

Transform the bounds from a JuMP Model into a matrix of bounds readable by Metaheuristics.jl.

# Arguments
- `model::JuMP.Model`: solved JuMP model of the original LP problem.
- `index_map::Dict{Int64, Int64}`: dictionary mapping indices in the JuMP/MathOptInterface model to indices of `x`.
"""
function extract_bounds(model::JuMP.Model, index_map::Dict{Int64, Int64})
    # Initialise bound matrix with all variables between -Inf and Inf.
    n_variables = length(index_map)
    bounds = zeros(Float64, (2, n_variables))
    for i = 1:n_variables
        bounds[1, i] = -Inf
        bounds[2, i] = Inf
    end

    # Obtain all constraints from the model.
    constraints = all_constraints(model, include_variable_in_set_constraints = true)

    for c in constraints
        c_fun = MOI.get(model, MOI.ConstraintFunction(), c)
        # Check if constraint is a bound. In that case add upper and lower bounds depending on the type of constraint.
        if isa(c_fun, MOI.VariableIndex) && haskey(index_map, c_fun.value)
            c_set = MOI.get(model, MOI.ConstraintSet(), c)
            if isa(c_set, MOI.LessThan)
                bounds[2, index_map[c_fun.value]] = MOI.constant(c_set)
            elseif isa(c_set, MOI.GreaterThan)
                bounds[1, index_map[c_fun.value]] = MOI.constant(c_set)
            elseif isa(c_set, MOI.EqualTo)
                bounds[1, index_map[c_fun.value]] = MOI.constant(c_set)
                bounds[2, index_map[c_fun.value]] = MOI.constant(c_set)
            elseif isa(c_set, MOI.Interval)
                bounds[1, index_map[c_fun.value]] = c_set.lower
                bounds[2, index_map[c_fun.value]] = c_set.upper
            end
        end
    end

    return bounds
end
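A small illustration of the resulting matrix (toy model with illustrative bounds; row 1 holds lower bounds, row 2 upper bounds; assumes the package's imports are in scope):

    model = JuMP.Model()
    JuMP.@variable(model, 0 <= x <= 2)
    JuMP.@variable(model, y >= 1)
    index_map = Dict(x.index.value => 1, y.index.value => 2)
    extract_bounds(model, index_map)
    # 2×2 Matrix{Float64}:
    #  0.0  1.0
    #  2.0  Inf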
"""
    problem = create_alternative_generating_problem(
        model::JuMP.Model,
        algorithm::Metaheuristics.Algorithm,
        initial_solution::OrderedDict{VariableRef, Float64},
        optimality_gap::Float64,
        metric::Distances.SemiMetric,
        fixed_variables::Dict{MOI.VariableIndex, Float64}
    )

Create the Metaheuristic problem representing the alternative-generating problem for the original LP.

# Arguments
- `model::JuMP.Model`: JuMP model representing the original LP.
- `algorithm::Metaheuristics.Algorithm`: Metaheuristic algorithm to solve the alternative-generating problem.
- `initial_solution::OrderedDict{VariableRef, Float64}`: (near-)optimal solution to `model`, for which alternatives are sought.
- `optimality_gap::Float64`: maximum gap in objective value between `initial_solution` and alternative solutions.
- `metric::Distances.SemiMetric`: distance metric used to compute the distance between alternative solutions and `initial_solution`.
- `fixed_variables::Dict{MOI.VariableIndex, Float64}`: solution values for fixed variables of the original problem.
"""
function create_alternative_generating_problem(
    model::JuMP.Model,
    algorithm::Metaheuristics.Algorithm,
    initial_solution::OrderedDict{VariableRef, Float64},
    optimality_gap::Float64,
    metric::Distances.SemiMetric,
    fixed_variables::Dict{MOI.VariableIndex, Float64},
)
    # Create map between variable indices in the JuMP model and indices of individuals in the Metaheuristic problem.
Review comment: Is mapping really needed? I can see a point, but it complicates things a bit in the thought process, and I'm not sure if it is really needed.
Reply: This is converting JuMP indices to MOI indices. Maybe we can use JuMP indices directly, or maybe this is implemented in MOI?
    index_map = Dict{Int64, Int64}()
    k = collect(keys(initial_solution))
    for i in eachindex(k)
        index_map[k[i].index.value] = i
    end

    objective =
        create_objective(model, initial_solution, optimality_gap, metric, index_map, fixed_variables)
    bounds = extract_bounds(model, index_map)
    # Possible TODO: Initialise initial_population

    return MetaheuristicProblem(objective, bounds, algorithm)
end
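As a side note on the indexing thread above, a hedged illustration (toy variables; assumes JuMP and DataStructures' OrderedDict are in scope) of the mapping built here; `var.index` is the MOI.VariableIndex behind a JuMP VariableRef, so the map sends its integer value to the variable's position in `x`:

    model = JuMP.Model()
    JuMP.@variable(model, a)
    JuMP.@variable(model, b)
    initial_solution = OrderedDict(a => 0.5, b => 1.5)
    index_map = Dict(var.index.value => i for (i, var) in enumerate(keys(initial_solution)))
    # index_map == Dict(1 => 1, 2 => 2); x[index_map[a.index.value]] is the value taken by `a`.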
"""
    add_solution!(
        problem::MetaheuristicProblem,
        result::Metaheuristics.State,
        metric::Distances.SemiMetric
    )

Modify a Metaheuristic problem representing the alternative-generating problem for the original LP using a newly found alternative solution. This function can be used when one wants to iteratively run a metaheuristic to find alternative solutions one by one.

# Arguments
- `problem::MetaheuristicProblem`: problem to be modified by adding a solution.
- `result::Metaheuristics.State`: result containing the optimal solution to add to the objective function.
- `metric::Distances.SemiMetric`: metric used to evaluate distance between alternatives.
"""
function add_solution!(
    problem::MetaheuristicProblem,
    result::Metaheuristics.State,
    metric::Distances.SemiMetric,
)
    # Create new objective function using the old objective function.
    objective = problem.objective
    solution = minimizer(result)
    function f(x)
        f_old, gx, hx = objective(x)
        # Subtract the distance to the new solution: Metaheuristics.jl minimises, so this maximises the distance to all previously found solutions.
        fx = [f_old[1] - Distances.evaluate(metric, solution, x)]
Review comment: Shouldn't this be + instead of - if I compare with l69 in NearOptimalAlternatives.jl/src/alternative-optimisation.jl at 2-mga-metaheuristics · TulipaEnergy/NearOptimalAlternatives.jl (github.com)? And do we need to say max somewhere?
Reply: This is because in NearOptimalAlternatives.jl/src/alternative-optimisation.jl we are maximizing, but in Metaheuristics.jl we are always minimizing. I do agree that this introduces an inconsistency, though. Maybe we should be minimizing there.
        return fx, gx, hx
    end
    problem.objective = f
end
"""
    result = run_alternative_generating_problem!(
        problem::MetaheuristicProblem
    )

Optimise the `problem` using the specified metaheuristic algorithm and return the result.
"""
function run_alternative_generating_problem!(problem::MetaheuristicProblem)
    result = Metaheuristics.optimize(problem.objective, problem.bounds, problem.algorithm)
    return result
end
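Finally, to tie the pieces together, a hedged end-to-end sketch of how these functions could be driven (toy model; the solver, algorithm, metric, and gap value are illustrative choices, not prescribed by this PR):

    using JuMP, HiGHS, Distances, Metaheuristics, DataStructures
    import MathOptInterface as MOI

    # Toy LP: maximise x + y with x, y in [0, 1].
    model = Model(HiGHS.Optimizer)
    @variable(model, 0 <= x <= 1)
    @variable(model, 0 <= y <= 1)
    @objective(model, Max, x + y)
    optimize!(model)

    initial_solution = OrderedDict(x => value(x), y => value(y))
    fixed_variables = Dict{MOI.VariableIndex, Float64}()   # nothing fixed in this toy example

    problem = create_alternative_generating_problem(
        model,
        Metaheuristics.PSO(),        # illustrative algorithm choice
        initial_solution,
        0.1,                         # allowed relative objective gap
        Distances.SqEuclidean(),
        fixed_variables,
    )
    result = run_alternative_generating_problem!(problem)

    # Exclude the found alternative and search again for the next one.
    add_solution!(problem, result, Distances.SqEuclidean())
    result = run_alternative_generating_problem!(problem)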
Review comment: Calculating the value (not sure about this term) of a constraint instead of transforming a constraint? E.g., this function would return 3 when x=[1,2] for x_1 + x_2 >= 1; according to l27, it only supports ScalarAffineFunction?
Reply: This calculates the value of a scalar affine function that represents a constraint; it is based on the data in MOI.ScalarAffineFunction. Maybe there is an easier way to do that, e.g., a function called something like "apply constraint"? I'll investigate.
Reply: Would VectorAffineFunction work faster? I'll investigate.