Implement generating alternatives using metaheuristics #13
Conversation
@abelsiqueira as you can see, this pull request fails on Julia 1.6. The reason is that the Metaheuristics package I use requires a minimum Julia version of 1.7, and using an earlier version of Metaheuristics.jl gives errors in the current code. What do you think is the best way to resolve this? Should we add a compatibility requirement in the Project.toml that Julia should be at least 1.7? Then we should also update the continuous integration.
Hi @marnoldus, there are two possibilities:
1. Require Julia 1.7 or later for this package (add the compat bound in Project.toml and update the CI accordingly).
2. Keep supporting Julia 1.6 and restrict the Metaheuristics-based functionality to newer Julia versions.
Solution 1 is simpler and more straightforward. I prefer it. Solution 2 involves too many moving parts and is hard to maintain: it requires manual intervention when the LTS version of Julia changes, and it would create the false expectation that the package works normally on 1.6, which is not the case. For me it's a clear choice to drop 1.6, but I'm tagging @clizbe for approval.
Solution 1 sounds better. It just means the rest of Tulipa requires Julia 1.6 or later, while MGA requires Julia 1.7 or later, right?
Yes
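For concreteness, a minimal sketch of what the compat bound discussed above could look like in Project.toml (only the `julia` entry is taken from this conversation; any other bounds would come from the package itself):

```toml
[compat]
# Metaheuristics.jl requires Julia 1.7, so the whole package drops 1.6 support.
julia = "1.7"
```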
Thanks for all the work! I added some comments while reading the code for the first time. Most of them ask for clarification from an outsider's point of view. I can understand most of it after some googling/testing, but it would be nice if there were more comments so I don't have to jump between things 😄. Not sure if you're still around to follow this up @marnoldus, but I'm tagging you both so you are both informed @greg-neustroev. And there is an issue #22 with my comments for the existing code.
Here's my review.
fixed_variables::Dict{MOI.VariableIndex, Float64}
)
Convert a constraint from a MathOptInterface function into a Julia function of x. Supports only ScalarAffineFunction and VariableIndex constraints.
Calculating the value (not sure about this term) of a constraint, rather than transforming a constraint? E.g., this function would return 3 when x = [1, 2] for x_1 + x_2 >= 1. According to l27, it only supports ScalarAffineFunction?
This calculates the value of a scalar affine function that represents a constraint; it is based on the data in MOI.ScalarAffineFunction. Maybe there is an easier way to do that, e.g., a function called something like "apply constraint"? I'll investigate
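To make the discussion concrete, here is a minimal sketch (not the PR's actual implementation) of how the value of an MOI.ScalarAffineFunction could be computed at a point x, assuming MOI 1.x field names and an index_map from MOI variable indices to positions in x:

```julia
import MathOptInterface as MOI

# Hypothetical helper: evaluates an affine function at x, where index_map
# translates MOI variable indices into positions of the vector x.
function evaluate_affine(
    f::MOI.ScalarAffineFunction{Float64},
    x::Vector{Float64},
    index_map::Dict{Int64, Int64},
)
    value = f.constant
    for term in f.terms
        value += term.coefficient * x[index_map[term.variable.value]]
    end
    return value
end
```

For x_1 + x_2 with x = [1.0, 2.0] and an identity index_map, this returns 3.0, matching the example above.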
would VectorAffineFunction work faster? I'll investigate
index_map::Dict{Int64, Int64},
fixed_variables::Dict{MOI.VariableIndex, Float64},
)
original_objective = JuMP.objective_function(model)
Suggested change:
original_objective = JuMP.objective_function(model)
original_objective_function = JuMP.objective_function(model)
Will be resolved in a separate issue
solution_values = collect(Float64, values(solution))
# Compute objective value of original LP.
original_objective_value =
    extract_objective(original_objective, solution_values, index_map, fixed_variables)
Can you just use something like objective_value(model)? It's not fully clear to me why this new function is needed.
This can probably indeed be done in a simpler manner; I will investigate.
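A minimal sketch of the reviewer's suggestion on a toy model (HiGHS is chosen here only for illustration, not necessarily the solver the package uses): after JuMP.optimize!, the optimal objective of the original LP can be read directly with objective_value.

```julia
using JuMP, HiGHS  # HiGHS used only as an example solver

model = Model(HiGHS.Optimizer)
@variable(model, 0 <= x <= 1)
@objective(model, Max, 2x)
optimize!(model)

# Optimal objective of the original LP, read directly from the solved model.
original_objective_value = objective_value(model)  # 2.0
```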
# Obtain all constraints from model (including variable bounds).
constraints = JuMP.all_constraints(model, include_variable_in_set_constraints = true)
constraint_functions = map(c -> MOI.get(model, MOI.ConstraintFunction(), c), constraints)
constraint_sets = map(c -> MOI.get(model, MOI.ConstraintSet(), c), constraints)
What's a constraint function, and what's a constraint set? It would be better to comment on their definitions, e.g., with an example.
These are defined as such in the MOI documentation. What MOI.ConstraintFunction and MOI.ConstraintSet are is still not clear to me. Maybe you can add a comment explaining this, @marnoldus.
This link explains them best: https://jump.dev/MathOptInterface.jl/stable/manual/constraints/
For linear constraints, the Function describes whether it is a variable bound or a constraint involving multiple variables and corresponds to the lhs of the constraint, while the Set describes whether it's LessThan, GreaterThan, or EqualTo and holds the rhs constant.
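As an illustration of the function/set split described in that link (the model and variable names here are only for the example):

```julia
using JuMP
import MathOptInterface as MOI

model = Model()
@variable(model, x)
@variable(model, y)
@variable(model, z >= 0)                 # a variable bound
@constraint(model, con, 2x + 3y <= 5)    # lhs is affine, rhs goes into the set

MOI.get(model, MOI.ConstraintFunction(), con)               # ScalarAffineFunction: 2x + 3y
MOI.get(model, MOI.ConstraintSet(), con)                    # MOI.LessThan(5.0)
MOI.get(model, MOI.ConstraintFunction(), LowerBoundRef(z))  # MOI.VariableIndex of z
MOI.get(model, MOI.ConstraintSet(), LowerBoundRef(z))       # MOI.GreaterThan(0.0)
```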
push!(gx, c_set.lower - x[index_map[c_fun.value]])
end
else
throw(ArgumentError("Only linear non-vector constraints are supported."))
No quadratic function support?
If it is possible to do this on a higher level, maybe it can be abstracted away. I'll investigate if MOI has methods for doing this in a more automated way
src/alternative-metaheuristics.jl
end
end

# hx should contain 0.0 if no equality constraints
what about gx?
we can try a test with only equality constraints and see if this fails for fx. @marnoldus tagging you in case you know the answer
That's a mistake on my side, gx should also be [0.0] if no constraints are there.
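For reference, a small self-contained sketch of the convention being fixed here, as I understand the Metaheuristics.jl API: for constrained problems the objective returns (fx, gx, hx), and both gx and hx should be [0.0] when there are no inequality/equality constraints.

```julia
using Metaheuristics

# Toy problem: minimise a sum of squares with no real constraints.
function f(x)
    fx = sum(abs2, x)  # objective value
    gx = [0.0]         # placeholder: no inequality constraints
    hx = [0.0]         # placeholder: no equality constraints
    return fx, gx, hx
end

# Bounds: first row lower bounds, second row upper bounds.
bounds = [-5.0 -5.0; 5.0 5.0]
result = optimize(f, bounds, ECA())
```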
@marnoldus could you fix this in this PR?
I fixed it. Just @ me again if I am needed elsewhere, but I probably won't respond again until Wednesday or Thursday.
metric::Distances.SemiMetric,
fixed_variables::Dict{MOI.VariableIndex, Float64},
)
# Create map between variable indices in JuMP model and indices in individuals of Metaheuristic problem.
Is mapping really needed? I can see a point, but it complicates things a bit in the thought process, and I'm not sure if it is really needed.
This is converting JuMP indices to MOI indices. Maybe we can use JuMP indices directly, or maybe this is already implemented in MOI?
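A hypothetical sketch of the kind of mapping being discussed, assuming the goal is to map each variable's MOI index to a consecutive position 1..n in a Metaheuristics individual x:

```julia
using JuMP

model = Model()
@variable(model, a)
@variable(model, b)

# Map the MOI index value of each JuMP variable to its position in x.
index_map = Dict{Int64, Int64}(
    JuMP.index(v).value => i for (i, v) in enumerate(all_variables(model))
)
```

Whether JuMP indices could be used directly instead, as suggested, is left open here.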
solution = minimizer(result)
function f(x)
    f_old, gx, hx = objective(x)
    fx = [f_old[1] - Distances.evaluate(metric, solution, x)]
Shouldn't this be + instead of - if I compare with l69 of src/alternative-optimisation.jl on the 2-mga-metaheuristics branch of TulipaEnergy/NearOptimalAlternatives.jl? And do we need to say max somewhere?
This is because in NearOptimalAlternatives.jl/src/alternative-optimisation.jl we are maximizing, but in Metaheuristics.jl we are always minimizing. I do agree that this introduces an inconsistency, though. Maybe we should be minimizing there.
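A small illustration of the sign convention described above (the names and values are illustrative, not taken from the PR): maximising the distance to the optimal solution in the JuMP formulation corresponds to minimising its negative in Metaheuristics.jl.

```julia
using Distances

solution = [1.0, 2.0]  # optimum of the original problem (toy values)
metric = SqEuclidean()

# JuMP formulation:           max  d(solution, x)
# Metaheuristics formulation: min -d(solution, x)
f_meta(x) = -Distances.evaluate(metric, solution, x)

f_meta([1.0, 2.0])  # 0.0 at the optimum, more negative further away
```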
Co-authored-by: Ni Wang <[email protected]>
# Initialise set of inequality constraints.
gx = Vector{Float64}(undef, 0)
# Add objective gap constraint depending on whether original LP is maximised or minimised.
if JuMP.objective_sense(model) == JuMP.MAX_SENSE
Can we pull this into a separate function, so that it can be reused in this file and in alternative-optimisation.jl, @marnoldus?
Adding constraints works very differently for Metaheuristics (pushing to the array gx) and JuMP (@constraint), so I think it's not much cleaner when done in a separate function, as the bodies of the if and else are very different.
I suspected so, but the problem now is that if you want to change the formula, you need to change it in two places, although it is the same formula. We can think about how to unify these places in code in a separate issue.
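To illustrate the duplication being discussed (the names and gap convention below are assumed for the example, not taken from the PR): for a maximised objective with optimum optimal_value and relative gap epsilon, near-optimality means objective(x) >= (1 - epsilon) * optimal_value, which currently has to be encoded once as a JuMP constraint and once as a gx entry.

```julia
# JuMP side: the gap is a model constraint, e.g.
#   @constraint(model, objective_expr >= (1 - epsilon) * optimal_value)

# Metaheuristics side: the gap is an inequality entry of gx, feasible when <= 0.
objective_gap_violation(objective_at_x, optimal_value, epsilon) =
    (1 - epsilon) * optimal_value - objective_at_x
```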
Great! Thanks for this PR.
Pull request details
Describe the changes made in this pull request
Implements finding alternative solutions using metaheuristics. Creates an alternative generating problem from a JuMP model that can be solved by an algorithm implemented in Metaheuristics.jl.
List of related issues or pull requests
Closes #2
Collaboration confirmation
As a contributor I confirm