
Implement generating alternatives using metaheuristics #13

Merged · 13 commits · Jun 17, 2024

Conversation

@marnoldus (Collaborator) commented May 24, 2024:

Pull request details

Describe the changes made in this pull request

Implements finding alternative solutions using metaheuristics: it creates an alternative-generating problem from a JuMP model that can be solved by an algorithm implemented in Metaheuristics.jl.

List of related issues or pull requests

Closes #2

Collaboration confirmation

As a contributor I confirm

  • The documentation is up to date with the changes introduced in this Pull Request (or NA)
  • Tests are passing
  • Lint is passing

@marnoldus (Collaborator, Author):

@abelsiqueira as you can see, this pull request fails on Julia 1.6. The reason is that the Metaheuristics package I use requires a minimum Julia version of 1.7; using an earlier version of Metaheuristics.jl gives errors in the current code. What do you think is the best way to resolve this? Should we add a compatibility requirement in the Project.toml that Julia should be at least 1.7? Then we should also update the continuous integration.

@abelsiqueira (Member):

Hi @marnoldus, there are two possibilities:

  1. What you described, change 1.6 to 1.7 in all places it appears.
  2. Use package extensions to only load the relevant metaheuristics functions if the Metaheuristics package is loaded (which should be impossible in 1.6 because it's broken), and/or check the version number and act accordingly at package load time.

Solution 1 is simpler and more straightforward. I prefer it.
The drawback is that the other packages require 1.6, so there is some incompatibility: expectations are broken.

Solution 2 involves too many moving parts and is hard to maintain. It requires manual intervention when the LTS version of Julia changes, and it will create the false expectation that the package works normally on 1.6, which is not the case.

For me it's a clear choice to drop 1.6, but I'm tagging @clizbe for approval.

@clizbe (Member) commented May 28, 2024:

Solution 1 sounds better. It just means the rest of Tulipa requires Julia 1.6 or later, while MGA requires Julia 1.7 or later, right?

@abelsiqueira (Member):

Yes

@marnoldus marnoldus marked this pull request as draft May 28, 2024 11:55
@marnoldus marnoldus marked this pull request as ready for review May 28, 2024 13:41
@gnawin (Member) commented Jun 5, 2024:

Thanks for all the work! I added some comments while reading the code for the first time. Most of them ask for clarification from an outsider's point of view. I can understand most of them after some googling/testing, but it would be nice to have more comments so I don't have to jump between things 😄.

Not sure if you're still around to follow this up, @marnoldus, but I'm tagging you both so you are both informed, @greg-neustroev.

There is also issue #22 with my comments on the existing code.

@gnawin (Member) left a comment:

Here's my review.

fixed_variables::Dict{MOI.VariableIndex, Float64}
)

Convert a constraint from a MathOptInterface function into a Julia function of x. Supports only ScalarAffineFunction and VariableIndex constraints.
@gnawin (Member):

Calculating the value (not sure about this term) of a constraint, rather than transforming a constraint? E.g., this function would return 3 when x = [1, 2] for x_1 + x_2 >= 1. According to l27, it only supports ScalarAffineFunction?

@greg-neustroev (Contributor) commented Jun 17, 2024:

This calculates the value of a scalar affine function that represents a constraint; it is based on the data in MOI.ScalarAffineFunction. Maybe there is an easier way to do that, e.g., a function called something like "apply constraint"? I'll investigate.
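For readers following along: an MOI.ScalarAffineFunction is essentially a list of (coefficient, variable) terms plus a constant, so evaluating it at a point is a dot product plus that constant. A minimal pure-Julia sketch of the idea (the type and function names here are illustrative, not the PR's actual code):

```julia
# A scalar affine function a'x + b, mirroring the data stored in
# MOI.ScalarAffineFunction: one coefficient per variable index, plus a constant.
struct AffineFunction
    coefficients::Dict{Int,Float64}  # variable index => coefficient
    constant::Float64
end

# Evaluate the affine function at a point x (indexed by variable index).
function evaluate_affine(f::AffineFunction, x::Vector{Float64})
    return sum(c * x[i] for (i, c) in f.coefficients; init = 0.0) + f.constant
end

# The example from the review comment: x_1 + x_2 at x = [1, 2] gives 3.
f = AffineFunction(Dict(1 => 1.0, 2 => 1.0), 0.0)
println(evaluate_affine(f, [1.0, 2.0]))  # 3.0
```
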

@greg-neustroev (Contributor):

Would VectorAffineFunction work faster? I'll investigate.

index_map::Dict{Int64, Int64},
fixed_variables::Dict{MOI.VariableIndex, Float64},
)
original_objective = JuMP.objective_function(model)
@gnawin (Member):
Suggested change
original_objective = JuMP.objective_function(model)
original_objective_function = JuMP.objective_function(model)

@greg-neustroev (Contributor):
Will be resolved in a separate issue

src/alternative-metaheuristics.jl (outdated; resolved)
solution_values = collect(Float64, values(solution))
# Compute objective value or original LP.
original_objective_value =
extract_objective(original_objective, solution_values, index_map, fixed_variables)
@gnawin (Member):
Can you just use something like objective_value(model)? It's not fully clear why this new function is needed.

@greg-neustroev (Contributor):
This can probably indeed be done in a simpler manner; I will investigate.

# Obtain all constraints from model (including variable bounds).
constraints = JuMP.all_constraints(model, include_variable_in_set_constraints = true)
constraint_functions = map(c -> MOI.get(model, MOI.ConstraintFunction(), c), constraints)
constraint_sets = map(c -> MOI.get(model, MOI.ConstraintSet(), c), constraints)
@gnawin (Member):
What's a constraint function, and what's a constraint set? It would be better to comment on their definitions, e.g., with an example.

@greg-neustroev (Contributor):
These are defined as such in the MOI documentation. What MOI.ConstraintFunction and MOI.ConstraintSet are is still not clear to me. Maybe you can add a comment explaining this, @marnoldus.

@marnoldus (Collaborator, Author):
This link explains them best: https://jump.dev/MathOptInterface.jl/stable/manual/constraints/
For linear constraints, the Function basically describes whether it is a variable bound or a constraint involving multiple variables and corresponds to the LHS of the constraint, and the Set describes whether it's LessThan, GreaterThan, or EqualTo and holds the RHS constant.
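To illustrate the function/set split with a toy example (the types below are simplified stand-ins for MOI.ScalarAffineFunction and MOI.LessThan, not the real MOI types):

```julia
# Minimal mimic of MOI's function-in-set representation (illustrative only).
struct LessThan
    upper::Float64                   # the right-hand-side constant
end

struct AffineLHS
    coefficients::Dict{Int,Float64}  # left-hand side: variable index => coefficient
end

# The constraint 2x_1 + x_2 <= 3 is stored as a (function, set) pair:
con_function = AffineLHS(Dict(1 => 2.0, 2 => 1.0))
con_set = LessThan(3.0)

# Checking feasibility at a point then means: evaluate the function, compare to the set.
lhs(f::AffineLHS, x) = sum(c * x[i] for (i, c) in f.coefficients; init = 0.0)
is_feasible(f, s::LessThan, x) = lhs(f, x) <= s.upper

println(is_feasible(con_function, con_set, [1.0, 1.0]))  # true  (2 + 1 <= 3)
println(is_feasible(con_function, con_set, [2.0, 0.0]))  # false (4 > 3)
```
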

push!(gx, c_set.lower - x[index_map[c_fun.value]])
end
else
throw(ArgumentError("Only linear non-vector constraints are supported."))
@gnawin (Member):
No quadratic functions?

@greg-neustroev (Contributor):
If it is possible to do this at a higher level, maybe it can be abstracted away. I'll investigate whether MOI has methods for doing this in a more automated way.

end
end

# hx should contain 0.0 if there are no equality constraints.
@gnawin (Member):
What about gx?

@greg-neustroev (Contributor):
We can try a test with only equality constraints and see if this fails for fx. @marnoldus, tagging you in case you know the answer.

@marnoldus (Collaborator, Author):
That's a mistake on my side; gx should also be [0.0] if there are no constraints.

@greg-neustroev (Contributor):
@marnoldus could you fix this in this PR?

@marnoldus (Collaborator, Author):
I fixed it. Just @ me again if I am needed elsewhere, but I probably won't respond again until Wednesday or Thursday.
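For context, Metaheuristics.jl-style constrained objectives return a tuple (fx, gx, hx), and the fix discussed above amounts to padding empty constraint vectors with a trivially satisfied 0.0. A sketch with a hypothetical helper name:

```julia
# gx holds inequality-constraint values (feasible when <= 0) and hx holds
# equality-constraint values (feasible when == 0). Both vectors must be
# non-empty, so insert a trivially satisfied 0.0 placeholder when the model
# has no constraints of that kind. (Hypothetical helper name.)
function pad_constraints!(gx::Vector{Float64}, hx::Vector{Float64})
    isempty(gx) && push!(gx, 0.0)
    isempty(hx) && push!(hx, 0.0)
    return gx, hx
end

gx, hx = pad_constraints!(Float64[], [1.5])
println(gx)  # [0.0]
println(hx)  # [1.5]
```
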

metric::Distances.SemiMetric,
fixed_variables::Dict{MOI.VariableIndex, Float64},
)
# Create map between variable indices in JuMP model and indices in individuals of Metaheuristic problem.
@gnawin (Member):

Is the mapping really needed? I can see a point, but it complicates the thought process a bit, and I'm not sure if it is really needed.

@greg-neustroev (Contributor):

This is converting JuMP indices to MOI indices. Maybe we can use JuMP indices directly, or maybe this is implemented in MOI?

solution = minimizer(result)
function f(x)
f_old, gx, hx = objective(x)
fx = [f_old[1] - Distances.evaluate(metric, solution, x)]
@gnawin (Member):

Shouldn't this be + instead of - if I compare with l69 in NearOptimalAlternatives.jl/src/alternative-optimisation.jl at 2-mga-metaheuristics · TulipaEnergy/NearOptimalAlternatives.jl? And do we need to say max somewhere?

@greg-neustroev (Contributor):

This is because in NearOptimalAlternatives.jl/src/alternative-optimisation.jl we are maximizing, but in Metaheuristics.jl we are always minimizing. I do agree that this introduces an inconsistency, though. Maybe we should be minimizing there.
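The sign flip can be seen in a small sketch: since Metaheuristics.jl minimises, maximising the distance to the known solution is expressed by subtracting (negating) it. Assuming Euclidean distance and illustrative names, not the PR's exact code:

```julia
# Metaheuristics.jl always minimises, so "maximise the distance to the known
# solution" becomes "minimise the negated distance". (Illustrative sketch.)
euclidean(a, b) = sqrt(sum((a .- b) .^ 2))

# Minimisation objective: smaller is better, so subtracting the distance
# rewards points that are far away from `solution`.
mga_objective(x, solution) = -euclidean(solution, x)

solution = [0.0, 0.0]
x_near = [1.0, 1.0]
x_far  = [5.0, 5.0]
# The far point scores lower (better under minimisation) than the near point.
println(mga_objective(x_far, solution) < mga_objective(x_near, solution))  # true
```
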

test/test-alternative-metaheuristics.jl (outdated; resolved)
greg-neustroev and others added 3 commits June 17, 2024 13:14
Co-authored-by: Ni Wang <[email protected]>
Co-authored-by: Ni Wang <[email protected]>
Co-authored-by: Ni Wang <[email protected]>
# Initialise set of inequality constraints.
gx = Vector{Float64}(undef, 0)
# Add objective gap constraint depending on whether original LP is maximised or minimised.
if JuMP.objective_sense(model) == JuMP.MAX_SENSE
@greg-neustroev (Contributor):

Can we pull this into a separate function, so that it can be reused in this file and in alternative-optimization.jl, @marnoldus?

@marnoldus (Collaborator, Author):

Adding constraints works very differently for Metaheuristics (adding to the array gx) and JuMP (@constraint), so I think it's not much cleaner when done in a separate function, as the bodies of the if and else are very different.

@greg-neustroev (Contributor):

I suspected so, but the problem now is that if you want to change the formula, you need to change it in two places, even though it is the same formula. We can think about how to unify these places in the code in a separate issue.
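One possible shape for such a shared helper, sketched in plain Julia with an illustrative name and a relative-gap formula that is an assumption here, not necessarily the PR's exact formula:

```julia
# Near-optimality ("objective gap") constraint expressed as a single
# Metaheuristics-style inequality g(x) <= 0, for a relative gap `epsilon`
# around the known optimum. The direction depends on the optimisation sense.
function objective_gap_constraint(obj_value, optimum, epsilon, maximising::Bool)
    if maximising
        # objective must stay above (1 - epsilon) * optimum
        return (1 - epsilon) * optimum - obj_value
    else
        # objective must stay below (1 + epsilon) * optimum
        return obj_value - (1 + epsilon) * optimum
    end
end

# Minimising with optimum 100 and epsilon = 0.1: values up to 110 are feasible.
println(objective_gap_constraint(105.0, 100.0, 0.1, false) <= 0)  # true
println(objective_gap_constraint(120.0, 100.0, 0.1, false) <= 0)  # false
```

The same scalar value could be pushed onto gx on the Metaheuristics side or wrapped in a @constraint on the JuMP side, which is the reuse being discussed.
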

@gnawin (Member) commented Jun 17, 2024:

Great! Thanks for this PR.

@gnawin gnawin merged commit 0c6eee8 into main Jun 17, 2024
9 checks passed
Successfully merging this pull request may close these issues: Implement MGA using metaheuristics.
5 participants