Support for MathOptInterface.AbstractOptimizer that do not support MathOptInterface.NLPBlock #369

Closed
ValentinKaisermayer opened this issue Sep 1, 2022 · 7 comments · Fixed by #381

Comments

@ValentinKaisermayer
Contributor

No description provided.

@ValentinKaisermayer
Contributor Author

There are two possibilities for this:

  • Use JuMP variable refs and put them through the generated objective and constraint functions. If those are affine and/or quadratic, the result will be affine or quadratic expressions (see the sketch below). This would introduce a dependency on JuMP.
  • Use the expressions and generate the affine and quadratic MOI terms directly from them.
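
A minimal sketch of option one, assuming the objective generated from the OptimizationProblem is available as an ordinary Julia function (the name obj and its signature here are hypothetical); pushing JuMP VariableRefs through it yields AffExpr/QuadExpr objects via operator overloading:

```julia
using JuMP

# Hypothetical objective as generated from an OptimizationProblem: f(u, p).
obj(u, p) = p[1] * u[1]^2 + p[2] * u[2] + 3.0

model = Model()
@variable(model, x[1:2])

# Because JuMP overloads +, * and ^, calling the function on VariableRefs
# returns a QuadExpr here (an affine objective would give an AffExpr).
expr = obj(x, [2.0, 5.0])
@objective(model, Min, expr)
```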

I prefer option two.

@ChrisRackauckas Would it be ok to add Symbolics.jl as a dependency for this? The other possibility is to generate the terms from the Julia expression. In both cases I would have to simplify and expand first, in order to be sure that it is really just an affine and/or quadratic expression. For a symbolic expression there is simplify. How could I achieve this from a Julia expression directly?
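
For the symbolic route, a small sketch of what expand plus degree could give, under the assumption that the expression is polynomial at all (non-polynomial terms would need to be rejected separately):

```julia
using Symbolics

@variables x y
ex = (x + y)^2 + 3x

# Expand first so that the expression is a plain sum of monomials.
ex2 = Symbolics.expand(ex)

# Total polynomial degree: <= 1 means affine, == 2 means quadratic.
Symbolics.degree(ex2)  # 2
```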

Another issue I see when using the Julia expressions in the problem is that the parameters have already been replaced, so no reinit! (as discussed in #352) would be possible. This is of course true for the current use case of the expressions as well.

@ChrisRackauckas
Member

Yes, adding Symbolics.jl for this is good. Hessian sparsity detection would tell you if it's linear, and then you calculate the Hessian and check whether it is constant to tell whether it's quadratic. @shashi might have a better way via the tracer to make it return true/false for quadratic. In theory the algorithm would already know this because it replaces nonlinear expressions with polynomials, and if it ever does that it could flip a bool.
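
A sketch of that check with Symbolics (hessian_sparsity for linearity, then the full Hessian for the quadratic case); the flags below are just illustrative names:

```julia
using Symbolics
using SparseArrays: nnz

@variables x y
f = 3x^2 + 2x*y + y + 1
vars = [x, y]

# No structural second derivatives at all => the expression is affine.
is_affine = nnz(Symbolics.hessian_sparsity(f, vars)) == 0

# Otherwise compute the Hessian; if no variables remain in it,
# the expression is (at most) quadratic.
H = Symbolics.hessian(f, vars)
is_quadratic = !is_affine && isempty(Symbolics.get_variables(sum(H)))
```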

@ValentinKaisermayer
Contributor Author

> it replaces nonlinear expressions with polynomials, and if it ever does that it could flip a bool.

This seems more like a job for MTK, but it would be very cool indeed and somewhat cutting edge.

@manuelbb-upb

I recently thought about this, too, when discovering coeff and degree in Symbolics.
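
For reference, a tiny example of what coeff and degree report on an expanded quadratic:

```julia
using Symbolics

@variables x y

# Expand first so the coefficients of the individual monomials are exposed.
ex = Symbolics.expand((2x + 3y)^2)

Symbolics.degree(ex)      # 2 (total degree, so at most quadratic)
Symbolics.coeff(ex, x^2)  # 4
Symbolics.coeff(ex, x*y)  # 12
Symbolics.coeff(ex, y^2)  # 9
```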

Coming from JuMP, I think there are some issues in Symbolics and SymbolicUtils with vector algebra that we can keep in mind to make setting up linear or quadratic problems easy and to avoid unexpected errors with scalarization or simplification.
I opened a few pull requests to (try to) address some of the issues I encountered when exploring Symbolics for this use case.

As I am still relatively new to the whole ecosystem, some of the proposed changes might very well be very dumb. I just wanted to link them because they popped up when thinking about Symbolics for LP and QP.

PS: if re-instantiation of an optimization problem with different parameters can be made to work, that would be great, because currently it is not well supported in ParametricOptInterface; e.g., sum((param_matrix * variable_vector).^2) does not work there...

@ValentinKaisermayer
Contributor Author

The re-initialization will be handled via a caching interface, see #406, also for MOI.
I started a PR for the MOI solver interface for LPs and QPs in #381. It works via expressions and low-level MOI calls, but I feel it would be better to use MTK for the simplification and for checking whether the problem is linear or quadratic, and to just use JuMP directly.

@manuelbb-upb

I have just skimmed parts of the implementation in #381. Personally, I don't see the harm in using low-level MOI calls, but the Expr parsing indeed looks like something to outsource. Could the parsing and simplification be handled by Symbolics alone (avoiding MTK as a dependency)? Then coeff and degree could perhaps be used to define the low-level MOI structures, as long as the parameter and constant metadata is there.
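
A rough sketch of that idea for a purely affine expression; the variable-to-index mapping (here called index) is a hypothetical stand-in for whatever Optimization.jl would provide, and the constant part is recovered by substituting zeros rather than relying on a particular coeff corner case:

```julia
using Symbolics
import MathOptInterface as MOI

@variables x y
ex = Symbolics.expand(2x + 3y + 1)

# Hypothetical mapping from the symbolic variables to MOI variable indices.
index = Dict(x => MOI.VariableIndex(1), y => MOI.VariableIndex(2))

# One ScalarAffineTerm per variable, with the coefficient extracted symbolically.
terms = [MOI.ScalarAffineTerm(Float64(Symbolics.value(Symbolics.coeff(ex, v))), vi)
         for (v, vi) in index]

# The constant part: substitute zero for every decision variable.
constant = Float64(Symbolics.value(
    Symbolics.substitute(ex, Dict(v => 0.0 for v in keys(index)))))

moi_obj = MOI.ScalarAffineFunction(terms, constant)
```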

Please excuse me if these are silly questions, as I am not well versed in the internals of Optimization.jl and MTK 🙈
What I originally thought about was using Symbolics as an alternative interface to MOI, independent even of Optimization.jl: set up a caching MOI Optimizer that can utilize all the cool bridges for LP and QP, but generate the MOI functions with Symbolics instead of JuMP. Also borrow the MTK parameter metadata (separate mini-package?) to distinguish variables from parameters, and then have some substitution mechanism for expressions like sum((param_matrix * variable_vector).^2) which don't yet work with ParametricOptInterface and JuMP.
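
For the Optimizer side of that thought experiment, the caching-plus-bridges stack can be assembled from MOI utilities alone; in this sketch MOI.Utilities.MockOptimizer stands in for a real LP/QP solver:

```julia
import MathOptInterface as MOI

# Stand-in for an actual LP/QP solver's Optimizer.
inner = MOI.Utilities.MockOptimizer(MOI.Utilities.Model{Float64}())

# Add the bridge layer and a cache in front of it, similar to what JuMP builds.
bridged = MOI.Bridges.full_bridge_optimizer(inner, Float64)
cached = MOI.Utilities.CachingOptimizer(
    MOI.Utilities.UniversalFallback(MOI.Utilities.Model{Float64}()), bridged)

# Low-level problem definition, e.g. with functions generated from Symbolics.
x = MOI.add_variable(cached)
MOI.set(cached, MOI.ObjectiveSense(), MOI.MIN_SENSE)
f = MOI.ScalarAffineFunction([MOI.ScalarAffineTerm(2.0, x)], 1.0)
MOI.set(cached, MOI.ObjectiveFunction{typeof(f)}(), f)
```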
In the end, it was merely a thought experiment without pressing priority, but much of what is in #381 could be re-used for something like this.

@ValentinKaisermayer
Contributor Author

I have to say the idea is actually quite nice. JuMP lacks an interface for parameters, at least a real one; ParametricOptInterface cannot handle more complex expressions.
I have experimented with that already, and I think it would not be difficult to build such an interface: given an OptimizationProblem, replace the parameters, analyse the expression, and, if it is linear or quadratic, build a JuMP.AffExpr or JuMP.QuadExpr, otherwise a nonlinear expression. Something like a constructor for a JuMP.Model from an OptimizationProblem.
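
A hedged sketch of the quadratic branch of such a constructor, with the coefficient data (Q, q, c) standing in for whatever the expression analysis would extract:

```julia
using JuMP

model = Model()
@variable(model, x[1:2])

# Hypothetical result of analysing the (parameter-substituted) objective:
# 0.5 * x' * Q * x + q' * x + c
Q = [2.0 0.0; 0.0 4.0]
q = [1.0, -1.0]
c = 3.0

# Assemble the corresponding JuMP.QuadExpr term by term.
expr = zero(QuadExpr) + c
for i in 1:2, j in 1:2
    add_to_expression!(expr, 0.5 * Q[i, j], x[i], x[j])
end
for i in 1:2
    add_to_expression!(expr, q[i], x[i])
end
@objective(model, Min, expr)
```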
