
Conversation

@amontoison (Member) commented Dec 1, 2025

Add flags in NLPModelsMeta and NLSMeta to specify whether gradient, sparse Jacobians, sparse Hessians, and operator-based products are available in a model.
In some models, we do not want to, or cannot, implement the complete NLPModels.jl API.

Examples:

  • ADNLPModels.jl: we do not want to set up some AD backends if they are not needed (see ADNLPModels.jl#360, "Disable gradient and Hessian backends for NLSModels (part 2)").
  • NLPModelsJuMP.jl: the user can specify from JuMP which subset of derivatives is needed, and the new VectorNonlinearOracle structure in MOI does not support operator–vector products.
  • Custom AbstractNLPModel implementations: at Argonne, we have some models involving neural networks where only the gradient is available (cc Sarah).

This is an issue for solvers: MadNLP.jl or MadNCL.jl, for example, expect jtprod to be implemented but cannot easily know whether it is available before calling it.
A similar issue occurs with UnoSolver.jl, which relies on the BQPD subsolver by default and requires hprod.
Knowing that the Lagrangian Hessian is unavailable can also let solvers like NLPModelsIpopt.jl or NLPModelsKnitro.jl automatically switch to quasi-Newton approximations.

Using these new attributes also helps an oracle choose the most appropriate solver, and ensures that a clean error is returned when a solver cannot be used with a given model (JSOSuite.jl?).
This is preferable to triggering a MethodError for a missing method.

This addition should be non-breaking (the full API is considered available by default) and should resolve a number of issues in dependent packages.
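As a sketch of the intended pattern (the struct, field names, and helper below are illustrative stand-ins, not the final NLPModelMeta API), a solver could consult a Boolean flag before calling an operation and raise a clean error instead of a MethodError:

```julia
# Illustrative only: stand-in for the proposed availability flags in
# NLPModelMeta / NLSMeta (field names are assumptions; defaults are `true`).
Base.@kwdef struct AvailabilityFlags
    grad_available::Bool = true
    jprod_available::Bool = true
    jtprod_available::Bool = true
    hprod_available::Bool = true
    hess_available::Bool = true
end

# A solver that needs jtprod can check the flag up front and return a
# clean error, rather than hitting a MethodError at the first call.
function check_jtprod(flags::AvailabilityFlags)
    flags.jtprod_available ||
        error("jtprod is not available for this model; pick another solver.")
    return true
end

default_flags = AvailabilityFlags()                          # full API assumed
limited_flags = AvailabilityFlags(jtprod_available = false)  # e.g. a gradient-only model
```

Because every flag defaults to `true`, existing models that do not set the new keywords keep the current behavior.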

We still need to address #523, since these flags cannot be used consistently across the ecosystem until the full migration to 0.22 (which may take months given the number of dependent packages).
I expect it to land in a patch release of 0.21.x.

cc @frapac @sshin23 @swilliamson7 @cvanaret @odow

@github-actions bot (Contributor) commented Dec 1, 2025

Breakage report on dependent packages (latest / stable):
ADNLPModels
AdaptiveRegularization
AmplNLReader
BundleAdjustmentModels
CUTEst
CaNNOLeS
DCISolver
FletcherPenaltySolver
FluxNLPModels
JSOSolvers
JSOSuite
LLSModels
ManualNLPModels
NLPModelsIpopt
NLPModelsJuMP
NLPModelsKnitro
NLPModelsModifiers
NLPModelsTest
NLSProblems
PDENLPModels
PartiallySeparableNLPModels
PartiallySeparableSolvers
Percival
QuadraticModels
RegularizedProblems
SolverBenchmark
SolverCore
SolverTest
SolverTools

@amontoison (Member, Author) commented Dec 1, 2025

  • Documentation is broken because we rely on ADNLPModels.jl and can't use release 0.22 until a new ADNLPModels.jl release supports it. A few dependent packages also need to be updated: Update for NLPModels 0.22 (ADNLPModels.jl#363).
  • All breakage tests are failing for a similar reason.

@amontoison amontoison changed the title Add availability flags in NLPModelMeta Add availability flags in NLPModelMeta and NLSMeta Dec 1, 2025
@swilliamson7 commented

@amontoison Does this mean my initialization of meta needs to contain new keywords, e.g.

    meta = NLPModelMeta(Lux.parameterlength(Slr.Diag.CNNVars.model_Su) + Lux.parameterlength(Slr.Diag.CNNVars.model_Sv);
        ncon=0,
        nnzh=0,
        x0=param_guess,
        lvar=lvar,
        uvar=uvar, 
        **gradient_available=true**,
        **hessian_available=false**
    )

where I've highlighted the changes with ** (bold doesn't seem to be available in code blocks)

@amontoison (Member, Author) commented Dec 1, 2025

> @amontoison Does this mean my initialization of meta needs to contain new keywords, e.g.
>
>     meta = NLPModelMeta(Lux.parameterlength(Slr.Diag.CNNVars.model_Su) + Lux.parameterlength(Slr.Diag.CNNVars.model_Sv);
>         ncon=0,
>         nnzh=0,
>         x0=param_guess,
>         lvar=lvar,
>         uvar=uvar,
>         **gradient_available=true**,
>         **hessian_available=false**
>     )
>
> where I've highlighted the changes with ** (bold doesn't seem to be available in code blocks)

Yes, you will be able to specify more keyword arguments.
By default, they will be true, so if you don't specify them, the full API is assumed to be supported (the current assumption).
We can then exploit this information in the solvers (like Ipopt, MadNLP, KNITRO, Uno).
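The default-to-true behavior can be sketched as follows (ToyMeta and its keywords are hypothetical stand-ins, not the actual NLPModelMeta signature):

```julia
# Hypothetical stand-in for the keyword-default behavior described above:
# any availability flag you do not pass defaults to `true`.
struct ToyMeta
    nvar::Int
    gradient_available::Bool
    hessian_available::Bool
end

ToyMeta(nvar::Int; gradient_available::Bool = true, hessian_available::Bool = true) =
    ToyMeta(nvar, gradient_available, hessian_available)

m_full    = ToyMeta(10)                             # full API assumed (current behavior)
m_no_hess = ToyMeta(10; hessian_available = false)  # a solver could switch to quasi-Newton
```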
