Conversation

@klamike
Contributor

@klamike klamike commented Feb 9, 2026

Not sure if this is the right fix; maybe I am doing something wrong? I am just trying to use `nonlinear_diff_model` and then `set_optimizer_attribute`:

julia> using MadNLP, MadNLPGPU, CUDA, DiffOpt, JuMP

julia> m = DiffOpt.nonlinear_diff_model(MadNLP.Optimizer)
A JuMP Model
├ mode: DIRECT
├ solver: MadNLP
├ objective_sense: FEASIBILITY_SENSE
├ num_variables: 0
├ num_constraints: 0
└ Names registered in the model: none

julia> set_optimizer_attribute(m, "array_type", CuArray)
ERROR: MathOptInterface.UnsupportedAttribute{MathOptInterface.RawOptimizerAttribute}: Attribute MathOptInterface.RawOptimizerAttribute("array_type") is not supported by the model.
Stacktrace:
 [1] throw_set_error_fallback(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{…}}, attr::MathOptInterface.RawOptimizerAttribute, value::Type; error_if_supported::MathOptInterface.SetAttributeNotAllowed{MathOptInterface.RawOptimizerAttribute})
   @ MathOptInterface julia_depot/packages/MathOptInterface/Q3V1z/src/attributes.jl:662
 [2] throw_set_error_fallback(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{…}, MathOptInterface.Utilities.UniversalFallback{…}}}, attr::MathOptInterface.RawOptimizerAttribute, value::Type)
   @ MathOptInterface julia_depot/packages/MathOptInterface/Q3V1z/src/attributes.jl:653
 [3] set(model::DiffOpt.Optimizer{MathOptInterface.Utilities.CachingOptimizer{MathOptInterface.Bridges.LazyBridgeOptimizer{…}, MathOptInterface.Utilities.UniversalFallback{…}}}, attr::MathOptInterface.RawOptimizerAttribute, args::Type)
   @ MathOptInterface julia_depot/packages/MathOptInterface/Q3V1z/src/attributes.jl:626
 [4] set(m::Model, attr::MathOptInterface.RawOptimizerAttribute, value::Type)
   @ JuMP julia_depot/packages/JuMP/7eD71/src/optimizer_interface.jl:1264
 [5] set_attribute
   @ julia_depot/packages/JuMP/7eD71/src/optimizer_interface.jl:1517 [inlined]
 [6] set_attribute
   @ julia_depot/packages/JuMP/7eD71/src/optimizer_interface.jl:1526 [inlined]
 [7] set_optimizer_attribute(model::Model, attr::String, value::Type)
   @ JuMP julia_depot/packages/JuMP/7eD71/src/optimizer_interface.jl:77
 [8] top-level scope
   @ REPL[5]:1
Some type information was truncated. Use `show(err)` to see complete types.

julia> function MOI.set(model::DiffOpt.Optimizer, attr::MOI.AbstractOptimizerAttribute, value)
           MOI.set(model.optimizer, attr, value)
       end

julia> set_optimizer_attribute(m, "array_type", CuArray)

julia> @variable m x
x

julia> @constraint m c1 x ≥ 1
c1 : x ≥ 1

julia> @objective m Min x^2
x²

julia> optimize!(m)
The following options are ignored: 
 - array_type
This is MadNLP version v0.8.12, running with cuDSS v0.7.1
...

(Note: the "The following options are ignored" message is caused by a bug in MadNLP master that will be fixed before the next release; as you can see in the output, it is in fact using cuDSS.)
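For reference, the REPL workaround above is essentially the proposed change: a fallback `MOI.set` method that forwards unrecognized optimizer attributes to the wrapped inner optimizer. Here is a self-contained sketch of that forwarding pattern on a stand-in wrapper type (the `Forwarding` struct and its `optimizer` field are illustrative assumptions, not DiffOpt internals; the actual merged code may differ):

```julia
import MathOptInterface as MOI

# Stand-in for a wrapper like DiffOpt.Optimizer: it holds an inner
# optimizer and otherwise knows nothing about solver-specific options.
struct Forwarding{OT<:MOI.AbstractOptimizer} <: MOI.AbstractOptimizer
    optimizer::OT
end

# Forward any optimizer attribute straight to the inner optimizer, so a
# RawOptimizerAttribute such as "array_type" reaches the solver (here,
# MadNLP) instead of hitting MOI's throw_set_error_fallback at the
# wrapper level.
function MOI.set(model::Forwarding, attr::MOI.AbstractOptimizerAttribute, value)
    return MOI.set(model.optimizer, attr, value)
end
```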

@codecov

codecov bot commented Feb 9, 2026

Codecov Report

✅ All modified and coverable lines are covered by tests.
✅ Project coverage is 89.14%. Comparing base (dc86850) to head (71e701a).
⚠️ Report is 1 commits behind head on master.

@@            Coverage Diff             @@
##           master     #335      +/-   ##
==========================================
+ Coverage   89.13%   89.14%   +0.01%     
==========================================
  Files          16       16              
  Lines        1961     1963       +2     
==========================================
+ Hits         1748     1750       +2     
  Misses        213      213              

@andrewrosemberg
Collaborator

andrewrosemberg commented Feb 9, 2026

I'm trying to think whether there is any attribute that could break DiffOpt if it got passed through, but I don't think so. LGTM after formatting; it would be nice to add some tests.
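A test along those lines might look like the following sketch, assuming a CPU solver such as HiGHS so it can run without a GPU (the `"presolve"` attribute and the round-trip `MOI.get` are assumptions about the inner solver's behavior; the forwarding added here covers `MOI.set`):

```julia
using Test, DiffOpt, HiGHS
import MathOptInterface as MOI

@testset "raw optimizer attributes are forwarded" begin
    model = DiffOpt.diff_optimizer(HiGHS.Optimizer)
    # Setting a solver-specific raw attribute should no longer throw
    # UnsupportedAttribute on the DiffOpt wrapper.
    MOI.set(model, MOI.RawOptimizerAttribute("presolve"), "off")
    @test MOI.get(model, MOI.RawOptimizerAttribute("presolve")) == "off"
end
```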

@andrewrosemberg andrewrosemberg merged commit 56efbf0 into jump-dev:master Feb 9, 2026
7 checks passed
