Issue with 'gulf' when use_nls= true #359

Open

farhadrclass opened this issue Dec 21, 2024 · 5 comments

@farhadrclass (Contributor)

Hi,

I ran the following code and hit an error when passing the T (element type), n (size), and use_nls keyword arguments to gulf(). Here are the commands and the resulting output:

julia> nlp = OptimizationProblems.ADNLPProblems.gulf()
ADNLPModel - Model with automatic differentiation backend ADModelBackend{
  ForwardDiffADGradient,
  ForwardDiffADHvprod,
  EmptyADbackend,
  EmptyADbackend,
  EmptyADbackend,
  SparseADHessian,
  EmptyADbackend,
}
  Problem name: gulf
   All variables: ████████████████████ 3      All constraints: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            free: ████████████████████ 3                 free: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                lower: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                upper: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              low/upp: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                fixed: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
          infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               infeas: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
            nnzh: (  0.00% sparsity)   6               linear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                    nonlinear: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
                                                         nnzj: (------% sparsity)

  Counters:
             obj: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 grad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 cons: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
        cons_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0             cons_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 jcon: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jgrad: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                  jac: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0              jac_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
         jac_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                jprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0            jprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
       jprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jtprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0           jtprod_lin: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
      jtprod_nln: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                 hess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0                hprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0
           jhess: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0               jhprod: ⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅⋅ 0


julia> nlp = OptimizationProblems.ADNLPProblems.gulf(T=T,n=10, use_nls=true)
ERROR: UndefRefError: access to undefined reference
Stacktrace:
  [1] getindex
    @ .\essentials.jl:13 [inlined]
  [2] macro expansion
    @ .\reduce.jl:264 [inlined]
  [3] macro expansion
    @ .\simdloop.jl:77 [inlined]
  [4] mapreduce_impl(f::ADNLPModels.var"#8#18", op::typeof(+), A::Vector{…}, ifirst::Int64, ilast::Int64, blksize::Int64)
    @ Base .\reduce.jl:263
  [5] mapreduce_impl
    @ .\reduce.jl:277 [inlined]
  [6] _mapreduce(f::ADNLPModels.var"#8#18", op::typeof(+), ::IndexLinear, A::Vector{…})
    @ Base .\reduce.jl:447
  [7] _mapreduce_dim(f::Function, op::Function, ::Base._InitialValue, A::Vector{…}, ::Colon)
    @ Base .\reducedim.jl:365
  [8] mapreduce(f::Function, op::Function, A::Vector{SparseConnectivityTracer.HessianTracer{…}})
    @ Base .\reducedim.jl:357
  [9] (::ADNLPModels.var"#7#17")(x::Vector{SparseConnectivityTracer.HessianTracer{…}})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\ad.jl:236
 [10] (::ADNLPModels.var"#lagrangian#55"{})(x::Vector{…})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\sparsity_pattern.jl:44
 [11] trace_function(::Type{…}, f::ADNLPModels.var"#lagrangian#55"{}, x::Vector{…})
    @ SparseConnectivityTracer C:\Users\Farhad\.julia\packages\SparseConnectivityTracer\22dWp\src\trace_functions.jl:48
 [12] _hessian_sparsity(f::Function, x::Vector{Float64}, ::Type{SparseConnectivityTracer.HessianTracer{…}})
    @ SparseConnectivityTracer C:\Users\Farhad\.julia\packages\SparseConnectivityTracer\22dWp\src\trace_functions.jl:106
 [13] hessian_sparsity
    @ C:\Users\Farhad\.julia\packages\SparseConnectivityTracer\22dWp\src\adtypes_interface.jl:64 [inlined]
 [14] compute_hessian_sparsity(f::ADNLPModels.var"#7#17", nvar::Int64, c!::ADNLPModels.var"#9#19", ncon::Int64; detector::SparseConnectivityTracer.TracerSparsityDetector{…})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\sparsity_pattern.jl:53
 [15] ADNLPModels.SparseADHessian(nvar::Int64, f::Function, ncon::Int64, c!::ADNLPModels.var"#9#19"; x0::Vector{…}, coloring_algorithm::SparseMatrixColorings.GreedyColoringAlgorithm{…}, detector::SparseConnectivityTracer.TracerSparsityDetector{…}, kwargs::@Kwargs{})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\sparse_hessian.jl:29
 [16] macro expansion
    @ C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\ad.jl:266 [inlined]
 [17] macro expansion
    @ .\timing.jl:395 [inlined]
 [18] ADModelNLSBackend(nvar::Int64, F!::OptimizationProblems.ADNLPProblems.var"#F!#784"{}, nequ::Int64; backend::Symbol, matrix_free::Bool, show_time::Bool, gradient_backend::Type, hprod_backend::Type, hessian_backend::Type, hprod_residual_backend::Type, jprod_residual_backend::Type, jtprod_residual_backend::Type, jacobian_residual_backend::Type, hessian_residual_backend::Type, kwargs::@Kwargs{})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\ad.jl:262
 [19] ADNLSModel!(F!::Function, x0::Vector{…}, nequ::Int64; linequ::Vector{…}, name::String, minimize::Bool, kwargs::@Kwargs{})
    @ ADNLPModels C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\nls.jl:163
 [20] ADNLSModel!
    @ C:\Users\Farhad\.julia\packages\ADNLPModels\hHKR7\src\nls.jl:151 [inlined]
 [21] #gulf#782
    @ C:\Users\Farhad\.julia\packages\OptimizationProblems\G0vFO\src\ADNLPProblems\gulf.jl:46 [inlined]
 [22] gulf
    @ C:\Users\Farhad\.julia\packages\OptimizationProblems\G0vFO\src\ADNLPProblems\gulf.jl:28 [inlined]
 [23] gulf(; use_nls::Bool, kwargs::@Kwargs{T::DataType, n::Int64})
    @ OptimizationProblems.ADNLPProblems C:\Users\Farhad\.julia\packages\OptimizationProblems\G0vFO\src\ADNLPProblems\gulf.jl:5
 [24] top-level scope
    @ REPL[38]:1
Some type information was truncated. Use `show(err)` to see complete types.
@farhadrclass (Contributor, Author)

I don't get an error with PureJuMP:

julia> nlp = OptimizationProblems.PureJuMP.gulf()
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: JuMP.NonlinearExpr
├ num_variables: 3
├ num_constraints: 0
└ Names registered in the model
  └ :x

julia> nlp = OptimizationProblems.PureJuMP.gulf(T=Float32, n=10, use_nls=true)
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: JuMP.NonlinearExpr
├ num_variables: 3
├ num_constraints: 0
└ Names registered in the model
  └ :x

@farhadrclass (Contributor, Author)

So it fails for all of the gradient backends (a reproduction sketch follows the list):

  :gradient_backend => ForwardDiffADGradient,
  :gradient_backend => ReverseDiffADGradient,
  :gradient_backend => GenericForwardDiffADGradient,
  :gradient_backend => EnzymeReverseADGradient,
  :gradient_backend => ZygoteADGradient,
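
A minimal way to check this, assuming keyword arguments such as gradient_backend are forwarded through gulf() to the ADNLPModels backend constructor (the stack trace above suggests they are):

using ADNLPModels, OptimizationProblems

# Sketch: try each gradient backend in turn and report which ones fail.
for backend in (
  ADNLPModels.ForwardDiffADGradient,
  ADNLPModels.ReverseDiffADGradient,
  ADNLPModels.GenericForwardDiffADGradient,
)
  try
    OptimizationProblems.ADNLPProblems.gulf(T = Float64, n = 10, use_nls = true, gradient_backend = backend)
    @info "gulf built successfully" backend
  catch e
    @warn "gulf failed" backend exception = e
  end
end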

@farhadrclass (Contributor, Author) commented Dec 21, 2024

I think the issue is here:

function F!(r, x; m = m)
  for i = 1:n
    r[i] =
      exp(-abs((25 + (-50 * log(i * one(T) / 100))^(2 // 3)) * m * i * x[2])^x[3] / x[1]) -
      i // 100
  end
  return r
end

I think we should use for i = 1:m instead: only the first n entries of r are ever written, so reducing over the m-element residual vector hits undefined entries, which matches the UndefRefError above.
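
For concreteness, a sketch of the proposed fix (untested; identical to the snippet above except for the loop bound):

function F!(r, x; m = m)
  for i = 1:m  # was 1:n; fill every one of the m residual entries
    r[i] =
      exp(-abs((25 + (-50 * log(i * one(T) / 100))^(2 // 3)) * m * i * x[2])^x[3] / x[1]) -
      i // 100
  end
  return r
end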

@dpo (Member) commented Dec 22, 2024

Yes, please open a PR.
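
A hypothetical regression test the PR could include (sketch; assumes gulf(use_nls = true) should return an ADNLSModel, as the constructor chain in the stack trace indicates):

using Test, ADNLPModels, OptimizationProblems

@testset "gulf with use_nls = true" begin
  nls = OptimizationProblems.ADNLPProblems.gulf(T = Float64, n = 10, use_nls = true)
  @test nls isa ADNLPModels.ADNLSModel
end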

@dpo (Member) commented Dec 23, 2024

@farhadrclass
