Issue with `gulf` when `use_nls = true` #359
I don't get an error for PureJuMP:

```julia
julia> nlp = OptimizationProblems.PureJuMP.gulf()
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: JuMP.NonlinearExpr
├ num_variables: 3
├ num_constraints: 0
└ Names registered in the model
  └ :x

julia> nlp = OptimizationProblems.PureJuMP.gulf(T = Float32, n = 10, use_nls = true)
A JuMP Model
├ solver: none
├ objective_sense: MIN_SENSE
│ └ objective_function_type: JuMP.NonlinearExpr
├ num_variables: 3
├ num_constraints: 0
└ Names registered in the model
  └ :x
```
So it fails for all of:

```julia
:gradient_backend => ForwardDiffADGradient,
:gradient_backend => ReverseDiffADGradient,
:gradient_backend => GenericForwardDiffADGradient,
:gradient_backend => EnzymeReverseADGradient,
:gradient_backend => ZygoteADGradient,
```
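For reference, the failure can presumably be reproduced directly, without going through the test matrix. This is a hedged sketch: it assumes `ADNLPProblems.gulf` accepts the same keywords as the `PureJuMP` version shown above, which is not confirmed in this thread.

```julia
# Hypothetical reproduction sketch -- assumes OptimizationProblems.ADNLPProblems.gulf
# takes the same keyword arguments as OptimizationProblems.PureJuMP.gulf.
using OptimizationProblems, ADNLPModels, NLPModels

# With use_nls = true, gulf is built as a nonlinear least-squares model.
nls = OptimizationProblems.ADNLPProblems.gulf(; use_nls = true)

# Evaluating the gradient at the starting point is where the AD backends
# listed above would error out if the residual definition is broken.
g = grad(nls, nls.meta.x0)
```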
I think the issue is here:

```julia
function F!(r, x; m = m)
  for i = 1:n
    r[i] =
      exp(-abs((25 + (-50 * log(i * one(T) / 100))^(2 // 3)) * m * i * x[2])^x[3] / x[1]) -
      i // 100
  end
  return r
end
```

I think we should use …
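For comparison, here is a sketch of the classical Gulf research-and-development residual (Moré, Garbow & Hillstrom 1981, problem 11), which this test problem is usually based on. Whether this is the formulation `gulf` is meant to implement is my assumption; note in particular that the classical form subtracts `x[2]` rather than multiplying by `m * i * x[2]`, and loops over the `m` residuals.

```julia
# Sketch of the classical Gulf residual (MGH problem 11).
# Assumption: this is the intended formulation; m is the number of residuals,
# x has length 3, and 3 <= m <= 100 in the original statement.
function gulf_residual!(r, x; m = length(r))
  for i = 1:m
    t = i / 100
    y = 25 + (-50 * log(t))^(2 / 3)
    r[i] = exp(-abs(y - x[2])^x[3] / x[1]) - t
  end
  return r
end

# At the known solution x* = (50, 25, 1.5) every residual vanishes
# analytically, which gives a quick sanity check:
r = gulf_residual!(zeros(10), [50.0, 25.0, 1.5])
all(abs.(r) .< 1e-8)
```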
Yes, please open a PR.
Hi,

I ran the following code and encountered an error when passing the `T` (type), `n` (size), and `use_nls` parameters to the `gulf()` function. Here's the code and the resulting output: