Replies: 1 comment
The builtin cost functions like ExtendedUnbinnedNLL currently do not support models with gradients. This should be added as a feature. If you want to use gradients at the moment, you need to write the negative log-likelihood function yourself and compute the gradient of that. What you were trying to do was pass the gradient of the model (the PDF) to Minuit, but you need to pass the gradient of the ExtendedUnbinnedNLL to iminuit. In the future, the builtin cost functions should allow passing a gradient of the model and then pass the gradient of the cost function on to iminuit automatically, but this is not implemented yet. Supporting gradients does not have high priority, since studies showed that a precomputed gradient does not provide a large benefit. Internally, Minuit's Migrad algorithm is also not well prepared to handle user-provided gradients.
Hello to the iminuit community and sages!
I am studying how to supply iminuit with a user-specified gradient function. The automatic differentiation example gave pointers on what is possible, but I would like to provide the minimizer with a gradient function directly, and I have trouble understanding what form iminuit needs it in, as my snippet runs into errors. Here is a bit of example code to reproduce the situation:
With this, an error appeared. I thought that perhaps the gradient would need to be in the form gradient(x, *params) -> gradient(*params). Trying this option produced the following error:
There must be something I am missing here that I can't quite pin down. How would one go about supplying the gradient to iminuit?