
Commit

Merge branch 'py313' of https://github.com/optimagic-dev/optimagic into py313
timmens committed Nov 12, 2024
2 parents 5001533 + bac234c commit 6dc6b34
Showing 5 changed files with 5 additions and 17 deletions.
14 changes: 1 addition & 13 deletions docs/source/algorithms.md
@@ -392,7 +392,7 @@ install optimagic.
.. warning::
In our benchmark using a quadratic objective function, the trust_constr
algorithm did not find the optimum very precisely (less than 4 decimal places).
-If you require high precision, you should refine an optimum found with Powell
+If you require high precision, you should refine an optimum found with trust_constr
with another local optimizer.
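The warning above suggests a two-stage approach: run trust_constr first, then refine from its result with a second local optimizer. A minimal sketch of that pattern, calling scipy directly rather than through optimagic (the objective and starting point are invented for illustration):

```python
import numpy as np
from scipy.optimize import minimize

def sphere(x):
    """Simple quadratic objective, invented for this example."""
    return float(np.sum(x**2))

x0 = np.array([1.5, -0.8, 2.3])

# First pass: trust-constr may stop short of full precision.
rough = minimize(sphere, x0, method="trust-constr")

# Second pass: refine from the rough solution with another local optimizer.
refined = minimize(sphere, rough.x, method="L-BFGS-B")
```

The refinement step starts exactly where the first optimizer stopped, so it can only keep or improve the objective value.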
.. note::
@@ -907,12 +907,6 @@ We implement a few algorithms from scratch. They are currently considered experimental
and therefore may require fewer iterations to arrive at a local optimum than
Nelder-Mead.
-The criterion function :func:`func` should return a dictionary with the following
-fields:
-1. ``"value"``: The sum of squared (potentially weighted) errors.
-2. ``"root_contributions"``: An array containing the root (weighted) contributions.
Scaling the problem is necessary such that bounds correspond to the unit hypercube
:math:`[0, 1]^n`. For unconstrained problems, scale each parameter such that unit
changes in parameters result in similar order-of-magnitude changes in the criterion
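The criterion interface described above (a dict with a ``"value"`` and a ``"root_contributions"`` field) can be illustrated with a hypothetical least-squares criterion; the linear model and data here are invented for this sketch:

```python
import numpy as np

def least_squares_criterion(params, data_x, data_y):
    """Hypothetical criterion in the dict format described above:
    residuals of a linear fit y ~ a * x + b."""
    residuals = data_y - (params[0] * data_x + params[1])
    return {
        "value": float(residuals @ residuals),  # sum of squared errors
        "root_contributions": residuals,        # signed (root) contributions
    }

x = np.array([0.0, 1.0, 2.0])
y = np.array([1.0, 3.0, 5.0])

# At the true parameters (a=2, b=1) all residuals vanish.
out = least_squares_criterion(np.array([2.0, 1.0]), x, y)
```

Note that ``"value"`` is exactly the sum of squares of ``"root_contributions"``, which is what lets least-squares optimizers exploit the structure.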
@@ -1015,12 +1009,6 @@ need to have [petsc4py](https://pypi.org/project/petsc4py/) installed.
and therefore may require fewer iterations to arrive at a local optimum than
Nelder-Mead.
-The criterion function :func:`func` should return a dictionary with the following
-fields:
-1. ``"value"``: The sum of squared (potentially weighted) errors.
-2. ``"root_contributions"``: An array containing the root (weighted) contributions.
Scaling the problem is necessary such that bounds correspond to the unit hypercube
:math:`[0, 1]^n`. For unconstrained problems, scale each parameter such that unit
changes in parameters result in similar order-of-magnitude changes in the criterion
2 changes: 1 addition & 1 deletion docs/source/how_to/how_to_algorithm_selection.ipynb
@@ -52,7 +52,7 @@
" E[\"Can you exploit<br/>a least-squares<br/>structure?\"] -- yes --> F[\"differentiable?\"]\n",
" E[\"Can you exploit<br/>a least-squares<br/>structure?\"] -- no --> G[\"differentiable?\"]\n",
"\n",
-    " F[\"differentiable?\"] -- yes --> H[\"scipy_ls_lm<br/>scipy_ls_trf<br/>scipy_ls_dogleg\"]\n",
+    " F[\"differentiable?\"] -- yes --> H[\"scipy_ls_lm<br/>scipy_ls_trf<br/>scipy_ls_dogbox\"]\n",
" F[\"differentiable?\"] -- no --> I[\"nag_dflos<br/>pounders<br/>tao_pounders\"]\n",
"\n",
" G[\"differentiable?\"] -- yes --> J[\"scipy_lbfgsb<br/>nlopt_lbfgsb<br/>fides\"]\n",
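The least-squares branch of the decision tree above leads to scipy's least-squares solvers (the corrected leaf names `trf`, `dogbox`, and `lm` match scipy's `method` options, which optimagic's `scipy_ls_*` algorithms wrap). A minimal sketch calling `scipy.optimize.least_squares` directly; the model and data are invented for illustration:

```python
import numpy as np
from scipy.optimize import least_squares

def residuals(params, x, y):
    """Residuals of a linear fit y ~ a * x + b (invented example model)."""
    a, b = params
    return y - (a * x + b)

x = np.linspace(0.0, 1.0, 20)
y = 3.0 * x + 0.5  # noiseless data with true params a=3.0, b=0.5

# "trf" is the trust-region reflective method from the tree's leaf node.
res = least_squares(residuals, x0=[1.0, 0.0], args=(x, y), method="trf")
```

Passing residuals instead of a scalar objective is what "exploiting the least-squares structure" means in the tree.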
2 changes: 1 addition & 1 deletion src/optimagic/optimizers/_pounders/gqtpar.py
@@ -55,7 +55,7 @@ def gqtpar(model, x_candidate, *, k_easy=0.1, k_hard=0.2, maxiter=200):
- ``linear_terms``, a np.ndarray of shape (n,) and
- ``square_terms``, a np.ndarray of shape (n,n).
x_candidate (np.ndarray): Initial guess for the solution of the subproblem.
-        k_easy (float): topping criterion for the "easy" case.
+        k_easy (float): Stopping criterion for the "easy" case.
k_hard (float): Stopping criterion for the "hard" case.
maxiter (int): Maximum number of iterations to perform. If reached,
terminate.
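The model interface documented above (``linear_terms`` g of shape (n,) and ``square_terms`` H of shape (n, n)) defines the quadratic m(x) = gᵀx + ½ xᵀHx that gqtpar minimizes inside the trust region. A small numpy sketch of that model; the coefficient values are invented:

```python
import numpy as np

def quadratic_model(x, linear_terms, square_terms):
    """Evaluate m(x) = g.T @ x + 0.5 * x.T @ H @ x, the trust-region
    subproblem model described by the docstring above."""
    return linear_terms @ x + 0.5 * x @ square_terms @ x

g = np.array([1.0, -2.0])                    # linear_terms, shape (n,)
H = np.array([[2.0, 0.0], [0.0, 4.0]])       # square_terms, shape (n, n)

# For a positive definite H, the unconstrained minimizer of the model
# solves H @ x = -g; gqtpar additionally respects the trust-region radius.
x_star = np.linalg.solve(H, -g)
```

When this unconstrained minimizer lies inside the trust region, it is already the subproblem solution ("easy" case); otherwise the boundary solution must be found, which is where `k_easy` and `k_hard` come in.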
2 changes: 1 addition & 1 deletion src/optimagic/optimizers/_pounders/pounders_auxiliary.py
@@ -240,7 +240,7 @@ def solve_subproblem(
gtol_rel_conjugate_gradient (float): Convergence tolerance for the relative
gradient norm in the conjugate gradient step of the trust-region
subproblem ("bntr").
-        k_easy (float): topping criterion for the "easy" case in the trust-region
+        k_easy (float): Stopping criterion for the "easy" case in the trust-region
subproblem ("gqtpar").
k_hard (float): Stopping criterion for the "hard" case in the trust-region
subproblem ("gqtpar").
2 changes: 1 addition & 1 deletion src/optimagic/optimizers/pounders.py
@@ -262,7 +262,7 @@ def internal_solve_pounders(
gtol_rel_conjugate_gradient_sub (float): Convergence tolerance for the
relative gradient norm in the conjugate gradient step of the trust-region
subproblem if "cg" is used as ``conjugate_gradient_method_sub`` ("bntr").
-        k_easy_sub (float): topping criterion for the "easy" case in the trust-region
+        k_easy_sub (float): Stopping criterion for the "easy" case in the trust-region
subproblem ("gqtpar").
k_hard_sub (float): Stopping criterion for the "hard" case in the trust-region
subproblem ("gqtpar").
