
Commit

Polishing.
janosg committed Nov 4, 2024
1 parent f5dfe44 commit de76d5c
Showing 1 changed file with 21 additions and 27 deletions.
48 changes: 21 additions & 27 deletions docs/source/how_to/how_to_algorithm_selection.ipynb
@@ -19,21 +19,22 @@
"optimizers fail silently!\n",
"\n",
"\n",
"## The four steps for selecting algorithms\n",
"## The three steps for selecting algorithms\n",
"\n",
"Algorithm selection is a mix of theory and experimentation. We recommend the following \n",
"four steps:\n",
"\n",
"1. **Theory**: Based on the properties of your problem, start with 3 to 5 candidate algorithms. \n",
"You may use the [decision tree below](link)\n",
"2. **Experiments**: Run the candidate algorithms for a small number of function \n",
"evaluations. As a rule of thumb, use between `n_params` and `10 * n_params`\n",
"evaluations. \n",
"3. **Comparison**: Compare the results in a *criterion plot*.\n",
"4. **Optimization**: Re-run the algorithm with the best results until \n",
"evaluations and compare the results in a *criterion plot*. As a rule of thumb, use \n",
"between `n_params` and `10 * n_params` evaluations. \n",
"3. **Optimization**: Re-run the algorithm with the best results until \n",
"convergence. Use the best parameter vector from the experiments as start parameters.\n",
"\n",
"These steps work well for most problems. Sometimes you need [variations](four-steps-variations).\n",
"We will walk you through the steps in an [example](algo-selection-example-problem)\n",
"below. These steps work well for most problems but sometimes you need \n",
"[variations](algo-selection-steps-variations).\n",
"\n",
"\n",
"## A decision tree \n",
@@ -74,6 +75,8 @@
"found out through experimentation.\n",
"```\n",
"\n",
"(algo-selection-example-problem)=\n",
"\n",
"## An example problem\n",
"\n",
"As an example we use the [Trid function](https://www.sfu.ca/~ssurjano/trid.html). The Trid function has no local minimum except \n",
@@ -131,12 +134,19 @@
"\n",
"1. **No** nonlinear constraints our solution needs to satisfy\n",
"2. **No** no least-squares structure we can exploit \n",
"3. **Yes**, the function is differentiable and we have a closed form gradient that we would like \n",
"to use. \n",
"3. **Yes**, the function is differentiable. We even have a closed form gradient that \n",
"we would like to use. \n",
"\n",
"We therefore end up with the candidate algorithms `scipy_lbfgsb`, `nlopt_lbfgsb`, and \n",
"`fides`.\n",
"\n",
"```{note}\n",
"If your function is differentiable but you do not have a closed form gradient (yet), \n",
"we suggest to use at least one gradient based optimizer and one gradient free optimizer.\n",
"in your experiments. Optimagic will use numerical gradients in that case. For details, \n",
"see [here](how_to_derivatives.ipynb).\n",
"```\n",
"\n",
"\n",
"### Step 2: Experiments\n",
"\n",
@@ -160,24 +170,8 @@
" params=np.arange(20),\n",
" algorithm=algo,\n",
" algo_options={\"stopping_maxfun\": 8, \"stopping_maxiter\": 8},\n",
" )"
]
},
{
"cell_type": "markdown",
"metadata": {},
"source": [
"### Step 3: Comparison\n",
" )\n",
"\n",
"Next we plot the optimizer histories to find out which optimizer worked best:"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": [
"fig = om.criterion_plot(results, max_evaluations=8)\n",
"fig.show(renderer=\"png\")"
]
Expand All @@ -191,7 +185,7 @@
"better than the others, so we will select it for the next step. In more difficult\n",
"examples, the difference between optimizers can be much more pronounced.\n",
"\n",
"### Step 4: Optimization \n",
"### Step 3: Optimization \n",
"\n",
"All that is left to do is to run the optimization until convergence with the best \n",
"optimizer. To avoid duplicated calculations, we can already start from the previously \n",
@@ -235,7 +229,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"(four-steps-variations)=\n",
"(algo-selection-steps-variations)=\n",
"\n",
"## Variations of the four steps\n",
"\n",
