updated docs for callbacks
lmiq committed Jul 22, 2023
1 parent fe9be0b commit a857c9d
Showing 3 changed files with 70 additions and 16 deletions.
3 changes: 3 additions & 0 deletions docs/Project.toml
[deps]
Documenter = "e30172f5-a6a5-5a46-863b-614d45cd2de4"
SPGBox = "bf97046b-3e66-4aa0-9aed-26efb7fac769"
11 changes: 7 additions & 4 deletions docs/src/options.md
Original file line number Diff line number Diff line change
```julia-repl
julia> function fg!(g,x)
       end
fg! (generic function with 1 method)
julia> x = [10.0, 11.0, 12.0];
julia> spgbox(fg!,x)
SPGBOX RESULT:
Convergence achieved. (Return from callback: false).
Final objective function value = 0.0
Sample of best point = Vector{Float64}[ 1.0, 2.0, 3.0]
Projected gradient norm = 0.0
Number of iterations = 3
Number of function evaluations = 3
```

## Convergence criteria
These keywords are provided to `spgbox!` as keyword arguments, for example:

```julia-repl
julia> R = spgbox!(f,g!,x,nitmax=1000)
```

where `nitmax`, in this case, is the maximum number of iterations.
The available keywords are:
| `nfevalmax` | `Integer` | Maximum number of function evaluations allowed. | `1000` |
| `eps` | `eltype(x)` | Convergence criteria for the projected gradient norm. | `1e-5` |
| `m` | `Integer` | Number of non-monotone search steps. | `10` |
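
These keywords can be combined in a single call. A minimal sketch (the objective and gradient below are illustrative, not taken from the examples above):

```julia
using SPGBox

# illustrative quadratic objective and its in-place gradient
f(x) = x[1]^2 + x[2]^2
function g!(g, x)
    g[1] = 2 * x[1]
    g[2] = 2 * x[2]
end

x = [10.0, 10.0]

# tighter gradient tolerance, explicit iteration and evaluation caps
R = spgbox!(f, g!, x; eps = 1e-8, nitmax = 100, nfevalmax = 500)
```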

## Memory preallocation

Additional keywords available:
|:-------------:|:-------------:|:-------------:|:--------------:|
| `iprint` | `Integer` | Printing details (0, 1, or 2) | `0` |
| `project_x0` | `Bool` | Projects, or not, the initial point on the bounds. | `true` |
| `callback` | `Function` | Callback function, called with the current `SPGBoxResult`. | `(::SPGBoxResult) -> false` |
| `lower` | `AbstractVecOrMat` | Array of lower bounds | `nothing` |
| `upper` | `AbstractVecOrMat` | Array of upper bounds | `nothing` |
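
For instance, a call combining bounds with the default projection of the initial point (a sketch; the objective and gradient are illustrative):

```julia
using SPGBox

# illustrative objective with minimum at (0, 1), and its gradient
f(x) = x[1]^2 + (x[2] - 1)^2
function g!(g, x)
    g[1] = 2 * x[1]
    g[2] = 2 * (x[2] - 1)
end

# start outside the box on purpose: with project_x0 = true (the default)
# the initial point is first projected onto the bounds
x = [10.0, 10.0]
R = spgbox!(f, g!, x; lower = [0.0, 0.0], upper = [5.0, 5.0], project_x0 = true)
```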



72 changes: 60 additions & 12 deletions docs/src/usage.md
The solver function is `spgbox!`, which mutates the input value of `x` (with the best solution found).

The solver calls have a minimal calling syntax of
```julia-repl
julia> using SPGBox
julia> x = rand(2);
julia> R = spgbox(f,g!,x)
```

The results will be returned in the data structure `R`, of type
`SPGBoxResult`, and will be output as:

```julia-repl
julia> R = spgbox(f,g!,x,lower=[-Inf,5])
SPGBOX RESULT:
Convergence achieved. (Return from callback: false).
Final objective function value = 9.0
Sample of best point = Vector{Float64}[ 0.0, 5.0]
Projected gradient norm = 0.0
Number of iterations = 3
Number of function evaluations = 3
```

## Calling the solver, with box bounds
Here, with the same objective and gradient functions defined in the example above, a lower
bound will be set for the second variable:

```julia-repl
julia> using SPGBox
julia> x = rand(2);
julia> R = spgbox(f,g!,x,lower=[-Inf,5])
SPGBOX RESULT:
Convergence achieved. (Return from callback: false).
Final objective function value = 9.0
Sample of best point = Vector{Float64}[ 0.0, 5.0]
Projected gradient norm = 0.0
Number of iterations = 3
Number of function evaluations = 3
```

Upper bounds can be similarly set with `upper=[+Inf,-5]`, for example.
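
For completeness, a sketch of such a call (the quadratic below is assumed to be the one used in the examples above, since its definition is not repeated here):

```julia
using SPGBox

# assumed objective of the examples above, with minimum at (0, 2)
f(x) = x[1]^2 + (x[2] - 2)^2
function g!(g, x)
    g[1] = 2 * x[1]
    g[2] = 2 * (x[2] - 2)
end

x = rand(2)
# the second variable is now constrained to x[2] <= -5
R = spgbox(f, g!, x, upper = [+Inf, -5.0])
```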
```julia
struct SPGBoxResult
nit :: Int64
nfeval :: Int64
ierr :: Int64
return_from_callback :: Bool
end
```

The data structure contains:
| `nit` | Number of iterations performed. |
| `nfeval` | Number of function evaluations. |
| `ierr` | Exit status. |
| `return_from_callback` | Boolean indicating whether the return was triggered by the callback function. |

The possible outcomes of `ierr` are:

| `ierr=0` | Success: convergence achieved. |
| `ierr=1` | Maximum number of iterations achieved. |
| `ierr=2` | Maximum number of function evaluations achieved. |

The convergence criteria can be adjusted using optional keywords, as
described in the [Options](@ref Options) section.
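
Since `ierr` encodes the exit status listed above, it can be checked programmatically after a run. A sketch (the objective and gradient here are illustrative):

```julia
using SPGBox

f(x) = x[1]^2 + x[2]^2
function g!(g, x)
    g[1] = 2 * x[1]
    g[2] = 2 * x[2]
end

x = [1.0, 1.0]
R = spgbox!(f, g!, x)

# dispatch on the exit status
if R.ierr == 0
    println("converged in $(R.nit) iterations")
elseif R.ierr == 1
    println("stopped at the iteration limit")
elseif R.ierr == 2
    println("stopped at the function-evaluation limit")
end
```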

## Optional callback function

An optional `callback` function can be passed to the `spgbox` and `spgbox!` functions.
The `callback` function must:

1. Receive as argument a `SPGBoxResult` data structure.
2. Return `true` if the algorithm must return immediately, or `false` if it should continue.

For example, here we stop the optimization when the sum of the two variables becomes smaller
than `5.0`:

```julia-repl
julia> using SPGBox
julia> f(x) = x[1]^4 + (x[2] - 1)^4
f (generic function with 1 method)
julia> function g!(g, x)
g[1] = 4 * x[1]^3
g[2] = 4 * (x[2] - 1)^3
end
g! (generic function with 1 method)
julia> x = [10.0, 18.0];
julia> my_callback(R::SPGBoxResult) = R.x[1] + R.x[2] < 5.0
my_callback (generic function with 1 method)
julia> R = spgbox!(f, g!, x; callback = my_callback)
SPGBOX RESULT:
Convergence achieved. (Return from callback: true).
Final objective function value = 11.341529752085066
Sample of best point = Vector{Float64}[ 1.5429284794371168, 2.543387528602126]
Projected gradient norm = 14.705674573954493
Number of iterations = 10
Number of function evaluations = 10
```
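
A callback does not have to stop the run: returning `false` unconditionally turns it into a progress monitor. A sketch, assuming the current objective value is stored in the `f` field of `SPGBoxResult`:

```julia
using SPGBox

f(x) = x[1]^4 + (x[2] - 1)^4
function g!(g, x)
    g[1] = 4 * x[1]^3
    g[2] = 4 * (x[2] - 1)^3
end

# print progress at each call, and always continue
monitor(R::SPGBoxResult) = begin
    println("nit = $(R.nit), f = $(R.f)")
    return false
end

x = [10.0, 18.0]
R = spgbox!(f, g!, x; callback = monitor)
```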

## Input data types

Expand Down Expand Up @@ -185,6 +227,12 @@ julia> spgbox(f,g!,x)
```

## Data-dependent function evaluation

If the function requires additional parameters, two strategies are
possible while preserving performance: 1) Declare the parameters as constants
and define an extra method, or 2) Pass the function as an anonymous closure.
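
The second strategy can be sketched as follows (the data parameter `a` and both functions are illustrative):

```julia
using SPGBox

# objective and gradient that depend on extra data `a`
f(x, a) = (x[1] - a)^2 + x[2]^2
function g!(g, x, a)
    g[1] = 2 * (x[1] - a)
    g[2] = 2 * x[2]
end

a = 5.0        # problem data
x = rand(2)

# anonymous closures capture `a`, so the solver sees the
# one-argument objective and two-argument gradient it expects
R = spgbox!(x -> f(x, a), (g, x) -> g!(g, x, a), x)
```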

### Constant parameters and new function and gradient methods

The solver requires a function with a single argument, `x`, and a gradient
