
Bump dependencies. #309

Merged
merged 11 commits into from
Oct 25, 2023
48 changes: 48 additions & 0 deletions Changelog.md
@@ -5,6 +5,54 @@ All notable Changes to the Julia package `Manopt.jl` will be documented in this
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.0.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).

## [0.4.40] – 24/10/2023

### Changes

* Bump dependencies to `ManifoldsBase.jl` 0.15 and `Manifolds.jl` 0.9
* Move the ARC CG subsolver to the main package, since `TangentSpace` is now
  available from `ManifoldsBase`.
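As a brief, hedged sketch of what this move enables (assuming `Manifolds.jl` 0.9 / `ManifoldsBase.jl` 0.15 are installed; the sphere and point are illustrative):

``` julia
using Manifolds  # Manifolds.jl ≥ 0.9 re-exports TangentSpace from ManifoldsBase.jl

M = Sphere(2)             # unit sphere in ℝ³
p = [1.0, 0.0, 0.0]       # a point on M
TpM = TangentSpace(M, p)  # the tangent space at p, itself a manifold

# The subsolver can hence query the base manifold and base point directly,
# without a package extension:
base_manifold(TpM)  # returns M
TpM.point           # returns p
```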

## [0.4.39] – 02/09/2023

### Changes

* also use the pair of a retraction and the inverse retraction (see last update)
to perform the relaxation within the Douglas-Rachford algorithm.
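As a sketch of the generalized step (notation assumed, not taken verbatim from the package docs), the reflection of a point $x$ at $p$ with such a retraction–inverse-retraction pair reads

``` latex
\operatorname{refl}_p(x) = \operatorname{retr}_p\bigl(-\operatorname{retr}_p^{-1}(x)\bigr),
```

which recovers the geodesic reflection when $\operatorname{retr} = \exp$.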

## [0.4.38] – 08/10/2023

### Changes

* Fix a lot of typos in the documentation
* Avoid allocations when calling `get_jacobian!` within the Levenberg-Marquardt algorithm.

## [0.4.37] – 02/09/2023

### Changes

* add more of the Riemannian Levenberg-Marquardt algorithm's parameters as keywords, so they
  can be changed on call
* generalize the internal reflection of Douglas-Rachford, such that it also works with an
  arbitrary pair of a reflection and an inverse reflection.

## [0.4.36] – 20/09/2023


## [0.4.35] – 14/09/2023

### Added

* The access to functions of the objective is now unified and encapsulated in proper `get_`
functions.

## [0.4.34] – 02/09/2023

### Added

* a `ManifoldEuclideanGradientObjective` to allow the cost, gradient, Hessian, and other
  first- or second-derivative based elements to be given in the embedding (Euclidean) and converted when needed.
* a keyword `objective_type=:Euclidean` for all solvers that specifies that an objective of the above type shall be created
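A hedged usage sketch (the solver, cost, and matrix below are illustrative assumptions; only the keyword `objective_type=:Euclidean` is from this changelog entry):

``` julia
using Manopt, Manifolds, LinearAlgebra

M = Sphere(2)
A = Symmetric([2.0 1.0 0.0; 1.0 3.0 1.0; 0.0 1.0 4.0])
f(M, p) = p' * A * p   # cost written in the embedding ℝ³
∇f(M, p) = 2A * p      # Euclidean (embedding) gradient, not a Riemannian one

# With objective_type=:Euclidean the solver wraps the functions so that the
# Euclidean gradient is converted to a Riemannian gradient when needed.
p0 = [1.0, 0.0, 0.0]
q = gradient_descent(M, f, ∇f, p0; objective_type=:Euclidean)
```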

## [0.4.33] – 24/08/2023

### Added
6 changes: 3 additions & 3 deletions Project.toml
@@ -1,7 +1,7 @@
name = "Manopt"
uuid = "0fc0a36d-df90-57f3-8f93-d78a9fc72bb5"
authors = ["Ronny Bergmann <[email protected]>"]
version = "0.4.39"
version = "0.4.40"

[deps]
ColorSchemes = "35d6a980-a343-548e-a6ea-1d62b119f2f4"
@@ -40,8 +40,8 @@ Colors = "0.11.2, 0.12"
DataStructures = "0.17, 0.18"
LRUCache = "1.4"
ManifoldDiff = "0.2, 0.3.3"
Manifolds = "0.8.75"
ManifoldsBase = "0.14.10"
Manifolds = "0.9"
ManifoldsBase = "0.15"
PolynomialRoots = "1"
Requires = "0.5, 1"
julia = "1.6"
4 changes: 2 additions & 2 deletions docs/Project.toml
@@ -23,5 +23,5 @@ Random = "9a3f8284-a2c9-5f02-9a11-845980a1fd5c"
BenchmarkTools = "1.3"
CondaPkg = "0.2"
Documenter = "0.27"
Manifolds = "0.8.75"
ManifoldsBase = "0.13, 0.14"
Manifolds = "0.8.81, 0.9"
ManifoldsBase = "0.14.12, 0.15"
2 changes: 1 addition & 1 deletion docs/src/tutorials/HowToDebug.md
@@ -74,7 +74,7 @@ There are two more advanced variants that can be used. The first is a tuple of a

We can for example change the way the `:ϵ` is printed by adding a format string
and use [`DebugCost`](@ref)`()` which is equivalent to using `:Cost`.
Especially with the format change, the lines are more coniststent in length.
Especially with the format change, the lines are more consistent in length.

``` julia
p2 = exact_penalty_method(
46 changes: 0 additions & 46 deletions ext/ManoptManifoldsExt/ARC_CG.jl

This file was deleted.

1 change: 0 additions & 1 deletion ext/ManoptManifoldsExt/ManoptManifoldsExt.jl
@@ -30,5 +30,4 @@ include("nonmutating_manifolds_functions.jl")
include("artificialDataFunctionsManifolds.jl")
include("ChambollePockManifolds.jl")
include("alternating_gradient.jl")
include("ARC_CG.jl")
end
3 changes: 3 additions & 0 deletions src/Manopt.jl
@@ -69,8 +69,10 @@ using ManifoldsBase:
NestedPowerRepresentation,
ParallelTransport,
PowerManifold,
ProductManifold,
ProjectionTransport,
QRRetraction,
TangentSpace,
^,
_read,
_write,
@@ -123,6 +125,7 @@ using ManifoldsBase:
set_component!,
shortest_geodesic,
shortest_geodesic!,
submanifold_components,
vector_transport_to,
vector_transport_to!,
zero_vector,
46 changes: 46 additions & 0 deletions src/solvers/adaptive_regularization_with_cubics.jl
@@ -839,3 +839,49 @@ function show(io::IO, c::StopWhenAllLanczosVectorsUsed)
"StopWhenAllLanczosVectorsUsed($(repr(c.maxLanczosVectors)))\n $(status_summary(c))",
)
end
#
#
function set_manopt_parameter!(M::TangentSpace, ::Val{:p}, v)
M.point .= v
return M
end
function (f::Manopt.AdaptiveRegularizationCubicCost)(M::TangentSpace, X)
## (33) in Agarwal et al.
return get_cost(base_manifold(M), f.mho, M.point) +
inner(base_manifold(M), M.point, X, f.X) +
1 / 2 * inner(
base_manifold(M),
M.point,
X,
get_hessian(base_manifold(M), f.mho, M.point, X),
) +
f.σ / 3 * norm(base_manifold(M), M.point, X)^3
end
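In formulas, this cost evaluates the cubic-regularized model of $f$ at the base point $p$ (a transcription of the code above, under the assumption that `f.X` stores $\operatorname{grad} f(p)$):

``` latex
m_p(X) = f(p) + \langle \operatorname{grad} f(p), X \rangle_p
       + \frac{1}{2} \langle \operatorname{Hess} f(p)[X], X \rangle_p
       + \frac{\sigma}{3} \lVert X \rVert_p^3
```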
function (grad_f::Manopt.AdaptiveRegularizationCubicGrad)(M::TangentSpace, X)
# (37) in Agarwal et al.
return grad_f.X +
get_hessian(base_manifold(M), grad_f.mho, M.point, X) +
grad_f.σ * norm(base_manifold(M), M.point, X) * X
end
function (grad_f::Manopt.AdaptiveRegularizationCubicGrad)(M::TangentSpace, Y, X)
get_hessian!(base_manifold(M), Y, grad_f.mho, M.point, X)
Y .= Y + grad_f.X + grad_f.σ * norm(base_manifold(M), M.point, X) * X
return Y
end
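Correspondingly, both gradient variants compute (again a transcription of the code above)

``` latex
\operatorname{grad} m_p(X) = \operatorname{grad} f(p)
       + \operatorname{Hess} f(p)[X]
       + \sigma \lVert X \rVert_p \, X
```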
function (c::StopWhenFirstOrderProgress)(
dmp::AbstractManoptProblem{<:TangentSpace}, ams::AbstractManoptSolverState, i::Int
)
if (i == 0)
c.reason = ""
return false
end
# Update gradient
TpM = get_manifold(dmp)
nG = norm(base_manifold(TpM), TpM.point, get_gradient(dmp, ams.p))
nX = norm(base_manifold(TpM), TpM.point, ams.p)
if (i > 0) && (nG <= c.θ * nX^2)
c.reason = "The algorithm has reduced the model grad norm by $(c.θ).\n"
return true
end
return false
end
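Reading off the code, this criterion stops once the model gradient norm at the current tangent-vector iterate $X$ satisfies

``` latex
\lVert \operatorname{grad} m_p(X) \rVert_p \le \theta \, \lVert X \rVert_p^2,
```

the usual first-order-progress condition for ARC subproblem solvers.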
2 changes: 1 addition & 1 deletion test/solvers/test_adaptive_regularization_with_cubics.jl
@@ -16,7 +16,7 @@ include("../utils/example_tasks.jl")
Hess_f(M, p, X) = -A * X + p * p' * A * X + X * p' * A * p

p0 = Matrix{Float64}(I, n, n)[:, 1:k]
M2 = TangentSpaceAtPoint(M, p0)
M2 = TangentSpace(M, p0)

mho = ManifoldHessianObjective(f, grad_f, Hess_f)
g = AdaptiveRegularizationCubicCost(M2, mho)
4 changes: 2 additions & 2 deletions tutorials/Project.toml
@@ -18,7 +18,7 @@ FiniteDifferences = "0.12"
IJulia = "1"
LRUCache = "1.4"
ManifoldDiff = "0.3"
Manifolds = "0.8.75"
ManifoldsBase = "0.14.5"
Manifolds = "0.8.81, 0.9"
ManifoldsBase = "0.14.12, 0.15"
Manopt = "0.4.22"
Plots = "1.38"