diff --git a/docs/src/manual/differentiability.md b/docs/src/manual/differentiability.md
index cccfcdf80..b9b0f9ca0 100644
--- a/docs/src/manual/differentiability.md
+++ b/docs/src/manual/differentiability.md
@@ -24,30 +24,31 @@ variant (e.g. [`divergence!`](@ref).)
 To make your code differentiable, you must use the differentiable versions of
 the operators (without the exclamation marks).

-#### Example: Gradient of kinetic energy
+### Example: Gradient of kinetic energy

 To differentiate outputs of a simulation with respect to the initial conditions,
 make a time stepping loop composed of differentiable operations:

-```julia
-import IncompressibleNavierStokes as INS
+```@example Differentiability
+using IncompressibleNavierStokes
+
 ax = range(0, 1, 101)
-setup = INS.Setup(; x = (ax, ax), Re = 500.0)
-psolver = INS.default_psolver(setup)
-method = INS.RKMethods.RK44P2()
+setup = Setup(; x = (ax, ax), Re = 500.0)
+psolver = default_psolver(setup)
+method = RKMethods.RK44P2()
 Δt = 0.01
 nstep = 100
 (; Iu) = setup.grid
 function final_energy(u)
-    stepper = INS.create_stepper(method; setup, psolver, u, temp = nothing, t = 0.0)
+    stepper = create_stepper(method; setup, psolver, u, temp = nothing, t = 0.0)
     for it = 1:nstep
-        stepper = INS.timestep(method, stepper, Δt)
+        stepper = timestep(method, stepper, Δt)
     end
     (; u) = stepper
     sum(abs2, u[Iu[1], 1]) / 2 + sum(abs2, u[Iu[2], 2]) / 2
 end
-u = INS.random_field(setup)
+u = random_field(setup)

 using Zygote
 g, = Zygote.gradient(final_energy, u)
@@ -61,7 +62,6 @@ Now `g` is the gradient of `final_energy` with respect to the initial conditions
 Note that every operation in the `final_energy` function is non-mutating and
 thus differentiable.

----
 ## Automatic differentiation with Enzyme

 Enzyme.jl is highly efficient, and its ability to perform AD on optimized code allows Enzyme to meet or exceed the performance of state-of-the-art AD tools.
@@ -71,45 +71,47 @@ The downside is that restricts the user's defined f function to not do things li
 Enzyme's autodiff function can only handle functions with scalar output. To implement pullbacks for array-valued functions, use a mutating function that returns `nothing` and stores its result in one of the arguments, which must be passed wrapped in a `Duplicated`.
 In IncompressibleNavierStokes, we provide `enzyme_wrapper` to automatically wrap the function and its arguments in the correct way.

-#### Example: Gradient of the right-hand side
+### Example: Gradient of the right-hand side

 In this example we differentiate the right-hand side of the Navier-Stokes equations with respect to the velocity field `u`:

-```@example
-import IncompressibleNavierStokes as INS
+```@example Differentiability
 using Enzyme
 ax = range(0, 1, 101)
-setup = INS.Setup(; x = (ax, ax), Re = 500.0)
-psolver = INS.default_psolver(setup)
-u = INS.random_field(setup)
+setup = Setup(; x = (ax, ax), Re = 500.0)
+psolver = default_psolver(setup)
+u = random_field(setup)
 dudt = similar(u)
 t = 0.0
-f! = INS.right_hand_side!
+f! = right_hand_side!
 ```
 Notice that we are using the mutating (in-place) version of the right-hand side function. This function cannot be differentiated by Zygote, which requires the slower non-mutating version of the right-hand side.
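+
+To see the calling convention in isolation, here is a small illustrative sketch of reverse-mode Enzyme on a toy mutating function (`square!` is only an example, not part of IncompressibleNavierStokes): the output shadow is seeded with ones and the gradient is accumulated into the input shadow. The same pattern is applied to the Navier-Stokes right-hand side below.
+
+```julia
+# Toy mutating function with the Enzyme-friendly signature: write into `y`, return `nothing`
+square!(y, x) = (y .= x .^ 2; nothing)
+x = [1.0, 2.0, 3.0]
+y = zeros(3)
+dy = ones(3)   # seed on the output
+dx = zeros(3)  # gradient of sum(y) with respect to x is accumulated here
+Enzyme.autodiff(Enzyme.Reverse, square!, Duplicated(y, dy), Duplicated(x, dx))
+dx  # ≈ 2 .* x
+```
+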
-We then define the `Dual` part of the input and output, required to store the adjoint values:
-```@example
+We then define the `Dual` part of the input and output, required to store the adjoint values:
+
+```@example Differentiability
 ddudt = Enzyme.make_zero(dudt) .+ 1;
 du = Enzyme.make_zero(u);
 ```
 Remember that the derivative of the output (also called the *seed*) has to be set to $1$ in order to compute the gradient. In this case the output is the force, which we store by mutating the value of `dudt` inside `right_hand_side!`.
 Then we pack the parameters to be passed to `right_hand_side!`:
-```@example
+
+```@example Differentiability
 params = [setup, psolver];
 params_ref = Ref(params);
 ```
 Now, we call the `autodiff` function from Enzyme:
-```@example
+
+```@example Differentiability
 Enzyme.autodiff(Enzyme.Reverse, f!, Duplicated(dudt,ddudt), Duplicated(u,du), Const(params_ref), Const(t))
 ```
-Since we have passed a `Duplicated` object, the gradient of `u` is stored in `du`.
+Since we have passed a `Duplicated` object, the gradient of `u` is stored in `du`.
 Finally, we can also compare its value with the one obtained by Zygote, which differentiates the out-of-place (non-mutating) version of the right-hand side:
-```@example
-using Zygote
+
+```@example Differentiability
 f = create_right_hand_side(setup, psolver)
-_, zpull = Zygote.pullback(f, u, nothing, T(0));
+_, zpull = Zygote.pullback(f, u, nothing, 0.0);
 @assert zpull(dudt)[1] == du
-```
\ No newline at end of file
+```
diff --git a/docs/src/manual/sciml.md b/docs/src/manual/sciml.md
index 5d6acc82e..3fcc92b08 100644
--- a/docs/src/manual/sciml.md
+++ b/docs/src/manual/sciml.md
@@ -28,7 +28,7 @@ This projected right-hand side can be used in the SciML solvers to solve the Nav
 skip loading all the toolchains for implicit solvers, which takes a while to
 install on GitHub.

-```@example
+```@example SciML
 using OrdinaryDiffEqTsit5
 using IncompressibleNavierStokes
 ax = range(0, 1, 101)
@@ -43,10 +43,13 @@ sol = solve(problem, Tsit5(), reltol = 1e-8, abstol = 1e-8) # sol: SciMLBase.ODE
 Alternatively, it is also possible to use an
 [in-place formulation](https://docs.sciml.ai/DiffEqDocs/stable/basics/problem/#In-place-vs-Out-of-Place-Function-Definition-Forms)

-```@example
-du = similar(u)
-t = 0.0
+```@example SciML
 f!(du,u,p,t) = right_hand_side!(du, u, Ref([setup, psolver]), t)
+u = similar(u0)
+du = similar(u0)
+p = nothing
+t = 0.0
+f!(du,u,p,t)
 ```

 that is usually faster than the out-of-place formulation.
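+
+For illustration, the same in-place `f!` can then be passed to an `ODEProblem`, just like the out-of-place version above (a sketch: the time span is chosen arbitrarily here, and `u0` is the initial condition defined earlier):
+
+```julia
+tspan = (0.0, 1.0)  # illustrative time span
+problem_ip = ODEProblem(f!, u0, tspan, p)
+sol_ip = solve(problem_ip, Tsit5(), reltol = 1e-8, abstol = 1e-8)
+```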