Gridap and solvers from DifferentialEquations.jl #962
@JanSuntajs Have you tried replicating/running this test?
@JordiManyer thanks for the suggestion. I have not yet tried replicating or running the test; however, it relies on `using Gridap.ODEs.DiffEqWrappers`, and this relates to my original post, namely the status of that module. Edit: perhaps I misunderstood your suggestion initially. I will look into the contents of the test.
@JanSuntajs I am using the master branch of Gridap, and the DiffEqWrappers still exists. I don't know if it works, since the test file I pointed at does not run within our CI tests.
@JordiManyer I am also using the master branch, and while the `DiffEqWrappers` source still exists, it is not included in the `ODEs` module:
```julia
"""
The exported names are
$(EXPORTS)
"""
module ODEs
using DocStringExtensions
include("ODETools/ODETools.jl")
include("TransientFETools/TransientFETools.jl")
# include("DiffEqsWrappers/DiffEqsWrappers.jl")
end #module
const GridapODEs = ODEs
```
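A quick way to verify this from a running session is to check whether the submodule name is actually defined (a minimal sketch using only Base's `isdefined` and `names`; the expected results in the comments are my assumption, given that the `include` above is commented out):
```julia
using Gridap

# Neither spelling of the submodule should be defined, since the include above
# is commented out (the source file still ships with the package, though).
@show isdefined(Gridap.ODEs, :DiffEqWrappers)   # expected: false
@show isdefined(Gridap.ODEs, :DiffEqsWrappers)  # expected: false

# List any DiffEq-related names known to the ODEs module (expected: none).
@show filter(n -> occursin("DiffEq", String(n)), names(Gridap.ODEs; all = true))
```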
@JordiManyer, I have now tried running said test. Since I could not include the wrapper module from Gridap itself, I copied its contents into my own project:
```julia
# """
# The exported names are
# $(EXPORTS)
# """
module DiffEqWrappers
using Test
using Gridap.ODEs.TransientFETools: TransientFEOperator
using Gridap.ODEs.ODETools: allocate_cache
using Gridap.ODEs.ODETools: update_cache!
using Gridap.ODEs.ODETools: residual!
using Gridap.ODEs.ODETools: jacobians!
using Gridap.ODEs.ODETools: jacobian!
using Gridap.Algebra: allocate_jacobian
using Gridap.FESpaces: get_algebraic_operator
using LinearAlgebra: fillstored!
export prototype_jacobian
export prototype_mass
export prototype_stiffness
export diffeq_wrappers
"""
This method takes a `FEOperator` and returns some methods that can be used
in `DifferentialEquations.jl`. Assuming we want to solve the nonlinear ODE
`res(t,u,du) = 0`, we return:
1. `residual!(res, du, u, p, t)`: It returns the residual (in `res`) at
`(u,du,t)`, following the signature in `DifferentialEquations`.
For the moment, we do not support parameters.
2. `jacobian!(jac, du, u, p, gamma, t)`: Idem for the Jacobian. It returns
`∂res/∂du*γ + ∂res/∂u`
3. `mass!(mass, du, u, p, t)`: Idem for the mass matrix. It returns
`∂res/∂du`
4. `stiffness!(stif, du, u, p, t)`: Idem for the stiffness matrix. It returns
`∂res/∂u`
"""
function diffeq_wrappers(op)
ode_op = get_algebraic_operator(op)
ode_cache = allocate_cache(ode_op)
function _residual!(res, du, u, p, t)
# TODO (minor): improve update_cache! so that it does nothing if t matches the cached time;
# currently the update is done twice (once for the residual and once for the jacobian)
ode_cache = update_cache!(ode_cache, ode_op, t)
residual!(res, ode_op, t, (u, du), ode_cache)
end
function _jacobian!(jac, du, u, p, gamma, t)
ode_cache = update_cache!(ode_cache, ode_op, t)
z = zero(eltype(jac))
fillstored!(jac, z)
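# accumulate 1.0*∂res/∂u + gamma*∂res/∂du into jac (weights passed as (1.0, gamma))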
jacobians!(jac, ode_op, t, (u, du), (1.0, gamma), ode_cache)
end
function _mass!(mass, du, u, p, t)
ode_cache = update_cache!(ode_cache, ode_op, t)
z = zero(eltype(mass))
fillstored!(mass, z)
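# jacobian w.r.t. the 2nd argument (du) with unit weight, i.e. the mass matrix ∂res/∂du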
jacobian!(mass, ode_op, t, (u, du), 2, 1.0, ode_cache)
end
function _stiffness!(stif, du, u, p, t)
ode_cache = update_cache!(ode_cache, ode_op, t)
z = zero(eltype(stif))
fillstored!(stif, z)
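# jacobian w.r.t. the 1st argument (u) with unit weight, i.e. the stiffness matrix ∂res/∂u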
jacobian!(stif, ode_op, t, (u, du), 1, 1.0, ode_cache)
end
return _residual!, _jacobian!, _mass!, _stiffness!
end
"""
It allocates the Jacobian (or mass or stiffness) matrix, given the `FEOperator`
and a vector of size total number of unknowns
"""
function prototype_jacobian(op::TransientFEOperator, u0)
ode_op = get_algebraic_operator(op)
ode_cache = allocate_cache(ode_op) # Not acceptable in terms of performance
return allocate_jacobian(ode_op, u0, ode_cache)
end
const prototype_mass = prototype_jacobian
const prototype_stiffness = prototype_jacobian
end #module
```
Then, I try running the test, which returns an error at the last line in the code below:
```julia
using Gridap
using Gridap.ODEs
using Gridap.ODEs.ODETools
using Gridap.ODEs.TransientFETools
using MyProject.DiffEqWrappers
# using DifferentialEquations
# using Sundials
using Gridap.Algebra: NewtonRaphsonSolver
using Base.Iterators
# FE problem (heat eq) using Gridap
function fe_problem(u, n)
f(t) = x -> ∂t(u)(x, t) - Δ(u(t))(x)
domain = (0, 1, 0, 1)
partition = (n, n)
model = CartesianDiscreteModel(domain, partition)
order = 1
reffe = ReferenceFE(lagrangian,Float64,order)
V0 = FESpace(
model,
reffe,
conformity = :H1,
dirichlet_tags = "boundary",
)
U = TransientTrialFESpace(V0, u)
Ω = Triangulation(model)
degree = 2 * order
dΩ = Measure(Ω, degree)
a(u, v) = ∫( ∇(v) ⋅ ∇(u) )dΩ
b(v, t) = ∫( v * f(t) )dΩ
m(u, v) = ∫( v * u )dΩ
res(t, u, v) = a(u, v) + m(∂t(u), v) - b(v, t)
jac(t, u, du, v) = a(du, v)
jac_t(t, u, dut, v) = m(dut, v)
op = TransientFEOperator(res, jac, jac_t, U, V0)
U0 = U(0.0)
uh0 = interpolate_everywhere(u(0.0), U0)
u0 = get_free_dof_values(uh0)
return op, u0
end
# Solving the heat equation using Gridap.ODEs and DiffEqs
tspan = (0.0, 1.0)
u(x, t) = t
u(t) = x -> u(x, t)
# ISSUE 1: When I choose n > 2, even though the problem that we will solve is
# linear, the Sundials solver seems to have convergence issues in its nonlinear
# solver (?). It returns errors
# [IDAS ERROR] IDACalcIC Newton/Linesearch algorithm failed to converge.
# ISSUE 2: When I pass `jac_prototype` the code gets stuck
n = 3 # cells per dim (2D)
op, u0 = fe_problem(u,n)
res!, jac!, mass!, stif! = diffeq_wrappers(op)
J = prototype_jacobian(op,u0)
```
The error occurs at the last line, i.e. at the call to `prototype_jacobian`. Has the interface for these functions changed in the meantime? These are the packages I am currently using in my project:

Regards,
@JordiManyer I got back to this problem after some time and here's what I found: I can get the test example working, but only after commenting out the `jac_prototype` keyword argument when constructing the `DAEFunction`.
Then, in the test, the `DAEFunction` is constructed as follows:
```julia
# To explore the Sundials solver options, e.g., BE with a fixed time step dt
f_iip = DAEFunction{true}(res!)#, jac_prototype=J)
```
I suppose I need an appropriate interface to pass the assembled Jacobian to the solver.

Regards,
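Edit: for completeness, here is roughly how the pieces fit together once `jac_prototype` is dropped. This is only a sketch: it assumes `op`, `u0`, `res!`, `jac!` and `J` from the snippets above, and the tolerances are just the ones I happened to try.
```julia
using Sundials: DAEFunction, DAEProblem, IDA, solve

tspan = (0.0, 1.0)
diff_vars = fill(true, length(u0))   # every unknown is differential for the heat equation

# Passing jac_prototype = J is what currently hangs, so it stays commented out.
f_iip = DAEFunction{true}(res!)  # , jac_prototype = J)
prob_iip = DAEProblem{true}(f_iip, u0, u0, tspan, (); differential_vars = diff_vars)
sol_iip = solve(prob_iip, IDA(); reltol = 1e-8, abstol = 1e-6)
```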
To offer an update on my progress with this issue, I am appending an MWE of a working script that simulates diffusion on a sphere, which is first filled uniformly by an incoming boundary flux, after which the filling stops altogether:
```julia
"""
A script providing a minimal working example showcasing simple
diffusion simulated using both `Gridap` and `Sundials` solvers.
"""
# using MKL
import Sundials as snd
# Gridap-based tools
using Gridap
import GridapGmsh as ggmsh
import Gridap.Geometry as ggeo
using Gridap.ODEs
using Gridap.ODEs.ODETools
using Gridap.ODEs.TransientFETools
include("./DiffEqWrappers.jl")
using .DiffEqWrappers: diffeq_wrappers, prototype_jacobian
const meshname = "./sphere_shape_1.0_lc_0.1_.msh"
const dtensor = TensorValue([1. 0. 0.; 0. 1. 0.; 0. 0. 1.])
# timespan
t₀ = 0.
tf = 10
tfill = 5
# boundary flux
function bflux(x, t::Real)
if (t <= tfill)
return 1. / (3 * tfill)
else
return 0.
end
end # bflux
bflux(t::Real) = x -> bflux(x, t)
function weak_form(Ω, dΩ, dΓ, n_Γ, D, flux)
# chemical residual part
res(t, w, z) = ∫((z ⋅ ∂t(w)) + ((D ⋅ ∇(w)) ⋅ ∇(z))) * dΩ - ∫(flux(t) ⋅ z) * dΓ
# jacobian
jac(t, w, dw, z) = ∫((D ⋅ ∇(dw)) ⋅ ∇(z)) * dΩ
jac_t(t, w, dwₜ, z) = ∫(dwₜ ⋅ z) * dΩ
return res, jac, jac_t
end
# initial conc
cd0(x) = 0.
function main()
# ------------------------------------
#
# IMPORT MODEL
#
# ------------------------------------
model = DiscreteModelFromFile(meshname)
# -------------------------------------
#
# DEFINE SPACES
#
# -------------------------------------
refFE = ReferenceFE(lagrangian,Float64,1)
V = TestFESpace(model, refFE, conformity=:H1)  # no Dirichlet tags
U = TransientTrialFESpace(V)
order = 1
degree = 2 * order
Ω = Triangulation(model)
dΩ = Measure(Ω, degree)
neumann_tags = ["Casing"] #["Casing", "Left edge", "Right edge"]
Γ = BoundaryTriangulation(model, tags=neumann_tags)
dΓ = Measure(Γ, degree)
n_Γ = get_normal_vector(Γ)
# -------------------------------------
#
# DEFINE OPERATORS
#
# -------------------------------------
res, jac, jac_t = weak_form(Ω, dΩ, dΓ, n_Γ, dtensor, bflux)
op = TransientFEOperator(res,jac,jac_t,U,V)
c0 = interpolate_everywhere(cd0, U(0.))
csnd = get_free_dof_values(c0)
res!, jac!, _, _ = diffeq_wrappers(op)
J_ = prototype_jacobian(op, csnd)
r = copy(csnd)
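# One-off evaluation of the wrappers: fill the residual vector r and the Jacobian
# prototype J_ once before handing everything over to Sundials.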
θ = 0.1; dt = 0.1; tθ = 0.0; dtθ = dt*θ
res!(r, csnd, csnd, [], tθ)
jac!(J_, csnd, csnd, [], (1 / dtθ), tθ)
tspan = (t₀, tf)
# ----------------------------------------------------
#
#
# SUNDIALS phase
#
#
# To explore the Sundials solver options, e.g., BE with a fixed time step dt
# println(cusnd)
println("Entering solver phase!")
diff_vars = fill(true, length(csnd))
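# The DAEFunction carries both the Gridap-assembled sparsity pattern (jac_prototype = J_,
# needed by the :KLU sparse linear solver) and the in-place Jacobian jac!, which
# evaluates ∂res/∂u + γ*∂res/∂u̇.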
f_iip = snd.DAEFunction{true}(res!;jac_prototype=J_, jac=jac!)
#
prob_iip = snd.DAEProblem{true}(f_iip, csnd, csnd, tspan, (), differential_vars=diff_vars);
sol_iip = snd.solve(prob_iip, snd.IDA(linear_solver=:KLU, init_all=false); dtmax=1.,
abstol=1e-06, reltol=1e-8, saveat=collect(t₀:0.1:tf), tstops=[tfill])
# -------------------------------------
#
# SAVING RESULTS
#
# -------------------------------------
println("Saving results")
savename_result = "mwe_result"
savefolder = "./results"
if !ispath(savefolder)
mkpath(savefolder)
end
createpvd("$savefolder/$(savename_result)") do pvd
for i in eachindex(sol_iip.t)
# println(i)
t = sol_iip.t[i]
cₕ = FEFunction(U(t), sol_iip.u[i], get_dirichlet_dof_values(U(t)))
pvd[t] = createvtk(Ω, "$(savefolder)/$(savename_result)_t_$t.vtu",
cellfields=["ch" => cₕ, "gradc" => ∇(cₕ), "flux" => -dtensor ⋅ ∇(cₕ)
])
end
end
end # main
if abspath(PROGRAM_FILE) == abspath(@__FILE__)
main()
end
```
I am also interested in whether I am doing the interpolation back to the computational grid in the saving phase in an optimal manner. Specifically, I am referring to this line in the saving loop: `cₕ = FEFunction(U(t), sol_iip.u[i], get_dirichlet_dof_values(U(t)))`.
The additional files (the mesh file and the local `DiffEqWrappers.jl` module quoted above) are needed to run the script. Edit: after fixing some issues with the interpolation of results in the multiphysics case, it seems the same approach can be extended to a transient multiphysics case as well. If anyone is interested in that, I can perhaps prepare an MWE for that case as well, but it is indeed pretty straightforward once the operators are defined.
@JordiManyer @santiagobadia @fverdugo In light of the recent refactoring of the ODE module, I would like to know how to port my code above (which works with Gridap v0.17.23) to Gridap v0.18.0 and later. Specifically, I am interested in what the analogs of the following functions are now:
```julia
using Gridap.ODEs.TransientFETools: TransientFEOperator
using Gridap.ODEs.ODETools: allocate_cache
using Gridap.ODEs.ODETools: update_cache!
using Gridap.ODEs.ODETools: residual!
using Gridap.ODEs.ODETools: jacobians!
using Gridap.ODEs.ODETools: jacobian!
```
Regards, Jan
In the new version of the ODE module, the residuals and Jacobians are no longer evaluated through the old `ODETools`/`TransientFETools` interfaces; the equivalent functionality lives directly in `Gridap.ODEs`, with updated function names and signatures. The best way to get a feel for how these functions work together is to have a look at how the ODE solvers within `Gridap.ODEs` use them.
Hello,

I am using `Gridap` to solve a coupled system of differential and algebraic equations (DAE). Namely, I am dealing with the diffusion equation in which the concentration field is coupled to some external fields that change on time scales much faster than that of the diffusive problem. Currently, I explicitly discretize the time-propagation scheme and use an iterative scheme to propagate the solution a timestep $\Delta t$ forward in time. It should probably also be evident from the above that the problem is a multiphysics one.

For speed and convergence reasons, I would like to use a specialized DAE solver instead, such as the ones available in the DifferentialEquations.jl suite. Is there a way to achieve this in `Gridap`?

I stumbled upon a similar issue in GridapODEs where some attempt at a solution has been made within the module `DiffEqsWrappers`. Could I achieve the desired outcome if I went along the same lines? I've noticed that since the integration of GridapODEs into `Gridap`, the module `DiffEqsWrappers` is no longer used.

Regards,
Jan