Storage cache refinements #808

Open · wants to merge 5 commits into base: master
Changes from 1 commit
39 changes: 20 additions & 19 deletions opm/models/discretization/common/fvbaselocalresidual.hh
@@ -535,25 +535,26 @@ protected:
if (elemCtx.enableStorageCache()) {
const auto& model = elemCtx.model();
unsigned globalDofIdx = elemCtx.globalSpaceIndex(dofIdx, /*timeIdx=*/0);
for (unsigned eqIdx = 0; eqIdx < numEq; ++ eqIdx) {
tmp2[eqIdx] = Toolbox::value(tmp[eqIdx]);
}
if(elemCtx.problem().recycleFirstIterationStorage()) {
if (model.newtonMethod().numIterations() == 0)
{
// if the storage term is cached and we're in the first iteration
// of the time step, use the storage term of the first iteration
// as the solution of the last time step. (This assumes that the
// initial guess for the solution at the end of the time step is
// the same as the solution at the beginning of the time step.
// This is usually true, but some fancy preprocessing scheme
// might invalidate that assumption.)
Valgrind::CheckDefined(tmp2);
model.updateCachedStorage(globalDofIdx, /*timeIdx=*/1, tmp2);
}
} else {
model.updateCachedStorage(globalDofIdx, /*timeIdx=*/0, tmp2);
}
for (unsigned eqIdx = 0; eqIdx < numEq; ++eqIdx) {
tmp2[eqIdx] = Toolbox::value(tmp[eqIdx]);
}
if (!elemCtx.haveStashedIntensiveQuantities()) {
Member:
Huh, interesting. I'd not come across that function before. Its documentation is a little hard to comprehend though, since it states that the function

returns true if NO intensive quantities are stashed

Is that accurate?

Member (Author):

I have also not really come across it. The mechanism is used to stash away data for a cell before perturbing the element context for finite difference derivative evaluations, and then restoring it afterwards, so its removal from the logic made such cases fail.

The doc is wrong...

if (elemCtx.problem().recycleFirstIterationStorage()) {
if (model.newtonMethod().numIterations() == 0) {
// if the storage term is cached and we're in the first iteration
// of the time step, use the storage term of the first iteration
// as the solution of the last time step. (This assumes that the
// initial guess for the solution at the end of the time step is
// the same as the solution at the beginning of the time step.
// This is usually true, but some fancy preprocessing scheme
// might invalidate that assumption.)
Valgrind::CheckDefined(tmp2);
model.updateCachedStorage(globalDofIdx, /*timeIdx=*/1, tmp2);
}
} else {
model.updateCachedStorage(globalDofIdx, /*timeIdx=*/0, tmp2);
}
}
// if the storage term at the beginning of the time step is cached
// from the last time step or we're not looking at the first
// iteration of the time step, we take the cached data.