
Add option to reset global time after initial decay #136

Open · wants to merge 2 commits into base: main

Conversation

@ManuelHu (Collaborator) commented Oct 22, 2024

See #135.

@ManuelHu (Collaborator, Author) commented Nov 13, 2024

@tdixon97 I also added a warning for large global times, emitted after ~285 years of simulation time. This corresponds to a 1 us time resolution:

```cpp
std::cout << (std::nextafter(285 * CLHEP::year, INFINITY) - 285 * CLHEP::year) / CLHEP::us << std::endl;
// -> 1.024
```

The warning is only shown once per thread, to avoid it being printed for every event. It is also not shown after the initial decay if the times of secondaries are reset to zero (the new option in this PR).

@tdixon97

I think the warning is good, but then I wonder: do we want the default limit to be set to this value?

@ManuelHu (Collaborator, Author)

That would give us non-deterministic, inconsistent, and, for users, even more confusing behaviour: the limit affects the true (sampled) lifetime, not the half-life, so some decays would go through and some would not.
It also applies to the times of the individual decays, not to the (summed) global time in the event. So, for example, 3 decays with 100-year lifetimes each would not trigger the limit, but only this warning.

Also, I would argue that, when a user asks to simulate some specific decay, remage should simply do as it is instructed.

@tdixon97 commented Nov 13, 2024

Hmm, I am still not sure. I think remage should perform some sanity checks and require the user to explicitly force it before doing something that would give nonsense output. What do you think @gipert? Maybe the check should be on the half-lives, not on the sampled lifetimes, as you say.

@gipert (Member) commented Nov 15, 2024

I think the check should be on half-lives, not the sampled decay times. Will anyone really want to cut on the sampled times?

@tdixon97

I agree, cutting on sampled times is not really nice behaviour.

@ManuelHu (Collaborator, Author)

Probably not. This is why this PR raises that built-in limit acting on sampled lifetimes. But we cannot really disable this functionality, also because the C++ API to set the limit only exists in Geant4 11.2+, not in 11.0-11.1.

But this is a more complex issue, as Toby wants to enforce a maximum time uncertainty in the output.
This cannot be achieved by cutting on individual decay half-lives. It would have to be a check on the sum of all half-lives in a decay chain, but this information is not readily available in Geant4, so we would need to track it ourselves.

Such a check would give us reproducibility, but it would not really solve the output timing issue: if a chain approaches this 285 yr region, a lot of events would still have a time uncertainty > 1 us even after that cut.
Example: Ar39 has a 268 yr half-life...


Or we just want an option to constrain the half-lives of daughter nuclei (not summed); we already have this option. I think that having such options for primaries does not make sense at all: if a user requests a specific isotope, they should already know best that they really want it...
