
EXPECTED FAILURE: ERP_P72x2_Ld30.f45_f45_mg37.I2000Clm51FatesRs.cheyenne_intel.clm-mimicsFatesCold #2261

Open

slevis-lmwg opened this issue Nov 21, 2023 · 2 comments

Labels: bug (something is working incorrectly), testing (additions or changes to tests)


slevis-lmwg commented Nov 21, 2023

Brief summary of bug

ERP_P72x2_Ld30.f45_f45_mg37.I2000Clm51FatesRs.cheyenne_intel.clm-mimicsFatesCold
has been failing at SETUP (previously as a clm50 test, and as of today as a clm51 test)
with the error "Fire emission can NOT be on when FATES is also on."
This is an EXPECTED FAILURE. I looked into it a bit and wanted to post thoughts on fixing it.

General bug information

CTSM version you are using (output of git describe): ctsm5.1.dev153

Does this bug cause significantly incorrect results in the model's science?
No, but it prevents testing of mimics/fates.

Configurations affected:
MIMICS and FATES enabled at the same time.

Details of bug

Fixing this may require pointing the mimics testmod to /basic instead of /default so that -fire_emis is not set. However, that change would affect all mimics tests, so I wanted to look more carefully at whether this is OK to do.
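As a sketch of what that fix might look like (assuming the standard CIME `include_user_mods` convention for chaining testmods; the exact directory names under the CLM testmods tree are assumptions, not taken from the issue):

```
# Hypothetical: cime_config/testdefs/testmods_dirs/clm/mimicsFatesCold/include_user_mods
#   before:  ../default   (inherits the default testmod, which sets -fire_emis)
#   after:   ../basic     (inherits the basic testmod, which leaves fire emissions off)
```

Since every testmod that includes `../default` would be unaffected, the blast radius of such a change is limited to testmods that are edited to include `../basic` instead, i.e. the mimics tests mentioned above.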

slevis-lmwg added the "bug" and "testing" labels and self-assigned this issue on Nov 21, 2023.
slevis-lmwg commented:

I found additional problem-solving notes for this failing test in this issue.


slevis-lmwg commented Feb 14, 2024

UPDATE:

  1. The test is now ERP_P256x2_Ld30.f45_f45_mg37.I2000Clm51FatesRs.derecho_intel.clm-mimicsFatesCold.
  2. The test builds successfully in the ctsm5.2 branch (see "Workaround for transient Smallville tests" #1673 and "testing all new datasets" #2318).
  3. We're on derecho now, and the test fails at runtime with a "CH4 Conservation Error in CH4Mod during diffusion."

I would first remove the P256x2 modifier and try again, in case the error message is a "red herring."
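Concretely, dropping the processor-layout modifier just means removing the `_P256x2` token from the test name; everything after it stays the same (a sketch; the test string comes straight from the update above):

```shell
#!/bin/sh
# Failing test name from the update above
test_name="ERP_P256x2_Ld30.f45_f45_mg37.I2000Clm51FatesRs.derecho_intel.clm-mimicsFatesCold"

# Strip the _P256x2 processor-layout modifier so the test uses the
# default PE layout; all other parts of the test name are unchanged.
simplified=$(printf '%s' "$test_name" | sed 's/_P256x2//')

echo "$simplified"
# -> ERP_Ld30.f45_f45_mg37.I2000Clm51FatesRs.derecho_intel.clm-mimicsFatesCold
```

The simplified name would then be resubmitted with CIME's create_test to see whether the CH4 conservation error persists without the custom layout.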
