From 4071656a55613ac1b9e082a8ccb7bb71eda8d0fb Mon Sep 17 00:00:00 2001
From: Annika Lauber
Date: Tue, 15 Oct 2024 12:26:33 +0200
Subject: [PATCH] Update local testing

---
 docs/models/icon/large_use_cases.md | 17 ++++++++++++-----
 1 file changed, 12 insertions(+), 5 deletions(-)

diff --git a/docs/models/icon/large_use_cases.md b/docs/models/icon/large_use_cases.md
index 8ae86ab7..09009a73 100644
--- a/docs/models/icon/large_use_cases.md
+++ b/docs/models/icon/large_use_cases.md
@@ -46,15 +46,22 @@ and integrate in the ICON testing infrastructure with a low number of grid point
 and a few time steps (about 6). Existing use cases like the [Aquaplanet :material-open-in-new:](https://gitlab.dkrz.de/icon/icon-nwp/-/blob/master/run/exp.exclaim_ape_R02B04){:target="_blank"} one can serve as a template. Your test case should be saved as `run/exp.`.
 
 ### 1.2 Local testing
-1. Add a checksuite file of your experiment under `run/checksuite.icon-dev/exp.` (use one of the other check files as a template)
-2. Add experiment/test settings under `scripts/experiments//*.yml`:
-    - Have a look at `scripts/experiments/c2sm/mch_experiments.yml`:
-    - `type` should contain all file IDs, which are set in each `output_nml` under `output_filename` as unique identifier in your test case
+Follow the step-by-step guide in [How to add experiments to a buildbot list](https://gitlab.dkrz.de/icon/wiki/-/wikis/How-to-setup-new-test-experiments-for-buildbot#how-to-add-experiments-to-a-buildbot-list) to add your experiment as a test case. Start with the `checksuite_modes` entry for the MPI and nproma test (`'nm'`) on the machine you are testing on.
 
 #### Test on CPU
-To ensure that there are no basic issues with the namelist, we recommend to start testing on CPU before going over to GPU testing.
+To ensure that there are no basic issues with the namelist, we recommend starting with CPU testing before moving on to GPU testing. First, compile icon-nwp on CPU following the instructions in [Configure and compile](https://c2sm.github.io/models/icon/#configure-and-compile) (*TODO*: fix link before merging). Then create the check file and run the test locally (with `EXP` set to the name of your experiment):
+
+```bash
+./make_runscripts ${EXP}
+./run/make_target_runscript in_script=checksuite.icon-dev/check.${EXP} in_script=exec.iconrun out_script=check.${EXP}.run EXPNAME=${EXP}
+cd run
+sbatch --partition debug --time 00:30:00 check.${EXP}.run
+```
+
+Check in the LOG file whether all tests passed (a small helper for this is sketched after the patch).
 
 #### Test on GPU
+Once all tests validate on CPU, the next step is to test on GPU. First, compile icon-nwp on GPU. Then create the check file and run the MPI and nproma test locally as above (a GPU variant of these commands is sketched after the patch). If those tests also validate on GPU, you can continue with the tolerance test, which ensures that running on GPU gives essentially the same results as running on CPU. To do so, change the `checksuite_modes` entry to `'t'` for the tolerance test and follow the instructions in [Validating with probtest without buildbot references (Generating tolerances for non standard tests)](https://gitlab.dkrz.de/icon/wiki/-/wikis/GPU-development/Validating-with-probtest-without-buildbot-references-(Generating-tolerances-for-non-standard-tests)).
 
 ### 1.3 Activate Test in a CI Pipeline
 
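As a companion to the CPU commands in the patch, here is a minimal sketch for scanning the log of `check.${EXP}.run` for failures. The log file name pattern is an assumption (it depends on the machine and on the `#SBATCH --output` setting of the generated script), so adjust it to the file your job actually wrote:

```bash
# Run from the "run" directory after the check job has finished.
# The log name pattern is an assumption -- adjust it to the file your
# sbatch job actually wrote (e.g. slurm-<jobid>.out).
log=$(ls -t LOG.check.${EXP}.run.* 2>/dev/null | head -n 1)
if [ -n "${log}" ]; then
    # Print any reported failures; otherwise confirm that none were found.
    grep -iE "fail|error" "${log}" || echo "no failures reported in ${log}"
else
    echo "no log file found for ${EXP}"
fi
```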
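For the GPU step, the local workflow mirrors the CPU one. The sketch below reuses the commands from the patch and only assumes a separate GPU build directory and a GPU Slurm partition; both names are placeholders, not part of the documented setup:

```bash
# Sketch of the local GPU run. The build directory and partition name are
# assumptions -- use your actual GPU build of icon-nwp and the GPU partition
# of your machine; ${EXP} is your experiment name, as above.
cd /path/to/icon-nwp-gpu-build
GPU_PARTITION="<your-gpu-partition>"

./make_runscripts ${EXP}
./run/make_target_runscript in_script=checksuite.icon-dev/check.${EXP} in_script=exec.iconrun out_script=check.${EXP}.run EXPNAME=${EXP}
cd run
sbatch --partition ${GPU_PARTITION} --time 00:30:00 check.${EXP}.run
```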