Commit
Merge branch 'NOAA-EMC:develop' into csps-rocky8
weihuang-jedi authored Dec 27, 2024
2 parents c157e5e + 1c37f90 commit 917b792
Showing 50 changed files with 1,370 additions and 1,026 deletions.
1 change: 1 addition & 0 deletions .github/CODEOWNERS
@@ -211,3 +211,4 @@ ush/python/pygfs/utils/marine_da_utils.py @guillaumevernieres @AndrewEichmann-NO

# Specific workflow scripts
workflow/generate_workflows.sh @DavidHuber-NOAA
workflow/build_compute.py @DavidHuber-NOAA @aerorahul
3 changes: 3 additions & 0 deletions .gitignore
@@ -85,6 +85,9 @@ parm/wafs

# Ignore sorc and logs folders from externals
#--------------------------------------------
sorc/build.xml
sorc/build.db
sorc/build_lock.db
sorc/*log
sorc/logs
sorc/calc_analysis.fd
4 changes: 1 addition & 3 deletions ci/Jenkinsfile
@@ -120,9 +120,7 @@ pipeline {
def error_logs_message = ""
dir("${HOMEgfs}/sorc") {
try {
sh(script: './build_all.sh -kgu') // build the global-workflow executables for GFS variant (UFS-wx-model, WW3 pre/post executables)
sh(script: './build_ww3prepost.sh -w > ./logs/build_ww3prepost_gefs.log 2>&1') // build the WW3 pre/post processing executables for GEFS variant
sh(script: './build_ufs.sh -w -e gefs_model.x > ./logs/build_ufs_gefs.log 2>&1') // build the UFS-wx-model executable for GEFS variant
sh(script: './build_compute.sh all') // build the global-workflow executables
} catch (Exception error_build) {
echo "Failed to build global-workflow: ${error_build.getMessage()}"
if ( fileExists("logs/error.logs") ) {
56 changes: 19 additions & 37 deletions docs/source/clone.rst
@@ -18,35 +18,39 @@ Clone the `global-workflow` and `cd` into the `sorc` directory:
git clone --recursive https://github.com/NOAA-EMC/global-workflow
cd global-workflow/sorc

For forecast-only (coupled or uncoupled) build of the components:
.. _build_examples:

The build_all.sh script can be used to build all required components of the global workflow. It accepts a list of systems to build, covering GFS and GEFS forecast-only experiments as well as GSI- and GDASApp-based DA for cycled GFS experiments. See `feature availability <hpc.html#feature-availability-by-hpc>`__ for which builds are available on each supported machine.

::

./build_all.sh
./build_all.sh [gfs] [gefs] [gsi] [gdas] [all]

For cycled (w/ data assimilation) use the `-g` option during build:
For example, to build for GFS experiments with GSI-based DA, execute:

::

./build_all.sh -g
./build_all.sh gfs gsi

For coupled cycling (include new UFSDA) use the `-gu` options during build:
This builds the GFS, UFS-utils, GFS-utils, WW3 with PDLIB (structured wave grids), UPP, GSI, GSI-monitor, and GSI-utils executables.

[Currently only available on Hera, Orion, and Hercules]
For coupled cycling (including the new UFSDA), execute:

::

./build_all.sh -gu
./build_all.sh gfs gdas

This builds all of the same executables, except it builds the GDASApp instead of the GSI.

For building without PDLIB (unstructured grid) for the wave model, use the `-w` options during build:
To build for GEFS (forecast-only) experiments, execute:

::

./build_all.sh -w
./build_all.sh gefs

This builds the GEFS, UFS-utils, GFS-utils, WW3 *without* PDLIB (unstructured wave grids), and UPP executables.

Build workflow components and link workflow artifacts such as executables, etc.
Once the build is complete, link workflow artifacts such as executables, configuration files, and scripts via

::

@@ -107,40 +111,19 @@ Under the ``/sorc`` folder is a script to build all components called ``build_al

::

./build_all.sh [-a UFS_app][-g][-h][-u][-v]
./build_all.sh [-a UFS_app][-k][-h][-j n][-v] [list of system(s) to build]
-a UFS_app:
Build a specific UFS app instead of the default
-g:
Build GSI
-k:
Kill all builds immediately if one fails
-h:
Print this help message and exit
-j:
Specify maximum number of build jobs (n)
-u:
Build UFS-DA
-v:
Execute all build scripts with -v option to turn on verbose where supported

For forecast-only (coupled or uncoupled) build of the components:

::

./build_all.sh

For cycled (w/ data assimilation) use the `-g` option during build:

::

./build_all.sh -g

For coupled cycling (include new UFSDA) use the `-gu` options during build:

[Currently only available on Hera, Orion, and Hercules]

::

./build_all.sh -gu
Lastly, pass build_all.sh a list of systems to build: `gfs`, `gefs`, `sfs` (not fully supported), `gsi`, `gdas`, and `all`.

For examples of how to use this script, see :ref:`build examples <build_examples>`.

^^^^^^^^^^^^^^^
Link components
@@ -156,4 +139,3 @@ After running the checkout and build scripts run the link script:

Where:
``-o``: Run in operations (NCO) mode. This creates copies instead of using symlinks and is generally only used by NCO during installation into production.
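
As a concrete illustration of the options documented above, a cycled GFS build with the GDASApp, run verbosely and stopping at the first failed build, could be invoked as below (a sketch only; this particular flag combination is an assumption, not taken from this commit):

::

    # Hedged example: combines the documented -k and -v flags with the
    # gfs and gdas system arguments; adjust the system list to your experiment.
    cd global-workflow/sorc
    ./build_all.sh -k -v gfs gdas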

10 changes: 6 additions & 4 deletions env/HERA.env
@@ -115,14 +115,16 @@ elif [[ "${step}" = "snowanl" ]]; then
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_SNOWANL=${NTHREADSmax}
export APRUN_SNOWANL="${APRUN_default} --cpus-per-task=${NTHREADS_SNOWANL}"
export APRUN_SNOWANL="${APRUN_default} --mem=0 --cpus-per-task=${NTHREADS_SNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"

elif [[ "${step}" = "esnowrecen" ]]; then
elif [[ "${step}" = "esnowanl" ]]; then

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWRECEN}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default} --mem=0 --cpus-per-task=${NTHREADS_ESNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"
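
The ``NTHREADS_ESNOWANL`` and ``APRUN_ESNOWANL`` settings introduced above are the thread count and launcher prefix consumed by the ensemble snow analysis job. A minimal sketch of how a downstream script might use them (the executable and YAML file names are hypothetical placeholders, not part of this commit):

::

    # Sketch only: the variable names come from the env files in this commit;
    # the executable and configuration file are assumed for illustration.
    export OMP_NUM_THREADS="${NTHREADS_ESNOWANL}"
    ${APRUN_ESNOWANL} "${DATA}/esnowanl.x" "${DATA}/esnowanl.yaml"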

8 changes: 5 additions & 3 deletions env/HERCULES.env
@@ -118,10 +118,12 @@ case ${step} in

export APRUN_APPLY_INCR="${launcher} -n 6"
;;
"esnowrecen")
"esnowanl")

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWRECEN}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"
;;
8 changes: 5 additions & 3 deletions env/JET.env
@@ -102,10 +102,12 @@ elif [[ "${step}" = "snowanl" ]]; then

export APRUN_APPLY_INCR="${launcher} -n 6"

elif [[ "${step}" = "esnowrecen" ]]; then
elif [[ "${step}" = "esnowanl" ]]; then

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWRECEN}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"

8 changes: 5 additions & 3 deletions env/ORION.env
@@ -109,10 +109,12 @@ elif [[ "${step}" = "snowanl" ]]; then

export APRUN_APPLY_INCR="${launcher} -n 6"

elif [[ "${step}" = "esnowrecen" ]]; then
elif [[ "${step}" = "esnowanl" ]]; then

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWRECEN}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"

8 changes: 5 additions & 3 deletions env/S4.env
@@ -102,10 +102,12 @@ elif [[ "${step}" = "snowanl" ]]; then

export APRUN_APPLY_INCR="${launcher} -n 6"

elif [[ "${step}" = "esnowrecen" ]]; then
elif [[ "${step}" = "esnowanl" ]]; then

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWRECEN}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default} --cpus-per-task=${NTHREADS_ESNOWANL}"

export APRUN_APPLY_INCR="${launcher} -n 6"

8 changes: 5 additions & 3 deletions env/WCOSS2.env
@@ -95,10 +95,12 @@ elif [[ "${step}" = "snowanl" ]]; then

export APRUN_APPLY_INCR="${launcher} -n 6"

elif [[ "${step}" = "esnowrecen" ]]; then
elif [[ "${step}" = "esnowanl" ]]; then

export NTHREADS_ESNOWRECEN=${NTHREADSmax}
export APRUN_ESNOWRECEN="${APRUN_default}"
export APRUN_CALCFIMS="${launcher} -n 1"

export NTHREADS_ESNOWANL=${NTHREADSmax}
export APRUN_ESNOWANL="${APRUN_default}"

export APRUN_APPLY_INCR="${launcher} -n 6"

3 changes: 2 additions & 1 deletion jobs/JGDAS_ENKF_ARCHIVE
@@ -10,7 +10,8 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "earc" -c "base earc"
YMD=${PDY} HH=${cyc} declare_from_tmpl -rx COM_TOP
MEMDIR="ensstat" YMD=${PDY} HH=${cyc} declare_from_tmpl -rx \
COMIN_ATMOS_ANALYSIS_ENSSTAT:COM_ATMOS_ANALYSIS_TMPL \
COMIN_ATMOS_HISTORY_ENSSTAT:COM_ATMOS_HISTORY_TMPL
COMIN_ATMOS_HISTORY_ENSSTAT:COM_ATMOS_HISTORY_TMPL \
COMIN_SNOW_ANALYSIS_ENSSTAT:COM_SNOW_ANALYSIS_TMPL

###############################################################
# Run archive script
18 changes: 11 additions & 7 deletions jobs/JGDAS_ENKF_SNOW_RECENTER → jobs/JGLOBAL_SNOWENS_ANALYSIS
@@ -1,7 +1,7 @@
#! /usr/bin/env bash

source "${HOMEgfs}/ush/preamble.sh"
source "${HOMEgfs}/ush/jjob_header.sh" -e "esnowrecen" -c "base esnowrecen"
source "${HOMEgfs}/ush/jjob_header.sh" -e "esnowanl" -c "base esnowanl"

##############################################
# Set variables used in the script
@@ -10,19 +10,18 @@ source "${HOMEgfs}/ush/jjob_header.sh" -e "esnowrecen" -c "base esnowrecen"
# shellcheck disable=SC2153
GDUMP="gdas"
export GDUMP
CDUMP=${RUN/enkf}
export CDUMP

##############################################
# Begin JOB SPECIFIC work
##############################################
# Generate COM variables from templates
RUN=${CDUMP} YMD=${PDY} HH=${cyc} declare_from_tmpl -rx \
COMIN_OBS:COM_OBS_TMPL
YMD=${PDY} HH=${cyc} declare_from_tmpl -rx \
COMIN_OBS:COM_OBS_TMPL \
COMOUT_ATMOS_ANALYSIS:COM_ATMOS_ANALYSIS_TMPL \
COMOUT_CONF:COM_CONF_TMPL
MEMDIR="ensstat" YMD=${PDY} HH=${cyc} declare_from_tmpl \
COMOUT_SNOW_ANALYSIS:COM_SNOW_ANALYSIS_TMPL

mkdir -p "${COMOUT_SNOW_ANALYSIS}" "${COMOUT_CONF}"

for imem in $(seq 1 "${NMEM_ENS}"); do
memchar="mem$(printf %03i "${imem}")"
@@ -31,10 +30,15 @@ for imem in $(seq 1 "${NMEM_ENS}"); do
mkdir -p "${COMOUT_SNOW_ANALYSIS}"
done

MEMDIR="ensstat" YMD=${PDY} HH=${cyc} declare_from_tmpl -x\
COMOUT_SNOW_ANALYSIS:COM_SNOW_ANALYSIS_TMPL

mkdir -p "${COMOUT_SNOW_ANALYSIS}" "${COMOUT_CONF}"

###############################################################
# Run relevant script

EXSCRIPT=${SNOWANLPY:-${SCRgfs}/exgdas_enkf_snow_recenter.py}
EXSCRIPT=${SNOWANLPY:-${SCRgfs}/exglobal_snowens_analysis.py}
${EXSCRIPT}
status=$?
(( status != 0 )) && exit "${status}"
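
For reference, the ``memchar`` assignment in the member loop above zero-pads the ensemble member index to three digits so that member directories are named consistently; for example:

::

    # Runnable illustration of the printf %03i zero-padding:
    printf "mem%03i\n" 7     # -> mem007
    printf "mem%03i\n" 42    # -> mem042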
10 changes: 6 additions & 4 deletions jobs/JGLOBAL_SNOW_ANALYSIS
@@ -1,7 +1,6 @@
#! /usr/bin/env bash

source "${HOMEgfs}/ush/preamble.sh"
export DATA=${DATA:-${DATAROOT}/${RUN}snowanl_${cyc}}
source "${HOMEgfs}/ush/jjob_header.sh" -e "snowanl" -c "base snowanl"

##############################################
@@ -18,12 +17,15 @@ GDUMP="gdas"
# Begin JOB SPECIFIC work
##############################################
# Generate COM variables from templates
YMD=${PDY} HH=${cyc} declare_from_tmpl -rx COM_OBS COM_SNOW_ANALYSIS COM_CONF
YMD=${PDY} HH=${cyc} declare_from_tmpl -rx \
COMIN_OBS:COM_OBS_TMPL \
COMOUT_SNOW_ANALYSIS:COM_SNOW_ANALYSIS_TMPL \
COMOUT_CONF:COM_CONF_TMPL

RUN=${GDUMP} YMD=${gPDY} HH=${gcyc} declare_from_tmpl -rx \
COM_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL
COMIN_ATMOS_RESTART_PREV:COM_ATMOS_RESTART_TMPL

mkdir -m 775 -p "${COM_SNOW_ANALYSIS}" "${COM_CONF}"
mkdir -m 775 -p "${COMOUT_SNOW_ANALYSIS}" "${COMOUT_CONF}"

###############################################################
# Run relevant script
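
In the ``declare_from_tmpl`` calls above, each ``NAME:TEMPLATE`` pair declares ``NAME`` by expanding the named COM template with the variables supplied on the same command line (``RUN``, ``YMD``, ``HH``, and so on); ``-r`` marks the result read-only and ``-x`` exports it. A rough sketch of the kind of path that results, assuming a typical ``${ROTDIR}/${RUN}.${YMD}/${HH}/...`` layout (the template text and values below are illustrative assumptions, not taken from this commit):

::

    # Illustrative only: if COM_SNOW_ANALYSIS_TMPL expanded to
    #   ${ROTDIR}/${RUN}.${YMD}/${HH}/analysis/snow
    # then with RUN=gdas, PDY=20241227 and cyc=00 the job would see
    #   COMOUT_SNOW_ANALYSIS=${ROTDIR}/gdas.20241227/00/analysis/snow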
26 changes: 26 additions & 0 deletions jobs/rocoto/esnowanl.sh
@@ -0,0 +1,26 @@
#! /usr/bin/env bash

source "${HOMEgfs}/ush/preamble.sh"

###############################################################
# Source UFSDA workflow modules
. "${HOMEgfs}/ush/load_ufsda_modules.sh"
status=$?
[[ ${status} -ne 0 ]] && exit "${status}"

export job="esnowanl"
export jobid="${job}.$$"

###############################################################
# setup python path for ioda utilities
# shellcheck disable=SC2311
pyiodaPATH="${HOMEgfs}/sorc/gdas.cd/build/lib/python$(detect_py_ver)/"
gdasappPATH="${HOMEgfs}/sorc/gdas.cd/sorc/iodaconv/src:${pyiodaPATH}"
PYTHONPATH="${PYTHONPATH:+${PYTHONPATH}:}${gdasappPATH}"
export PYTHONPATH

###############################################################
# Execute the JJOB
"${HOMEgfs}/jobs/JGLOBAL_SNOWENS_ANALYSIS"
status=$?
exit "${status}"
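
The ``${PYTHONPATH:+${PYTHONPATH}:}`` expansion used when assembling ``PYTHONPATH`` above prepends the existing value plus a separating colon only when ``PYTHONPATH`` is already set, so nothing is prepended when it starts out unset. A runnable illustration of the idiom (values hypothetical):

::

    # Demonstrates the ${VAR:+...} expansion in isolation.
    unset PYTHONPATH
    echo "[${PYTHONPATH:+${PYTHONPATH}:}]"   # prints []
    PYTHONPATH="/opt/modpy"
    echo "[${PYTHONPATH:+${PYTHONPATH}:}]"   # prints [/opt/modpy:]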
18 changes: 0 additions & 18 deletions jobs/rocoto/esnowrecen.sh

This file was deleted.

Remaining changed files not shown.