From 350193c81fb075427906c3305439dddca017b087 Mon Sep 17 00:00:00 2001
From: "Judy.K.Henderson"
Date: Fri, 20 Mar 2020 22:28:55 +0000
Subject: [PATCH] merged 19Mar feature/gfsv16b branch into gmtb_v16beta

Squashed commit of the following:

commit 586d0994aab2348c1deea521b37183ddf0a8f2d7
Author: kate.friedman   Date: Thu Mar 19 14:01:16 2020 +0000
    Issue #1 - update EMC_verif-global to tag verif_global_v1.6.0 in Externals.cfg
commit f412f2ca8071baa6bd91f641c1fd8102e9271bef
Author: russ.treadon   Date: Thu Mar 19 13:53:34 2020 +0000
    Issue #1: update EMC_verif-global to tag verif_global_v1.6.0
commit 94abe6a1caca4724fbc580d58928a6cf74a65590
Author: kate.friedman   Date: Wed Mar 18 19:02:09 2020 +0000
    Fix syntax error in setup_workflow_fcstonly.py
commit 06ff92be074bac2728055a62ce0fa263a4d13cf6
Author: Guang Ping Lou   Date: Tue Mar 17 18:10:37 2020 +0000
    modify job card for tests
commit a2b76f1d402c8568612dc727ebf36b548748589d
Author: Guang Ping Lou   Date: Tue Mar 17 18:10:04 2020 +0000
    modify namsnd TIMSTN
commit 81bc3fc4090c4fd3bdec070abd9c117d5a09491d
Author: Guang Ping Lou   Date: Tue Mar 17 18:09:24 2020 +0000
    modify collectives mpmd design
commit 9cbf9be8ea2a959a6711c10838ba46347451b7d1
Author: Guang Ping Lou   Date: Tue Mar 17 18:08:41 2020 +0000
    modify config.resources for postsnd number of tasks
commit e3945e03031db23dbb8681e3d1034cc07377ae2d
Author: russ.treadon   Date: Mon Mar 16 10:38:07 2020 +0000
    Issue #1: add logic to check WAVE_CDUMP prior to setting npe_fcst_gfs
commit 6fdc6365fa13c77f01ae0cdb82e74bbbd149f914
Author: russ.treadon   Date: Mon Mar 16 10:35:14 2020 +0000
    Issue #1: DA workflow updates for GFS v16 (does not change results)
commit 91610f80cef8911ac6aaae2ded67a95b4ff1d061
Author: russ.treadon   Date: Wed Mar 11 17:25:36 2020 +0000
    Issue #1: update Externals.cfg to GFS v16 components
commit 7dcf23281cddb9aabc20c73ec077a9a52ded56b8
Merge: 55130516 057b2a82
Author: russ.treadon   Date: Wed Mar 11 17:21:33 2020 +0000
    Issue #1: Merge branch 'develop' at 057b2a8 into feature/gfsv16b
commit 057b2a82fa43f7bc36c3e757ca7d48fb86b9541c
Merge: 0377d20f 622167d5
Author: Kate Friedman   Date: Wed Mar 11 12:00:58 2020 -0400
    Merge pull request #29 from NOAA-EMC/feature/manage_externals
    Issue #3 - Introduce manage_externals as replacement for checkout.sh
commit 55130516cea5ea67b758b52c9445aca667e1c35a
Author: kate.friedman   Date: Mon Mar 9 13:59:44 2020 +0000
    Add ability to have wave jobs depending on cdump
commit 622167d5fb3322921a1702639ebccb42da1f5e1b
Author: kate.friedman   Date: Fri Mar 6 18:20:31 2020 +0000
    Issue #3 - added explicit config flag example for checkout_externals in README and blurb about this replacing checkout.sh
commit e83b90d50999f64ed5208d94a8cceb8179c9395f
Author: kate.friedman   Date: Fri Mar 6 17:00:15 2020 +0000
    Issue #3 - remove prod_util and grib_util sections from build_all.sh, removed elsewhere already
commit 8699b46aa1797e8dd29edff1d4bd3b511ad5cb1c
Author: kate.friedman   Date: Fri Mar 6 16:30:54 2020 +0000
    Issue #3 - updated README with new manic version
commit e602cd3d536b55b86b04285aedd5781d8d3a9f82
Author: kate.friedman   Date: Fri Mar 6 16:27:57 2020 +0000
    Issue #3 - updated link_fv3gfs.sh to adjust wafs links
commit 830c73f430d70cc516dea419a8969c6fd9fc0910
Author: kate.friedman   Date: Fri Mar 6 15:21:45 2020 +0000
    Issue #3 - update EMC_verif-global tag in Externals.cfg after sync with develop
commit 40084e67810d21366d0e9af6d9937c29aa4965ad
Merge: f662fffa 0377d20f
Author: kate.friedman   Date: Fri Mar 6 15:18:52 2020 +0000
    Issue #3 - Merge branch 'develop' into feature/manage_externals
commit cfa5be4b414f67efa8eb17c8fe5060fe94cd48ea
Author: kate.friedman   Date: Fri Mar 6 14:05:29 2020 +0000
    Issue #1 - remove gfswave restart archival
commit 85b3a1d24e89c60e9977ecc549656a0145caeb8b
Author: russ.treadon   Date: Thu Mar 5 18:51:22 2020 +0000
    Issue #1: update postsnd section of config.resources
commit 6c8dc92953e5c003882ffbb8376a8b1b2c8668d2
Author: russ.treadon   Date: Thu Mar 5 18:40:38 2020 +0000
    Issue #1: update config files to be consistent with settings in pre-implementation GFS v16 real-time and retrospective parallels
    * parm/config/config.anal - reduce second outer loop iteration count for gfs
    * parm/config/config.epos - add ENKF_SPREAD to toggle on/YES or off/NO generation of ensemble spread files
    * parm/config/config.fcst - set adjust_dry_mass for GDAS (true) and GFS (false)
    * parm/config/config.fv3 - remove nth_efcs, set C384 nth_fv3=1
    * parm/config/config.resources - set nth_efcs to nth_fv3 (default 2)
commit a107109d8fdd0292a3017f26e56863fbb28a63ce
Author: kate.friedman   Date: Thu Mar 5 16:08:23 2020 +0000
    Fixed bug in config.fcst related to extra then
commit 0de559a088ac670f2b105168988c18ee1b57488a
Author: russ.treadon   Date: Thu Mar 5 15:55:50 2020 +0000
    Issue #1: update exglobal_fcst_nemsfv3gfs.sh and checkout.sh
    * scripts/exglobal_fcst_nemsfv3gfs.sh - turn on adjust_dry_mass with dry_mass=98320.0, turn off iau_drymassfixer
    * sorc/checkout.sh - update to model tag GFS.v16.0.1
commit ce23e52730a9f6fe20215cb47563dd6f2dcb254a
Merge: fa7bca5a 0377d20f
Author: russ.treadon   Date: Thu Mar 5 15:36:50 2020 +0000
    Issue #1: Merge branch 'develop' at 0377d20 into feature/gfsv16b
commit 0377d20f3d019f77a47fc9860d6146fd3c8e5d94
Merge: 1b359dbe 25524675
Author: Kate Friedman   Date: Thu Mar 5 08:43:16 2020 -0500
    Merge pull request #28 from NOAA-EMC/feature/metplus2
    Issue #8 - add switch for MET+ jobs
commit 25524675a63e59829655bbd9a09abc4dca246357
Author: kate.friedman   Date: Thu Mar 5 13:31:02 2020 +0000
    Issue #8 - add switch for MET+ jobs
commit fa7bca5a5fb26886219190e9e5e5d6efb9f9ddab
Merge: af73a39f cd67f975
Author: Kate Friedman   Date: Thu Mar 5 08:08:57 2020 -0500
    Merge pull request #25 from NOAA-EMC/feature/wave2global
    Feature/wave2global into feature/gfsv16b
commit cd67f97592227fea0238327dade887d4584300c3
Merge: c72d7b5b af73a39f
Author: kate.friedman   Date: Thu Mar 5 13:03:10 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit c72d7b5bca4a67e83916edd17fb0e5c4612a7867
Author: Jose-Henrique Alves <47567389+ajhenrique@users.noreply.github.com>   Date: Wed Mar 4 23:53:41 2020 -0500
    Clean up exwave_post_sbs
commit 2e738f20930dfd7d5216920ec7cc26774be811f3
Author: Henrique Alves   Date: Thu Mar 5 04:03:44 2020 +0000
    Moving standalone fv3 model_config exglobal_fcst block into if/else/fi cplwav model_config block. Reinstating config.wave block in JGLOBAL_FORECAST. Pointing EXECwave to HOMEgfs/exec directory for WW3 util executables (changed link_fv3gfs.sh accordingly). Removing debug options from compile.sh line in build_fv3.sh.
commit b7638436b1e737077fbb2dad705e7ed157df261e
Author: kate.friedman   Date: Wed Mar 4 21:02:04 2020 +0000
    Fix to JWAVE_PREP to look back a day for rtofs
commit 33cf0fc0f2a5d3b3ca749c9941835398f94e7606
Author: kate.friedman   Date: Wed Mar 4 19:59:56 2020 +0000
    Adjustments after going through PR review
commit 1b359dbeb31b94382619dfc9c67e77fffe46aaa0
Merge: 0359d342 31bb7d32
Author: Kate Friedman   Date: Wed Mar 4 10:19:36 2020 -0500
    Merge pull request #26 from NOAA-EMC/feature/metplus
    Feature/metplus - refactored MET+ jobs to resolve timing issues
commit 5c8fa3dd2666b7fd90b473fb12629016ef402e3e
Author: kate.friedman   Date: Wed Mar 4 13:03:12 2020 +0000
    Adjust restart_interval if-blocks for DOIAU=YES in configs
commit 2b337a8f8c900146905915032826faf9e6f07dc0
Merge: a4daafee e0a1a0ae
Author: Henrique Alves   Date: Tue Mar 3 18:31:09 2020 +0000
    Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global
commit a4daafeecaf3fed38e637389ab5f4041b5139a4c
Author: Henrique Alves   Date: Tue Mar 3 18:30:58 2020 +0000
    Adjustment to waveprep resources to match number of rtofs files, bugfix on generation of ascii spectral data
commit e0a1a0aeb4066ab834fd93a86780196f55a7a447
Author: kate.friedman   Date: Mon Mar 2 18:45:50 2020 +0000
    Fix gdasfcst dep on gldas
commit 817f8faa6b0c71ef970da302d6e46b57c9ea8129
Merge: 48ed4c5f 203d1747
Author: kate.friedman   Date: Mon Mar 2 13:41:13 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit 48ed4c5f4a6b0093941a97d475abdb6bed2937e5
Author: Henrique Alves   Date: Mon Mar 2 04:22:02 2020 +0000
    General cleanup. wavepostsbs wall time limit matches gfsfcst walltime.
commit 4af83a430a8bb58f83d686535d39d9745de9ab6c
Author: Henrique Alves   Date: Sun Mar 1 02:51:02 2020 +0000
    Cleaning up prior to merging to gfsv16b
commit 14f52dd35b7d6a1a9b98cff0da2426015cfabd3d
Merge: 4400d7c3 d39ba74d
Author: Henrique Alves   Date: Sat Feb 29 18:23:17 2020 +0000
    Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global
commit 4400d7c3e5e26bacc0ba892ef3178d4d906d57a8
Author: Henrique Alves   Date: Sat Feb 29 18:23:13 2020 +0000
    Changes to deal with waveprep and wavepost
commit d39ba74d71b5bd5c510eb32cac17facdbe5a9a49
Author: kate.friedman   Date: Fri Feb 28 19:01:02 2020 +0000
    Removed extra forward slashes in wave part of hpssarch_gen.sh
commit 82889365da44196804b7d751a1c8c5992ca102ea
Author: kate.friedman   Date: Fri Feb 28 18:44:13 2020 +0000
    Added waves to archival
commit a8f4200d04d7c49a1ff1cd443f1f36fd949cb5db
Merge: 9321bdc9 8997f2a2
Author: kate.friedman   Date: Fri Feb 28 13:58:28 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit 9321bdc9a5a85a4b458b3539bb4f15bdcbf2cf33
Author: kate.friedman   Date: Fri Feb 28 13:39:34 2020 +0000
    Cleanup and added adjustments to other env files
commit 67e20de5ac6e8f52312bc71adc5419330ebe6542
Author: Henrique Alves   Date: Thu Feb 27 04:22:36 2020 +0000
    Adding current smoothing flag to config.wave. Previous push also contained changes to reflect improved wave model physics.
commit 10164bdae0fdc53a604dc8e58b2a6a5fce5ebc99
Author: Henrique Alves   Date: Thu Feb 27 03:47:36 2020 +0000
    Adjustments to use gfs/gdas seaice file instead of omb file. Changes to intake rtofs 1hr files when available.
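Two of the config changes noted in commit 6c8dc929 above (the ENKF_SPREAD toggle in config.epos and the per-CDUMP adjust_dry_mass setting in config.fcst) are plain shell toggles. A minimal sketch of what such settings could look like, assuming the workflow's convention of sourcing config files and using a CDUMP variable to distinguish GDAS from GFS cycles; the exact variable handling here is illustrative, not the workflow's actual code:

```shell
# Illustrative config-style fragment (assumed shapes, per the commit log above).
CDUMP=${CDUMP:-gdas}     # cycle type this config is sourced for (assumed default)

# config.epos: toggle generation of ensemble spread files on (YES) or off (NO)
export ENKF_SPREAD="YES"

# config.fcst: dry-mass adjustment on for GDAS cycles, off for GFS cycles
if [ "$CDUMP" = "gdas" ]; then
  export adjust_dry_mass=".true."
else
  export adjust_dry_mass=".false."
fi

echo "ENKF_SPREAD=$ENKF_SPREAD adjust_dry_mass=$adjust_dry_mass"
```

Sourcing the fragment with CDUMP unset falls back to the gdas branch and enables the dry-mass adjustment.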
commit 7145fd01ffcf3043dd1dea458ae866f208f32633
Author: kate.friedman   Date: Wed Feb 26 19:34:24 2020 +0000
    Modify prep.sh to create rtofs symlink in ROTDIR if DO_WAVE=YES
commit 5ac8c8683b6f3de8cc586bae8d3825b2bd67011c
Author: kate.friedman   Date: Wed Feb 26 18:44:45 2020 +0000
    Fix for missing nth_postsnd in config.resources
commit 011a393c162a422859b1184e9f32dd50d1d48882
Merge: d01c3543 8806c9e4
Author: Kate.Friedman   Date: Wed Feb 26 13:50:49 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit d01c3543e59bbabb491454788f17e818305de6fc
Author: kate.friedman   Date: Wed Feb 26 13:48:26 2020 +0000
    Updates to checkout.sh from feature/gfsv16b
commit 3b5d451aa165dc368c674cd2a71d6786097869e7
Merge: 88d8abda 318d8b46
Author: kate.friedman   Date: Wed Feb 26 13:43:52 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit 88d8abda4adf3c945b1c5526dfca6b1774b91f3b
Merge: 33c65a19 9d6e8464
Author: kate.friedman   Date: Tue Feb 25 17:49:29 2020 +0000
    Sync merge with feature/gfsv16b
commit 33c65a19cbe68342cd9ab3f434a22a399be3a1b4
Merge: 9a1f79d7 0aa8dacf
Author: Henrique Alves   Date: Mon Feb 24 15:37:48 2020 +0000
    Bugfix for creating spectra files in wavepostsbs step
commit 0aa8dacfeef3fd1bf3abe368c9cdaf9a16b57cde
Merge: 8148766c 431c7866
Author: kate.friedman   Date: Fri Feb 21 13:57:55 2020 +0000
    Merge branch 'feature/gfsv16b' into feature/wave2global
commit 8148766c489057553e20fc42bad494f1ed5aa70c
Author: kate.friedman   Date: Fri Feb 21 13:56:56 2020 +0000
    Small fix to exglobal_fcst_nemsfv3gfs.sh
commit ebf02b19f98f90ae3c8bc57cc5c2325260aaab0a
Author: kate.friedman   Date: Thu Feb 20 14:05:52 2020 +0000
    Remove output_1st_tstep_rst override and add iau_drymassfixer
commit 36cb42dc5d8985f8402405dce811f04063f572ab
Author: Henrique Alves   Date: Thu Feb 20 12:30:28 2020 +0000
    Removing references to gens/gefs from exglobal_fcst_nemsfv3gfs.sh in preparation for merge back to gfsv16b
commit 9a1f79d7aa0343ea95d1af419f55a2d0d356e99d
Author: Henrique Alves   Date: Thu Feb 20 12:19:11 2020 +0000
    Adding back lost and found EOF in section that creates input.nml
commit 635d95d58c932909dc6a29c62515c385dc01587a
Merge: f31de5aa c4454b3a
Author: kate.friedman   Date: Wed Feb 19 20:42:49 2020 +0000
    Additional sync merge with feature/gfsv16b today
commit f31de5aaa6c915ac8bc1184726bead71dc9157d4
Author: kate.friedman   Date: Wed Feb 19 20:18:23 2020 +0000
    Removing copies of exglobal_fcst that were added erroneously
commit 31bb7d32181ca84229c3c3374226bbd37784ddc4
Merge: eb73e520 0359d342
Author: Mallory Row   Date: Wed Feb 19 15:24:42 2020 +0000
    Merge branch 'develop' into feature/metplus
commit e058a9054d7edb2b92ee590b6dede733fe4bc4a3
Author: kate.friedman   Date: Wed Feb 19 15:05:59 2020 +0000
    Changes to config.resources after wave tests
commit 24844e403745fb5b292fcde0aae53a3c497aa295
Merge: 0f2f2b0b c2e99795
Author: kate.friedman   Date: Wed Feb 19 14:03:58 2020 +0000
    Sync merge with feature/gfsv16b
commit 0f2f2b0be1ff97485a23cdaf214d428927290c3d
Author: kate.friedman   Date: Wed Feb 19 13:36:14 2020 +0000
    Updates to config.wave and scripts/exwave_post_sbs.sh
commit 958ee38b6fb9c76056a3045f43dec92a503f48c9
Author: kate.friedman   Date: Fri Feb 14 20:38:53 2020 +0000
    Increasing resources for eupd when C384
commit f662fffa25a99617828e4322bf789978cf523248
Author: kate.friedman   Date: Fri Feb 14 15:57:05 2020 +0000
    Issue #3 - Updated README with new manic tag v1.1.7
commit 94cd971fb14c4b7204822ca780d6e9dbe945b595
Author: kate.friedman   Date: Fri Feb 14 15:31:32 2020 +0000
    Updates to wave scripts
commit e3196a84a0ecd2b54d59abfdc9184622a9c605ca
Author: Kate Friedman   Date: Thu Feb 13 15:59:06 2020 -0500
    Update README.md
commit e46b175d8a309010e421ccd54e6d6eb083af3579
Merge: 4bd0e203 0359d342
Author: Kate.Friedman   Date: Thu Feb 13 20:38:04 2020 +0000
    Issue #3 - sync merge with develop branch
commit aec4c288e1f96ee7f3f2a834014edde0ef4ea9a2
Author: Kate.Friedman   Date: Thu Feb 13 19:09:56 2020 +0000
    Added EUPD_CYC variable to config.base.emc.dyn
commit 2a2a8d3fe1fcd0bebf07d34c32697f64f5ac745c
Author: Kate.Friedman   Date: Wed Feb 12 18:00:14 2020 +0000
    Adjusting config.resources for C384
commit 8bdf2387b1fad22cfa2da37a7428406a847d2362
Author: Kate.Friedman   Date: Wed Feb 12 17:33:53 2020 +0000
    Reducing npe_eobs from 400 to 100
commit c48ac6df8c5c166a1ef51b173372f3f26f9d877a
Author: Kate.Friedman   Date: Wed Feb 12 14:46:26 2020 +0000
    Removing gefs from checkout.sh and link_fv3gfs.sh
commit 78a7294eb1937758fa81f72cc5045d3d680a07c6
Author: kate.friedman   Date: Mon Feb 10 19:29:15 2020 +0000
    Fixed missing fi in checkout.sh
commit 15628bda0621187949204d0d9af4d901e205495d
Author: kate.friedman   Date: Mon Feb 10 19:21:39 2020 +0000
    Change fv3gfs checkout to develop branch of ufs-weather-model
commit 631820beca07522faedccc47d7415c4b6b250273
Author: kate.friedman   Date: Mon Feb 10 19:08:30 2020 +0000
    Changing checkout.sh to Henrique's fork branch gfsv16_wave
commit a518485866536102312c477926ae42a59d5e561e
Author: kate.friedman   Date: Mon Feb 10 17:02:22 2020 +0000
    Changing fv3gfs.fd checkout back to gfsv16_updates
commit c75d35b6802592b17640fecace38d9103a14b1bc
Merge: 2a46ee2e ba63c4f8
Author: kate.friedman   Date: Mon Feb 10 16:44:53 2020 +0000
    Sync merge with feature/gfsv16b
commit 2a46ee2ee71da5c75d11cbc96b048a1e34148239
Author: kate.friedman   Date: Mon Feb 10 15:42:35 2020 +0000
    Updates for wave and fcst scripts, as well as config.wave
commit eb73e520716215c3f11cc4cdfce3831408221766
Author: Mallory Row   Date: Fri Feb 7 14:04:37 2020 +0000
    Update EMC_verif-global checkout to verif_global_v1.5.0
commit 85deaf78b86184f9b52afe5d68b370ba640c152b
Author: kate.friedman   Date: Thu Feb 6 19:16:18 2020 +0000
    Updated resource configs based on C384 and C768 tests
commit 4bd0e20300cc2a79e79433b2ec8cdb15c8f01c9e
Author: Kate Friedman   Date: Thu Feb 6 11:55:31 2020 -0500
    Update README.md
commit d9ea1acab54f65a987b32d56587dfd1b6bcd037c
Author: Kate.Friedman   Date: Thu Feb 6 16:03:11 2020 +0000
    Issue #3 - reduce hashes down to minimum 8 characters
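Several commits above tune config.resources-style task counts (for example, reducing npe_eobs from 400 to 100). A hedged sketch of how such a task count translates into a node request; the npe_node_max value and the ceiling-division arithmetic are assumptions for illustration, not the workflow's actual code:

```shell
# Illustrative resource arithmetic (names follow the npe_*/nth_* convention
# used in the commit log; values and the node calculation are assumptions).
export npe_eobs=100      # MPI tasks for the eobs job (reduced from 400 above)
export nth_fv3=1         # threads per FV3 MPI task at C384, per the log
export npe_node_max=28   # cores per node on a hypothetical target machine

# ceiling division: smallest node count that accommodates npe_eobs tasks
nodes_eobs=$(( (npe_eobs + npe_node_max - 1) / npe_node_max ))
echo "eobs: ${npe_eobs} tasks on ${nodes_eobs} nodes"
```

With these illustrative values, 100 tasks at 28 tasks per node round up to a 4-node request.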
commit 17bcdf2a1e0e8f1febfc6d57994a12ac625d9d75
Author: kate.friedman   Date: Tue Feb 4 19:11:42 2020 +0000
    Wave changes for running with IAU on
commit 50b70215096dbc87e0484ed0eea9572238c37432
Author: kate.friedman   Date: Thu Jan 30 14:55:48 2020 +0000
    Resource adjustments and fix for restart copying in exglobal
commit 85b254d8e65318adaeef87ae7187218de59f94c5
Author: Henrique Alves   Date: Tue Jan 28 21:18:35 2020 +0000
    Adjusting resources for wave component
commit 3f858be34da7134a071762376f4c4e69ea73f4a0
Author: kate.friedman   Date: Tue Jan 28 20:43:02 2020 +0000
    Further adjustment to config.efcs for restart_interval for cold start SDATE
commit a674d30c8fe1da53c2d8bda5b695a474e8906f6b
Merge: 9944f2fe 3fda3e35
Author: kate.friedman   Date: Tue Jan 28 20:26:10 2020 +0000
    Merge branch 'feature/wave2global' of https://github.com/NOAA-EMC/global-workflow into feature/wave2global
commit 9944f2fe15576f0b401996fefe3c910888708e0d
Author: kate.friedman   Date: Tue Jan 28 20:25:59 2020 +0000
    Resource adjustments for C384, restart_interval adjustment for cold start, and fcst dependency update
commit b64fd5ff43f88bd5f2d27b0a16fb803eac8aff8c
Merge: cf008631 3ed9267b
Author: kate.friedman   Date: Tue Jan 28 15:24:37 2020 +0000
    Merge branch 'develop' into feature/manage_externals
commit cf0086311daaf62ee33df010946d0a0ddc5bc400
Author: kate.friedman   Date: Tue Jan 28 15:23:14 2020 +0000
    Issue #3 - remove copy of manage_externals under util and add README.md file
commit 3fda3e359f66611c77a97ca9c30a6204a36077c3
Author: Henrique Alves   Date: Tue Jan 28 14:18:47 2020 +0000
    Adjusting output stride on station outputs, adding bulletin output to wavepostsbs using unified wave_outp_spec.sh (removing wave_outp_bull.sh), adjusting wave resources in config.fv3
commit e4bd7d09ef3cc06d14dbfcb07bd3f211d4a5e54c
Merge: 99b19f18 b3d88593
Author: kate.friedman   Date: Tue Jan 28 13:58:16 2020 +0000
    Sync merge with feature/gfsv16b
commit 99b19f18e0a01fcf49b442ec66f1aea43b16433c
Author: kate.friedman   Date: Tue Jan 28 13:54:46 2020 +0000
    Disabling additional wave jobs for later implementation
commit bfc7bb0b237d4cf4240551aa8b9c5d025724ac16
Author: Kate.Friedman   Date: Mon Jan 27 19:06:17 2020 +0000
    Issue #3 - initial add of manage_externals and needed Externals.cfg. Also added .gitignore file and removed scripts/files associated with no-longer-used prod_util and grid_util builds.
commit e216205aec6f481b4e96c8707d485dd2eacfb0e7
Author: Henrique Alves   Date: Mon Jan 27 00:01:14 2020 +0000
    Updating resource config in config.fv3 for running cplwav in c384 with ppn=7,thrd=4; checkout.sh now points to the latest ufs-weather-model develop branch; updated fv3gfs_build.cfg to match ufs-weather-model
commit 786806f3cd5d858615f3f74dec78891a2189ee79
Author: Mallory Row   Date: Fri Jan 24 15:04:16 2020 +0000
    Missed file format updates in a few places in config.metp
commit c11dfef0f7fb8da2866e6a022ac0ff60044766a7
Author: Mallory Row   Date: Fri Jan 24 14:07:29 2020 +0000
    Update EMC_verif-global tag to verif_global_v1.4.1
commit df2e6b0bd9a2af2107a78c12f9c648a61fe0d5b9
Merge: bc07de02 09e68b48
Author: kate.friedman   Date: Thu Jan 23 20:00:34 2020 +0000
    Sync merge with feature/gfsv16b
commit bc07de02af5743194cb9203d0d36e62e9a25b223
Author: kate.friedman   Date: Thu Jan 23 19:43:12 2020 +0000
    Fixed wavepostsbs dependencies
commit 4b54c37b425c947e79c4ddc192a6e2f06b8528fa
Author: kate.friedman   Date: Thu Jan 23 15:04:20 2020 +0000
    Turn off cplwav in efcs config and adjusted dependencies for fcst job
commit 0ea809c208ce606c957eed4d346a3828d8186010
Author: Mallory Row   Date: Thu Jan 23 13:29:56 2020 +0000
    Update file format variable in config.metp of online archive files
commit b886c848a60871a86bb0b45d024a34368ad1d898
Author: kate.friedman   Date: Wed Jan 22 21:01:34 2020 +0000
    Change to base config and update to wave dependencies
commit 154b118a7808bfa60abe0620e4bee21cb4faabee
Author: Henrique Alves   Date: Wed Jan 22 03:37:24 2020 +0000
    Changing WAV_MOD_ID to MDC (model component) tag; updating some paths for wave components in several scripts. Correcting wave_tar bug.
commit 0edcb211586f1f96d01f172215568fa7eee3a7a2
Author: Henrique Alves   Date: Tue Jan 21 20:15:14 2020 +0000
    Several changes to change the directory name from to etc. Updates to wavepostsbs.
commit c0d7179f34837e40db9ccb1b941ad9f312283a6e
Author: Mallory Row   Date: Tue Jan 21 16:37:16 2020 +0000
    Add updated env machine files for gfsmetp
commit 82e690717d72c7b021c637270108f4bacfb6816d
Author: Mallory Row   Date: Tue Jan 21 16:29:23 2020 +0000
    Update config.resources for gfsmetp
commit 72e8adf1c8a8859786cbbcf1b976640d73c5c867
Author: Mallory Row   Date: Tue Jan 21 16:19:31 2020 +0000
    Update EMC_verif-global tag checkout to 1.4.0
commit 6872f79f3f9052377ff863da1bbac482548ee0ce
Author: Mallory Row   Date: Tue Jan 21 16:14:25 2020 +0000
    Add rocoto METplus job script
commit 9c94156670bd810561bd2a699648afb946511ea9
Author: Mallory Row   Date: Tue Jan 21 16:09:13 2020 +0000
    Changes to setup_workflow.py for gfsmetp metatask
commit 0ad851882686087fdd21a4a8f88b65dd3960cd1c
Author: kate.friedman   Date: Fri Jan 17 17:29:09 2020 +0000
    ACCOUNT fix in config.base.emc.dyn and dependency fix to setup_workflow.py
commit 80f13fb4b785fbde8e925cfb6deb547297f34e70
Author: Henrique Alves   Date: Fri Jan 17 03:29:35 2020 +0000
    Removing underscore from COM wave directory names
commit 39a9df9385631df4a3cba4e6663fc6ab5c3f238d
Author: Henrique Alves   Date: Fri Jan 17 03:08:41 2020 +0000
    Changing back waveprep to include ice and currents by default
commit c36df4d748bbc2a8dc7b4a044a1f62f637e3ba33
Author: Henrique Alves   Date: Thu Jan 16 19:43:26 2020 +0000
    Updating post sbs script to copy station files to correct directory.
commit f1cd7ab43d24fc576c82ea227cd7d79cc08f4835
Author: Henrique Alves   Date: Wed Jan 15 20:48:11 2020 +0000
    Adding block sourcing config files into wave j-jobs
commit b736f8315497bfc29fd77faa9d98bf558ddea919
Author: kate.friedman   Date: Wed Jan 15 20:40:16 2020 +0000
    Removed config sourcing from rocoto job scripts
commit 4f6840b25d85152ea7393a8959619bb4caab9d67
Author: Henrique Alves   Date: Wed Jan 15 20:38:41 2020 +0000
    Removing dependency on log file for wave post sbs
commit 429c409799f1999babbfb3653e6f03be66f8fec3
Merge: 3f685fd3 a6905174
Author: Henrique Alves   Date: Wed Jan 15 20:35:24 2020 +0000
    Merge branch 'feature/wave2global' of github.com:NOAA-EMC/global-workflow into feature/wave2global
commit 3f685fd3423add0ba2cfb7c3115a674891c44e55
Author: Henrique Alves   Date: Wed Jan 15 20:29:58 2020 +0000
    Reinstating Ratko's NO NO in build_fv3.sh
commit a69051742c5edd0e4d34d178329fc24cacede78a
Author: kate.friedman   Date: Wed Jan 15 20:12:20 2020 +0000
    Adding supplemental config source to JGLOBAL_FORECAST for wave
commit 1f5c593705df384608b1f1dd84df9114d5d8a812
Author: Henrique Alves   Date: Wed Jan 15 19:59:27 2020 +0000
    Removing KEEPDATA from JWAVE_POST_SBS
commit baa06afcecef64faad51bfac3b4fd2c064df3fb5
Author: Henrique Alves   Date: Wed Jan 15 19:32:06 2020 +0000
    Changes to update sbs post and reconciling parameters for output post data.
commit 1915aa921dfe2ef17799599e5a3084547caa3ca2
Author: kate.friedman   Date: Fri Jan 10 18:36:32 2020 +0000
    Issue #8 - pulled in config.metp and modifications to two setup scripts
commit 784fb65dee27135340771b1c778d1d69219ba05c
Merge: 9562fc71 7440aad1
Author: Henrique Alves   Date: Fri Dec 27 05:01:12 2019 +0000
    Merging latest feature/gfsv16b branch into wave2global
commit 9562fc71eb80e33f4d132dd9ad9ca5903be1435f
Author: Henrique Alves   Date: Fri Dec 27 04:47:20 2019 +0000
    Adjusting resources and env for running C384 coupled
commit 5ce0bdf026c4c9a4a62252a264816710558dfd20
Author: Henrique Alves   Date: Thu Dec 26 19:00:42 2019 +0000
    Changes to support wave side by side post
commit 07d9ec5aa1e54e9b78ee1e18b0fa0d984f129bea
Author: Henrique Alves   Date: Sun Dec 22 17:55:13 2019 +0000
    Temporary changes to work with updated WW3 code under GEFS_v12 and to bugfix wave_post_sbs.
commit a69df4b414f5247a9db24ddebb42bc409f26d090
Author: Henrique Alves   Date: Fri Dec 20 12:58:02 2019 +0000
    Additional changes to have wave init, prep, postsbs handed over for experiments by gfsv16 group
commit 46aa4bae69f1a9a73965b863557cd359161ff7d6
Author: Henrique Alves   Date: Tue Dec 17 21:02:05 2019 +0000
    Adding resource adjustments to allow fcst step to run the coupled FV3-WW3 system. Adding the WAV_POST_SBS job, modifying waveprep to use new unified point output grid tag uoutpGRD
commit b8cff19a2cea88b1f73e0828f9c47b6b85f20ceb
Author: Henrique Alves   Date: Mon Dec 16 06:13:26 2019 +0000
    Fixing issues with waveinit and waveprep and attempting to run gdasfcst.
commit bcfcd23cc11644dc138b1d4aa19ad22a3c3de53c
Author: kate.friedman   Date: Fri Dec 13 20:31:18 2019 +0000
    Added additional wave jobs to setup scripts
commit 290bc09e17bb41ce94f4a98c7e942a0e67757093
Author: kate.friedman   Date: Fri Dec 13 19:04:38 2019 +0000
    Adding files for additional wave jobs
commit 23ea303716477284c74b187d4142f40c4f4a2e1b
Author: Henrique Alves   Date: Thu Dec 12 05:19:27 2019 +0000
    Updating build_fv3.sh to correctly add ww3 compile options
commit 212aeb449300a0795d8d0055ab6f12f39bfc4d98
Author: Henrique Alves   Date: Thu Dec 12 02:58:50 2019 +0000
    Adding WW3 as a component in gfsv16 to enable coupled runs.
commit 6e450a8947967795a9fab9da2b5ebc4d415c0387
Author: Henrique Alves   Date: Thu Dec 12 02:51:08 2019 +0000
    Changes to make wave init functional end-to-end under GFSv16 workflow. Removing parm/wave directory (to be moved to GEFS repo; in GFSv16 parm were converted to config)
commit c4b7e07b355b9e92a8070659210cfd95961a1b05
Author: kate.friedman   Date: Wed Dec 11 15:01:28 2019 +0000
    Added wave tasks to free-forecast mode and adjusted cycled mode dependencies
commit e36fed5ffaed435a2d774fa859b0aebd62de149b
Author: Henrique Alves   Date: Wed Dec 11 13:27:52 2019 +0000
    Modifying spec of unified output point parameter from buoy to uoutpGRD to avoid errors, as parameter buoy is also used to label output points themselves
commit 126bd34442e34862da1ac1eaf00a2d9903cdd595
Author: Henrique Alves   Date: Tue Dec 10 18:44:27 2019 +0000
    Partial changes to add side-by-side post
commit 66add7f33225e5bef12c941fd27b9c0539b155df
Author: kate.friedman   Date: Tue Dec 10 17:18:00 2019 +0000
    Added wave rocoto job scripts, configs, and setup criteria
commit b6a2742ebdce9e522a9b6f5af77d3f8a379c8413
Author: Henrique Alves   Date: Mon Dec 9 16:36:20 2019 +0000
    Adding changes to ensure RTOFS current files are read and prepared for coupled GFS runs. Several changes to add side-by-side post.
commit 9d447f9363e999d5a7f5ff337c68a225a137bad9
Author: Henrique Alves   Date: Fri Dec 6 20:48:56 2019 +0000
    Reconciling exglobal_fcst, checkout and link scripts to allow running both GFS and GEFS configs. Adding ability to link out_grd/pnt wave files to COMOUTWW3
commit db1d3be780e8bcee8ac1352d5d8fe4b604d330f9
Merge: 25e423bd 37a3bd27
Author: Henrique Alves   Date: Fri Dec 6 15:51:10 2019 +0000
    Merging feature/gfsv16b 37a3bd27 into feature/wave2global
commit 25e423bd5c804c73b2f9ba12ed3c29d4d2e13abd
Author: Henrique Alves   Date: Thu Dec 5 19:50:43 2019 +0000
    Cleaning up (changing wavemodTAG to WAV_MOD_TAG etc)
commit eb2197553c83c8c2c7b8233781fa8caac2bf7fe7
Author: Henrique Alves   Date: Thu Dec 5 05:58:01 2019 +0000
    Adjustments to adding stats step
commit ed084df89006436392777bdce7502b426c505a8c
Author: Henrique Alves   Date: Mon Dec 2 16:05:53 2019 +0000
    Changes made to file names to adjust to wave scripts ported to global-workflow
commit 49d7b118c6ee980ad2c3e1e5d8e960926fe5cc1f
Author: Henrique Alves   Date: Wed Nov 27 17:09:34 2019 +0000
    Saving changes to add gwes POST to global-workflow
commit f70a299d03e1c5c902a5b6c65514a2509e72ea82
Author: Henrique Alves   Date: Thu Nov 21 02:29:45 2019 +0000
    Modified wave PREP step config files for accommodating move of wave j-jobs, ex/ush scripts, fix_wave and parm files into the global-workflow structure. Changed COMICE to COMINice in wave scripts. Renamed ush scripts for direct relationship to WW3 package program names (eg ww3_prnc)
commit 248fd28a4603d23b1ff1ee353e0c521ea36041d5
Author: Henrique Alves   Date: Tue Nov 19 22:46:04 2019 +0000
    Adding changes that make INIT step functional for GEFS after merging wave j-jobs, ex/ush-scripts, parm, fix into global-workflow
commit 79c318ddd22a52f9340808c9d9768b866dc39880
Author: Henrique Alves   Date: Fri Nov 8 17:53:47 2019 +0000
    Finalizing changes to retain history relative to GEFS/scripts source for exwave scripts
commit 4388122b7ca0e32caec0e44435c2ec4434decf4d
Author: Henrique Alves   Date: Fri Nov 8 17:50:52 2019 +0000
    Staging intermediate name change to retain history in exwave scripts relative to source GEFS/scripts
commit f4c5e2c214145d165c8de73a0326c7d30285c3ac
Author: Henrique Alves   Date: Fri Nov 8 17:48:08 2019 +0000
    Staging removal of obsolete files
commit 6b8321b25ac9291552addec1bb5a4b2eec5eff9c
Author: Henrique Alves   Date: Fri Nov 8 15:10:24 2019 +0000
    Removing extensions gefs/gens (all will be set at the appropriate j-job, links will be made to fix_wave), making error reporting codes consistent across ex-scripts.
commit 1800288101815c3adf17ee071d3d04ab525e154b
Author: Henrique Alves   Date: Fri Nov 8 13:48:18 2019 +0000
    Renaming after generalization of calls to ex-scripts from GEFS workflow
commit 7bae5b660bbb12efbd7200180d5435106ecc0e85
Author: Henrique Alves   Date: Fri Nov 8 13:30:06 2019 +0000
    Changing extension to gens, matching NCO parameter, which is set to gens for all gefs-related runs.
commit 4bd4562e77da8552ceeac317c945570afc389ddc
Author: Henrique Alves   Date: Thu Nov 7 21:48:25 2019 +0000
    Adding wave components to global-workflow (fix/wave, jobs, parm/wave, scripts/exwave*, ush/wave_*). Massive changes still required for this to work. Paths need to be re-routed, script names need to be adjusted, hard-wired ush-script parameters need to be moved into parm/wave.
commit fb95cd3c304b623bf95e2618b71b2e649751f685 Author: Walter Kolczynski Date: Tue Nov 5 05:34:56 2019 +0000 Change way wave restart files are produced In order to facilitate faster cycle turnover, instead of copying the wave restart files at the end of the forecast, symbolic links to the final destination are created before the forecast. This allows the restart directory to be populated immediately, so that the next cycle can commence while the current cycle finishes. Change-Id: I887dddec88e32ad08fefc5a8080c9f553965b9dc commit 750ba243dc1040eb008bda9eddeecc313de5ad28 Author: Walter Kolczynski Date: Sun Nov 3 08:07:06 2019 +0000 Update NEMS app for new WW3 version An undocumented change to how the peak frequencies were computed in WW3 resulted in unrealistically large peak frequencies. Coupled with a bug in Jasper, this caused crashes during grib2 encoding. Change-Id: Iface172ec8f6da816a4b56af73e8fb091dd57584 commit c2696f6951c8e0dca1b853009a88cef6846a02bf Author: Walter Kolczynski Date: Fri Nov 1 10:02:45 2019 +0000 Update NEMS app to incorporate latest WW3 changes The NEMS app is updated to incorporate changes to WW3 for the grib2 encoders and make profiling optional to improve performance. Change-Id: I6b7c96128f116698c27f037129db4ba8f591551b commit 632bfe92a6912e981b98f84d9782adcd5650d9f9 Author: Walter.Kolczynski Date: Fri Oct 25 16:56:16 2019 +0000 Fix previous merge enabling Hera Change-Id: I277baff3dd5c8f7485d4be1c8052837c3fbadc46 commit 78924915ed4add9511c6b283119015f048201baf Author: Walter.Kolczynski Date: Fri Oct 25 13:47:42 2019 +0000 Add capability to run on Hera Change-Id: I61da1adcfa1b0ff09222c15b2c4e58c6af8a853b commit e748ef122e196028816da5aa3a05f0179cabddc9 Author: Walter Kolczynski Date: Sun Oct 20 08:06:20 2019 +0000 Update NEMS app to version for GEFS retrospective The NEMS app is updated to check out the version for phase 2 of the GEFS v12 retrospective. 
Change-Id: Iddfa75a78965cf19eca2bca7f9fbbf45dcd457a6 commit b7dc7e6124dbde4b0c41cfaca6a457e5bc995066 Author: Walter Kolczynski Date: Sat Oct 12 18:29:16 2019 +0000 Update app for WW3 OMP fix Change-Id: I41523cb5313b18eba4301d599f99ec67ea6c9bc4 Refs: #58418 commit 59883cb4ff503fd7a92e371389a5ed4f467da5e2 Author: Walter Kolczynski Date: Thu Oct 10 01:51:06 2019 +0000 Fix bug with creating WW3 restart directory Change-Id: I7002db6175d4e994318e6274071dca1475ae3121 Refs: #58418 commit 519a99c671db60eb8d994190fa75102c8a4b01e4 Author: Walter Kolczynski Date: Wed Oct 9 19:10:26 2019 +0000 Update NEMS app version for Hera port Change-Id: I19cdf40e4c0a147d138cc59fc720b7608b51554a Refs: #58418 commit a9309088cee6591eda24318af55f8566c79285a5 Author: Walter Kolczynski Date: Mon Oct 7 20:18:14 2019 +0000 Update post to more recent version Change-Id: I8aab5d3d532fbfc444abd4c1c71e10f55906500c Refs: #58418 commit 735f421618849dbda49a1c30ccaabd490e333544 Merge: 865ea5aa c68728b5 Author: Walter Kolczynski Date: Thu Oct 3 17:05:25 2019 +0000 Merge branch 'develop' into lgan_fv3_ww3 commit 865ea5aa37fc866b83dc09fb10f6b1021dc09eee Author: Walter Kolczynski Date: Thu Oct 3 16:54:24 2019 +0000 Update NEMS app checkout to ver with GSDCHEM 0.8.8 Change-Id: Idd286ae4300f0b067d3f38a2aa76b27f2d3911df Refs: #58418, #62104 commit ad1b52817e367f2dd09f961618af72ba3cd84939 Author: Walter Kolczynski Date: Tue Oct 1 20:19:19 2019 +0000 Fix issue with exglobal not creating wave restart directory The wave restart directory was not being created by exglobal, causing the copying of wave restart files to fail. Modified the directory creation to use the new restart subdirectory. 
Change-Id: I77fe3a72249efd3e688ece47f0778b30e034ff24 Refs: #58418 commit 8a312b398ea79bacef025d19252ef0fd98cffa82 Author: Henrique Alves Date: Wed Sep 25 02:26:51 2019 +0000 global-workflow wave component: adding rundata and restart directories to store model run files (binary ww3, logs, forcing inputs, etc.) and restart files, following the new directory tree structure. commit 67adca4ac31d1219c173d3af0c7a905f4e7d3382 Author: Henrique Alves Date: Wed Sep 18 02:47:03 2019 +0000 Winding back changes to wave-related parms PDY_PCYC etc., and adding the correct parm cyc defining the directory, which was erroneously pointing to the non-existent parm cycm1. commit 65fe319c30dfceff7ef265c3f5aa6eb2a1ce049e Author: Henrique Alves Date: Tue Sep 17 21:44:59 2019 +0000 Adding parms CDATE_PCYC, PDY_PCYC and cyc_pcyc to exglobal_fcst_nemsfv3gfs.sh for setting the location of the wave component restart file as a function of the time between cycles commit af747496f8b4b2de287d061a5094135083410803 Author: Walter Kolczynski Date: Thu Sep 5 18:59:49 2019 +0000 Update NEMS app version to a master commit Previously the checkout for the FV3-GSDCHEM-WW3 app temporarily pointed to an interim commit that included GSDCHEM 0.87 and used ESMF 8.0.0bs40+ to facilitate development while waiting for the app to merge a full update to master. That update has now been made to the app master, so the checkout commit is now updated to point to that new version. The new FV3-GSDCHEM-WW3 app commit updates to GSDCHEM 0.87 and ESMF 8.0.0bs47, and also updates the other components to recent development versions. Change-Id: I0318535aa7564b48567a6442cdda074007095c03 Refs: #58418 commit 57f74c842ba30e40f6e2c3ba1bd4802388aa068f Author: Walter Kolczynski Date: Fri Aug 23 16:34:57 2019 +0000 Update UFS_utils version to work with GEFS The GEFS init system requires changes to global_chgres_driver.sh that are not present in v1.0.0. The UFS_utils version is updated to the current tip of the develop branch, which includes the necessary changes. 
Change-Id: I097b16342ca081e867786410a6ac9916fd5576fb Refs: #58418 commit 6fddfaf8212d29ee9a1eddbdbc7f710d2a19e426 Merge: 9fd2726d 9dc4c7a1 Author: Walter.Kolczynski Date: Mon Aug 12 18:55:36 2019 +0000 Merge branch 'develop' into lgan_fv3_ww3 Change-Id: I068787c69f6610fa6896af16775378286d3e948b commit 9fd2726d64d62b7fc6196fb74060dc2ec15b3dab Author: Walter.Kolczynski Date: Mon Aug 5 18:24:21 2019 +0000 Update FV3-GSDCHEM-WW3 version Updated the checkout script to fetch a newer commit of the FV3-GSDCHEM-WW3 app. This newer version uses ESMF v8.0.0bs42, updates GSDCHEM to 0.87, WW3 to an OMP-enabled version, and FV3 to the latest develop commit. Change-Id: I7c16511e2d5d183abd96371afd0d0b237e5349aa Refs: #58418, #62104 commit f123abb87a153876a1cdcff5285d7c4dd5d66567 Merge: f4b47b5d dcfa5eee Author: Walter.Kolczynski Date: Wed Jul 31 18:31:17 2019 +0000 Merge remote-tracking branch 'origin/develop' into lgan_fv3_ww3 commit f4b47b5db243f4f0d172b3430e8a6f494946ca2a Author: Henrique Alves Date: Thu Jul 18 21:49:50 2019 +0000 Cleaning up esmf profiling parms commit 056c6cd12288a6a8f3b1fd0fa6165f0b14667a71 Merge: 2bb5bacc 01b12587 Author: Henrique Alves Date: Thu Jul 18 21:45:33 2019 +0000 Merge branch 'lgan_fv3_ww3' of gerrit:global-workflow into lgan_fv3_ww3 commit 2bb5bacc59c830276c91c03ab490cfbc4781844e Author: Henrique Alves Date: Thu Jul 18 21:45:23 2019 +0000 Branch lgan_fv3_ww3: changing exglobal_fcst to allow writing wave component restart files following number of daily cycles (gfs_cyc), and creating proper comout for wave restart files commit 01b12587ae6567677d95e15e990397f142c53633 Author: Walter.Kolczynski Date: Tue Jul 16 17:49:44 2019 +0000 Fix clean arguments for building fv3 The arguments used when building the FV3-GSDCHEM-WW3 app used named arguments, but the script expects arguments without names. 
Change-Id: Ibc98a4c883f4f28c361104d03e4b17fb070dc7f8 Refs: #58418 commit 7135cc11b621165edc1fd7677af053e915ceb364 Author: Henrique Alves Date: Mon Jul 8 17:03:55 2019 +0000 Small bugfix on copying out_pnt for wave output commit 65c3720495cc39ddd64c21b4c4e0080fccd877b0 Author: Henrique Alves Date: Mon Jul 8 15:07:19 2019 +0000 Changes made to exglobal_fcst to streamline linking and copying IO from/to wave model component. commit 02704a3f0312383919adda7245300a5a60b0ecd8 Author: Walter.Kolczynski Date: Fri Jun 28 16:21:46 2019 +0000 Remove CDUMP requirement to use wave coupling Previously, the needed operations for wave coupling were only completed if CDUMP="gfs", even if cplwav=".true.". The GEFS uses a CDUMP="gdas", so this requirement had to be removed for the GEFS to operate correctly. Change-Id: I50964a7fc064f439b6c879fb199d910b3f7eadfc Refs: #58418 commit a3033a05458b47548d404f82148423edda4946cd Merge: 6cfec865 6ca55564 Author: Walter.Kolczynski Date: Thu Jun 27 19:38:01 2019 +0000 Merge branch 'master' into lgan_fv3_ww3 Change-Id: I31ee2e1a55de3db34de2dbcb029224729c3c9074 commit 6cfec865ebb0bcad8f81ebea21bbe4d06531ac99 Author: Walter.Kolczynski Date: Thu Jun 27 19:15:50 2019 +0000 Update checkout script to get combined FV3-WW3-GSDCHEM app Replaces the FV3-only checkout with one that clones the combined FV3-WW3-GSDCHEM app, which can be run in any combination of those components. This unified app is required to run coupled forecasts. The app will build correctly in the normal way using build_fv3.sh (and build_all.sh). Post is also reverted to the mainline project rather than the gtg version which has more restricted access. NOTES ON APP VERSION: 1) App does not build successfully on WCOSS-Cray due to a module dependency issue that will be sorted out later. 2) As committed, the WW3 build requires a path of 79 characters or less. This will be addressed in a future NEMS commit. 
For now, you can fix this manually by updating L19 of fv3gfs.fd/NEMS/src/incmake/component_WW3.mk to a larger number before building. Change-Id: I6c6d000650dc4ee8b2bbc9997195c3dfa2690520 Refs: #58418 commit 9af1480dee2351b540e873f1daf9d26928ec472d Author: lin.gan Date: Thu May 2 17:55:16 2019 +0000 Update resource to use dynamically calculated lowres on Theia include expdir commit ced016bb8b949350f92730bb52ca74cd3b8d9ad2 Author: lin.gan Date: Thu May 2 17:50:27 2019 +0000 Update resource to use dynamically calculated lowres on Theia commit 76f877c57b8f965779725a48059f6a1dc83290fa Author: lin.gan Date: Wed May 1 19:03:13 2019 +0000 Coupled ww3 tested with FV3 GFS forecast certified commit 56c686d03a420bf7216aafb86a572a054fcecdc0 Author: lin.gan Date: Wed May 1 14:06:18 2019 +0000 Testing include ICE data commit c24de7af395edcf9cc9f341ee758d829d747b24f Author: lin.gan Date: Mon Apr 22 14:15:17 2019 +0000 First try see email for record commit e789501ca90f72d7ef121cee8023bb340c0b3416 Author: lin.gan Date: Mon Apr 22 14:12:40 2019 +0000 First try see email for record commit ca2f9c56badd8f92bb97a25234ab55fb86cced68 Author: fanglin.yang Date: Tue Apr 2 14:41:05 2019 +0000 update gsi tag from fv3da.v1.0.42 to fv3da.v1.0.43 to remove a dead link in Ozomon commit dbad0c88855ff9fd06cf90a70d0ee112a065febf Author: fanglin.yang Date: Tue Apr 2 00:57:27 2019 +0000 update POC for DA in release note commit f8f108f4a7030a3a3619fec52d6d6d8d961fab65 Author: fanglin.yang Date: Tue Apr 2 00:46:18 2019 +0000 merge NCO's back to q2fy19_nco branch. 
Update release notes commit cbbb49fb1b60f7bdde5f1bb66be79c86233b3122 Author: russ.treadon Date: Fri Mar 29 21:18:02 2019 +0000 Update DA tag to fv3da.v1.0.42 commit 23a45dea9fd33480ab250b37e4d75d10500726cb Author: fanglin.yang Date: Fri Mar 29 14:30:17 2019 +0000 Update model tag to nemsfv3gfs_beta_v1.0.18 to 1) correct a bug in precip units in the computation of the frozen precipitation flag (srflag), 2) write fields that are continuously accumulated in model integration to restart files so that after a restart their accumulated values can be read in. (FV3 Issue #61788) commit d9b5538244fee0876185fa90f0b40b1869dc2619 Author: fanglin.yang Date: Thu Mar 28 20:46:52 2019 +0000 update release note commit abff7d1d77b94b28aca48b996644df6e3fa4ccda Author: fanglin.yang Date: Thu Mar 28 20:45:36 2019 +0000 replace current ecflow/def files with NCO's copy commit 69c06ea44c639fd4716001edead49b5e2613a03d Author: fanglin.yang Date: Tue Mar 26 02:25:15 2019 +0000 Squashed commit of the following from branch q2fy19_nco_rst Add restart capability for running the GFS long forecast from the end or a failing point of the last attempt. modified: jobs/JGLOBAL_FORECAST, parm/config/config.base.nco.static, parm/config/config.fcst, and scripts/exglobal_fcst_nemsfv3gfs.sh. restart_interval_gfs is used to control the frequency of writing out restart ICs, which are saved under ROTDIR for EMC parallels and NWGES for NCO production. The exglobal_fcst_nemsfv3gfs.sh script has been modified to automatically detect whether the model should execute as a cold start, a warm start, or a rerun. If it is a rerun, the script will look for saved ICs that are restart_interval_gfs hours back from the last ending point. 
use 8x24 instead of 12x16 layout in config.fv3 for C768 -- Matt Pyle indicated this will actually speed up a 16-day forecast by about 2 minutes per his test update to model tag nemsfv3gfs_beta_v1.0.17 to address restart I/O issues #60879 commit e01d1df6a81f7998574fa1145e05955f93ecf14d Author: russ.treadon Date: Mon Mar 25 23:16:05 2019 +0000 Update DA tag to fv3da.v1.0.41 (Q3FY19 GDAS observation upgrade) commit 61b9da080d146fdc16c63aabdb4734ad5cbce8b9 Author: fanglin.yang Date: Tue Mar 5 18:59:56 2019 +0000 minor updates to resource usage in config.vrfy, config.resources and config.post on computers other than Dell to be consistent with the master repository --- .gitignore | 2 + Externals.cfg | 53 + README.md | 49 + driver/product/run_postsnd.sh.dell | 18 +- env/HERA.env | 2 +- env/JET.env | 8 +- env/WCOSS_C.env | 7 +- env/WCOSS_DELL_P3.env | 8 +- jobs/JGLOBAL_FORECAST | 17 +- jobs/JWAVE_INIT | 93 ++ jobs/JWAVE_POST_SBS | 117 ++ jobs/JWAVE_PREP | 126 +++ jobs/rocoto/analcalc.sh | 13 + jobs/rocoto/analdiag.sh | 13 + jobs/rocoto/arch.sh.orig | 351 ++++++ jobs/rocoto/arch_BACKUP_240511.sh | 351 ++++++ jobs/rocoto/arch_BASE_240511.sh | 318 ++++++ jobs/rocoto/arch_LOCAL_240511.sh | 1 + jobs/rocoto/arch_REMOTE_240511.sh | 351 ++++++ jobs/rocoto/arch_emc.sh | 37 +- jobs/rocoto/ediag.sh | 13 + jobs/rocoto/prep.sh | 26 +- jobs/rocoto/waveinit.sh | 21 + jobs/rocoto/wavepostsbs.sh | 21 + jobs/rocoto/waveprep.sh | 21 + modulefiles/module_nemsutil.hera | 10 - modulefiles/module_nemsutil.wcoss | 13 - modulefiles/module_nemsutil.wcoss_cray | 17 - .../module_nemsutil.wcoss_cray_userlib | 19 - modulefiles/module_nemsutil.wcoss_dell_p3 | 12 - modulefiles/modulefile.grib_util.wcoss | 32 - modulefiles/modulefile.grib_util.wcoss_cray | 22 - .../modulefile.grib_util.wcoss_cray_userlib | 22 - .../modulefile.grib_util.wcoss_dell_p3 | 15 - modulefiles/modulefile.prod_util.wcoss_cray | 11 - .../modulefile.prod_util.wcoss_cray_userlib | 13 - .../modulefile.prod_util.wcoss_dell_p3 | 6 - 
modulefiles/modulefile.wgrib2.wcoss | 24 - modulefiles/modulefile.wgrib2.wcoss_cray | 13 - .../modulefile.wgrib2.wcoss_cray_userlib | 15 - modulefiles/modulefile.wgrib2.wcoss_dell_p3 | 11 - parm/config/config.anal | 2 +- parm/config/config.analcalc | 13 + parm/config/config.analdiag | 13 + parm/config/config.base.emc.dyn | 16 +- parm/config/config.base.nco.static | 14 +- parm/config/config.ediag | 13 + parm/config/config.efcs | 9 +- parm/config/config.epos | 3 + parm/config/config.fcst | 25 +- parm/config/config.fv3 | 15 +- parm/config/config.resources | 129 ++- parm/config/config.wave | 131 +++ parm/config/config.waveinit | 14 + parm/config/config.wavepostsbs | 12 + parm/config/config.waveprep | 29 + scripts/exgfs_postsnd.sh.ecf | 6 +- scripts/exglobal_fcst_nemsfv3gfs.sh | 278 +++-- scripts/exwave_init.sh | 230 ++++ scripts/exwave_post_sbs.sh | 771 +++++++++++++ scripts/exwave_prep.sh | 1007 +++++++++++++++++ sorc/build_all.sh | 38 +- sorc/build_fv3.sh | 2 +- sorc/build_grib_util.sh | 88 -- sorc/build_prod_util.sh | 47 - sorc/checkout.sh | 7 +- sorc/fv3gfs_build.cfg | 2 - sorc/link_fv3gfs.sh | 28 +- sorc/partial_build.sh | 4 +- ush/gfs_bfr2gpk.sh | 2 +- ush/hpssarch_gen.sh | 40 +- ush/rocoto/setup_workflow.py | 226 +++- ush/rocoto/setup_workflow_fcstonly.py | 148 ++- ush/wave_ens_bull.sh | 261 +++++ ush/wave_ens_stats.sh | 254 +++++ ush/wave_grib2.sh | 225 ++++ ush/wave_grib2_cat.sh | 188 +++ ush/wave_grib2_sbs.sh | 222 ++++ ush/wave_grid_interp.sh | 209 ++++ ush/wave_grid_interp_sbs.sh | 217 ++++ ush/wave_grid_moddef.sh | 136 +++ ush/wave_outp_spec.sh | 260 +++++ ush/wave_prnc_cur.sh | 75 ++ ush/wave_prnc_ice.sh | 198 ++++ ush/wave_tar.sh | 231 ++++ 85 files changed, 7520 insertions(+), 610 deletions(-) create mode 100644 Externals.cfg create mode 100644 README.md create mode 100755 jobs/JWAVE_INIT create mode 100755 jobs/JWAVE_POST_SBS create mode 100755 jobs/JWAVE_PREP create mode 100755 jobs/rocoto/analcalc.sh create mode 100755 jobs/rocoto/analdiag.sh create 
mode 100755 jobs/rocoto/arch.sh.orig create mode 100755 jobs/rocoto/arch_BACKUP_240511.sh create mode 100644 jobs/rocoto/arch_BASE_240511.sh create mode 100644 jobs/rocoto/arch_LOCAL_240511.sh create mode 100644 jobs/rocoto/arch_REMOTE_240511.sh create mode 100755 jobs/rocoto/ediag.sh create mode 100755 jobs/rocoto/waveinit.sh create mode 100755 jobs/rocoto/wavepostsbs.sh create mode 100755 jobs/rocoto/waveprep.sh delete mode 100644 modulefiles/module_nemsutil.hera delete mode 100644 modulefiles/module_nemsutil.wcoss delete mode 100644 modulefiles/module_nemsutil.wcoss_cray delete mode 100644 modulefiles/module_nemsutil.wcoss_cray_userlib delete mode 100644 modulefiles/module_nemsutil.wcoss_dell_p3 delete mode 100644 modulefiles/modulefile.grib_util.wcoss delete mode 100644 modulefiles/modulefile.grib_util.wcoss_cray delete mode 100644 modulefiles/modulefile.grib_util.wcoss_cray_userlib delete mode 100644 modulefiles/modulefile.grib_util.wcoss_dell_p3 delete mode 100644 modulefiles/modulefile.prod_util.wcoss_cray delete mode 100644 modulefiles/modulefile.prod_util.wcoss_cray_userlib delete mode 100644 modulefiles/modulefile.prod_util.wcoss_dell_p3 delete mode 100644 modulefiles/modulefile.wgrib2.wcoss delete mode 100644 modulefiles/modulefile.wgrib2.wcoss_cray delete mode 100644 modulefiles/modulefile.wgrib2.wcoss_cray_userlib delete mode 100644 modulefiles/modulefile.wgrib2.wcoss_dell_p3 create mode 100755 parm/config/config.analcalc create mode 100755 parm/config/config.analdiag create mode 100755 parm/config/config.ediag create mode 100755 parm/config/config.wave create mode 100755 parm/config/config.waveinit create mode 100755 parm/config/config.wavepostsbs create mode 100755 parm/config/config.waveprep create mode 100755 scripts/exwave_init.sh create mode 100755 scripts/exwave_post_sbs.sh create mode 100755 scripts/exwave_prep.sh delete mode 100755 sorc/build_grib_util.sh delete mode 100755 sorc/build_prod_util.sh create mode 100755 ush/wave_ens_bull.sh create 
mode 100755 ush/wave_ens_stats.sh create mode 100755 ush/wave_grib2.sh create mode 100755 ush/wave_grib2_cat.sh create mode 100755 ush/wave_grib2_sbs.sh create mode 100755 ush/wave_grid_interp.sh create mode 100755 ush/wave_grid_interp_sbs.sh create mode 100755 ush/wave_grid_moddef.sh create mode 100755 ush/wave_outp_spec.sh create mode 100755 ush/wave_prnc_cur.sh create mode 100755 ush/wave_prnc_ice.sh create mode 100755 ush/wave_tar.sh diff --git a/.gitignore b/.gitignore index 2c22b50994..521f6c4a89 100644 --- a/.gitignore +++ b/.gitignore @@ -9,6 +9,8 @@ exec/ # Ignore sorc folders from externals sorc/logs/ sorc/fv3gfs.fd/ +sorc/gfs_post.fd/ sorc/gsi.fd/ +sorc/ufs_utils.fd/ sorc/gfs_wafs.fd/ sorc/gldas.fd/ diff --git a/Externals.cfg b/Externals.cfg new file mode 100644 index 0000000000..f587521354 --- /dev/null +++ b/Externals.cfg @@ -0,0 +1,53 @@ +# External sub-modules of global-workflow + +[FV3GFS] +tag = GFS.v16.0.1 +local_path = sorc/fv3gfs.fd +repo_url = https://github.com/ufs-community/ufs-weather-model.git +protocol = git +required = True + +[GSI] +tag = gfsda.v16.0.0 +local_path = sorc/gsi.fd +repo_url = ssh://vlab.ncep.noaa.gov:29418/ProdGSI +protocol = git +required = True + +[GLDAS] +tag = gldas_gfsv16_release.v1.0.0 +local_path = sorc/gldas.fd +repo_url = https://github.com/NOAA-EMC/GLDAS.git +protocol = git +required = True + +[EMC_post] +tag = upp_gfsv16_release.v1.0.5 +local_path = sorc/gfs_post.fd +repo_url = https://github.com/NOAA-EMC/EMC_post.git +protocol = git +required = True + +[UFS_UTILS] +branch = release/ops-gfsv16 +local_path = sorc/ufs_utils.fd +repo_url = https://github.com/NOAA-EMC/UFS_UTILS.git +protocol = git +required = True + +[EMC_verif-global] +tag = verif_global_v1.6.0 +local_path = sorc/verif-global.fd +repo_url = ssh://vlab.ncep.noaa.gov:29418/EMC_verif-global +protocol = git +required = True + +[EMC_gfs_wafs] +tag = gfs_wafs.v5.0.11 +local_path = sorc/gfs_wafs.fd +repo_url = https://github.com/NOAA-EMC/EMC_gfs_wafs.git 
+protocol = git +required = False + +[externals_description] +schema_version = 1.0.0 diff --git a/README.md b/README.md new file mode 100644 index 0000000000..3fac2751aa --- /dev/null +++ b/README.md @@ -0,0 +1,49 @@ +# global-workflow +Global Superstructure/Workflow currently supporting the Finite-Volume on a Cubed-Sphere Global Forecast System (FV3GFS) + +The global-workflow depends on the following prerequisites to be available on the system: + +* workload management platform / scheduler - LSF or SLURM +* workflow manager - ROCOTO (https://github.com/christopherwharrop/rocoto) +* modules - NCEPLIBS (various), esmf v8.0.0bs48, hdf5, intel/ips v18, impi v18, wgrib2, netcdf v4.7.0, hpss, gempak (see module files under /modulefiles for additional details) +* manage_externals - A utility from ESMCI to check out external dependencies. Manage_externals can be obtained at the following address and should be in the user's PATH: https://github.com/ESMCI/manage_externals + +The global-workflow currently supports the following machines: + +* WCOSS-Dell +* WCOSS-Cray +* Hera + +## Build global-workflow: + +### 1. Check out components + +The global-workflow uses the manage_externals utility to handle checking out its components. The manic-v1.1.8 manage_externals tag is supported. The manage_externals utility will be replacing the current checkout.sh script. + +Run manage_externals (checkout_externals) while at the top of the clone: + +``` +$ checkout_externals -e Externals.cfg +``` + +If checkout_externals is not in your $PATH then use the full path to it: + +* WCOSS-Dell: /gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/manage_externals/manic-v1.1.8/checkout_externals +* WCOSS-Cray: /gpfs/hps3/emc/global/noscrub/emc.glopara/git/manage_externals/manic-v1.1.8/checkout_externals +* Hera: /scratch1/NCEPDEV/global/glopara/git/manage_externals/manic-v1.1.8/checkout_externals + +### 2. Build components + +While in /sorc folder: +``` +$ sh build_all.sh +``` + +### 3. 
Link components + +While in /sorc folder: +``` +$ sh link_fv3gfs.sh emc $MACHINE +``` + +...where $MACHINE is "dell", "cray", or "hera". diff --git a/driver/product/run_postsnd.sh.dell b/driver/product/run_postsnd.sh.dell index 57ef3cb10c..8aebd95111 100755 --- a/driver/product/run_postsnd.sh.dell +++ b/driver/product/run_postsnd.sh.dell @@ -6,9 +6,9 @@ #BSUB -W 01:30 #BSUB -q dev #BSUB -P GFS-DEV -#BSUB -cwd /gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfsv16bnetcdf3/driver/product +#BSUB -cwd /gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfsv16bnetcdf_mpmd/driver/product #BSUB -R span[ptile=4] -#BSUB -n 32 +#BSUB -n 40 #BSUB -R affinity[core(1):distribute=balance] ############################################ @@ -53,8 +53,8 @@ export MP_STDOUTMODE=ordered machine="WCOSS_DELL_P3" #machine="THEIA" #machine="JET" -export npe_postsnd=32 -export npe_postsndcfp=10 +export npe_postsnd=40 +export npe_postsndcfp=9 ##export OUTPUT_FILE="nemsio" export OUTPUT_FILE=${OUTPUT_FILE:-netcdf} if [ $machine == "WCOSS_C" ]; then @@ -72,7 +72,7 @@ elif [ $machine == "WCOSS_DELL_P3" ]; then ##For WCOSS-Dell ################ if [ $OUTPUT_FILE == "netcdf" ]; then export FHMAX_HF_GFS=120 - export FHOUT_HF_GFS=1 + export FHOUT_HF_GFS=3 export FHOUT_GFS=3 else export FHMAX_HF_GFS=120 @@ -115,7 +115,7 @@ export DATA_IN=${DATA_IN:-/gpfs/dell2/ptmp/$USER} export DATA=$DATA_IN/postsnd.${pid} mkdir -p $DATA cd $DATA -export PDY=20200215 +export PDY=20200315 export cyc=00 export STARTHOUR=00 export ENDHOUR=180 @@ -151,7 +151,7 @@ mkdir -p $pcom ################################### export HOMEgfs=/gpfs/dell2/emc/modeling/noscrub/emc.glopara/git/global-workflow/feature_gfsv16b -export HOMEbufrsnd=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfsv16bnetcdf3 +export HOMEbufrsnd=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfsv16bnetcdf_mpmd ##export HOMEbufrsnd=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/fv3gfs ##export 
HOMEbufrsnd=/gpfs/hps3/emc/meso/noscrub/Guang.Ping.Lou/fv3gfs @@ -159,8 +159,8 @@ export HOMEbufrsnd=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfsv16bne # Define COM Directories ############################## if [ $OUTPUT_FILE == "netcdf" ]; then -##export COMIN=/gpfs/dell3/ptmp/emc.glopara/ROTDIRS/v16rt2/${RUN}.${PDY}/$cyc -export COMIN=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfs_v16_data/2020021500 +export COMIN=/gpfs/dell3/ptmp/emc.glopara/ROTDIRS/v16rt2/${RUN}.${PDY}/$cyc +##export COMIN=/gpfs/dell2/emc/verification/noscrub/Guang.Ping.Lou/gfs_v16_data/2020021500 else export COMIN=/gpfs/dell1/nco/ops/com/gfs/prod/${RUN}.${PDY}/$cyc fi diff --git a/env/HERA.env b/env/HERA.env index eac63ada5d..a0457a1fc4 100755 --- a/env/HERA.env +++ b/env/HERA.env @@ -4,7 +4,7 @@ if [ $# -ne 1 ]; then echo "Must specify an input argument to set runtime environment variables!" echo "argument can be any one of the following:" - echo "anal fcst post vrfy" + echo "anal fcst post vrfy metp" echo "eobs eupd ecen efcs epos" echo "postsnd awips gempak" exit 1 diff --git a/env/JET.env b/env/JET.env index ab1966b628..c9ab6c3c7f 100755 --- a/env/JET.env +++ b/env/JET.env @@ -93,8 +93,12 @@ elif [ $step = "fcst" ]; then export NTHREADS_FV3=${nth_fv3:-$nth_max} [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher ${npe_fv3:-${npe_fcst:-$PBS_NP}}" - + #export APRUN_FV3="$launcher ${npe_fv3:-${npe_fcst:-$PBS_NP}}" + if [ $CDUMP = "gdas" ]; then + export APRUN_FV3="$launcher ${npe_fcst:-$PBS_NP}" + else + export APRUN_FV3="$launcher ${npe_fcst_gfs:-$PBS_NP}" + fi export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max export APRUN_REGRID_NEMSIO="$launcher $LEVS" diff --git a/env/WCOSS_C.env b/env/WCOSS_C.env index df9463c821..c63ba25686 100755 --- a/env/WCOSS_C.env +++ b/env/WCOSS_C.env @@ -110,7 +110,12 @@ elif [ 
$step = "fcst" ]; then export NTHREADS_FV3=${nth_fv3:-$nth_max} [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher -j 1 -n ${npe_fv3:-$npe_fcst} -N $npe_node_fcst -d $NTHREADS_FV3 -cc depth" + #export APRUN_FV3="$launcher -j 1 -n ${npe_fv3:-$npe_fcst} -N $npe_node_fcst -d $NTHREADS_FV3 -cc depth" + if [ $CDUMP = "gdas" ]; then + export APRUN_FV3="$launcher -j 1 -n ${npe_fcst} -N $npe_node_fcst -d $NTHREADS_FV3 -cc depth" + else + export APRUN_FV3="$launcher -j 1 -n ${npe_fcst_gfs} -N $npe_node_fcst -d $NTHREADS_FV3 -cc depth" + fi export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max diff --git a/env/WCOSS_DELL_P3.env b/env/WCOSS_DELL_P3.env index 2fb3c4d8b6..aa59ea231a 100755 --- a/env/WCOSS_DELL_P3.env +++ b/env/WCOSS_DELL_P3.env @@ -110,8 +110,12 @@ elif [ $step = "fcst" ]; then export NTHREADS_FV3=${nth_fv3:-$nth_max} [[ $NTHREADS_FV3 -gt $nth_max ]] && export NTHREADS_FV3=$nth_max export cores_per_node=$npe_node_max - export APRUN_FV3="$launcher ${npe_fv3:-${npe_fcst:-$PBS_NP}}" - + if [ $CDUMP = "gdas" ]; then + #export APRUN_FV3="$launcher ${npe_fv3:-${npe_fcst:-$PBS_NP}}" + export APRUN_FV3="$launcher ${npe_fcst:-$PBS_NP}" + else + export APRUN_FV3="$launcher ${npe_fcst_gfs:-$PBS_NP}" + fi export NTHREADS_REGRID_NEMSIO=${nth_regrid_nemsio:-1} [[ $NTHREADS_REGRID_NEMSIO -gt $nth_max ]] && export NTHREADS_REGRID_NEMSIO=$nth_max export APRUN_REGRID_NEMSIO="$launcher $LEVS" diff --git a/jobs/JGLOBAL_FORECAST b/jobs/JGLOBAL_FORECAST index ef9b75f80d..be694dcf34 100755 --- a/jobs/JGLOBAL_FORECAST +++ b/jobs/JGLOBAL_FORECAST @@ -17,7 +17,15 @@ for config in $configs; do status=$? [[ $status -ne 0 ]] && exit $status done - +# Source additional configs +if [ ${DO_WAVE:-"NO"} = "YES" ]; then + configs="wave" + for config in $configs; do + . $config_path/config.$config + status=$? 
+ [[ $status -ne 0 ]] && exit $status + done +fi ########################################## # Source machine runtime environment @@ -62,6 +70,13 @@ if [ $RUN_ENVIR = "nco" ]; then export RSTDIR=${GESROOT:?}/$envir fi +# Source additional configs +if [ ${DO_WAVE:-"NO"} = "YES" ]; then +# WAVE component directory + export WAV_MOD_ID=${WAV_MOD_ID:-wave} + export COMINWW3=${COMINWW3:-${ROTDIR:?}} + export COMOUTWW3=${COMOUTWW3:-${ROTDIR:?}} +fi ############################################## # Begin JOB SPECIFIC work diff --git a/jobs/JWAVE_INIT b/jobs/JWAVE_INIT new file mode 100755 index 0000000000..781c69c33e --- /dev/null +++ b/jobs/JWAVE_INIT @@ -0,0 +1,93 @@ +#!/bin/bash + +date +export PS4=' $SECONDS + ' +set -x -e + +############################# +# Source relevant config files +############################# +configs="base wave waveinit" +export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} +config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} +for config in $configs; do + . $config_path/config.$config + status=$? + [[ $status -ne 0 ]] && exit $status +done + +########################################## +# Source machine runtime environment +########################################## +. $HOMEgfs/env/${machine}.env waveinit +status=$? 
+[[ $status -ne 0 ]] && exit $status + +# PATH for working directory +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENTwave=${COMPONENTwave:-${RUN}wave} + +export HOMEgfs=${HOMEgfs:-$NWROOT/${NET}.${gfs_ver}} + +# Add default errchk = err_chk +export errchk=${errchk:-err_chk} + +# Set HOMEwave to HOMEgfs +HOMEwave=${HOMEwave:-${HOMEgfs}} + +# Create and go to DATA directory +export DATA=${DATA:-${DATAROOT:?}/${jobid}} +mkdir -p $DATA +cd $DATA + +cyc=${cyc:-00} +export cycle=${cycle:-t${cyc}z} + +# Set PDY +setpdy.sh +sh ./PDY + +export pgmout=OUTPUT.$$ + +export MP_PULSE=0 + +# Set resources to propagate NTASKS across cfp call +NTASKS=${NTASKS:-${npe_node_waveinit}} +export NTASKS=${NTASKS:?NTASKS required to be set} + +# Path to HOME Directory +export CODEwave=${CODEwave:-${HOMEfv3gfs}/WW3} +export EXECwave=${EXECwave:-$HOMEwave/exec} +export FIXwave=${FIXwave:-$HOMEwave/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-$HOMEwave/parm/wave} +export USHwave=${USHwave:-$HOMEwave/ush} +export EXECcode=${EXECcode:-$CODEwave/exec} + +# Set COM Paths and GETGES environment +export COMINwave=${COMINwave:-${ROTDIR:?}} +export COMOUTwave=${COMOUTwave:-${ROTDIR:?}} +export COMIN=${COMIN:-${COMINwave}/${COMPONENTwave}.${PDY}/${cyc}} +export COMOUT=${COMOUT:-${COMOUTwave}/${COMPONENTwave}.${PDY}/${cyc}} +[[ ! 
-d $COMOUT ]] && mkdir -m 775 -p $COMOUT + +if [ $SENDCOM = YES ]; then + mkdir -p $COMOUT/rundata +fi + +export wavelog=${COMOUTwave}/wave.log + +# Set mpi serial command +export wavempexec=${wavempexec:-"mpirun -n"} +export wave_mpmd=${wave_mpmd:-"cfp"} + +# Execute the Script +$HOMEwave/scripts/exwave_init.sh + +# Remove temp directories +if [ "$KEEPDATA" != "YES" ]; then + cd $DATAROOT + rm -rf $DATA +fi +date + diff --git a/jobs/JWAVE_POST_SBS b/jobs/JWAVE_POST_SBS new file mode 100755 index 0000000000..6e60567d3f --- /dev/null +++ b/jobs/JWAVE_POST_SBS @@ -0,0 +1,117 @@ +#!/bin/bash + +date +export PS4=' $SECONDS + ' +set -x -e + +############################# +# Source relevant config files +############################# +configs="base wave wavepostsbs" +export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} +config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} +for config in $configs; do + . $config_path/config.$config + status=$? + [[ $status -ne 0 ]] && exit $status +done + +########################################## +# Source machine runtime environment +########################################## +. $HOMEgfs/env/${machine}.env wavepostsbs +status=$? 
+[[ $status -ne 0 ]] && exit $status + +# PATH for working directory +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENTwave=${COMPONENTwave:-${RUN}wave} + +export HOMEgefs=${HOMEgefs:-$NWROOT/$NET.${gefs_ver}} +export HOMEgfs=${HOMEgfs:-$NWROOT/$NET.${gfs_ver}} + +# Add default errchk = err_chk +export errchk=${errchk:-err_chk} + +# Set HOMEwave to HOMEgefs +HOMEwave=${HOMEwave:-${HOMEgfs}} + +# Set resources to propagate NTASKS across cfp call +NTASKS=${NTASKS:-${npe_node_waveprep}} +export NTASKS=${NTASKS:?NTASKS required to be set} + +# Create and go to DATA directory +export DATA=${DATA:-${DATAROOT:?}/${jobid}} +mkdir -p $DATA +cd $DATA + +export cyc=${cyc:-00} +export cycle=${cycle:-t${cyc}z} + +# Set PDY +setpdy.sh +sh ./PDY + +export pgmout=OUTPUT.$$ + +export MP_PULSE=0 + +# Path to HOME Directory +export CODEwave=${CODEwave:-${HOMEfv3gfs}/WW3} +export EXECwave=${EXECwave:-$HOMEwave/exec} +export FIXwave=${FIXwave:-$HOMEwave/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-$HOMEwave/parm/wave} +export USHwave=${USHwave:-$HOMEwave/ush} +export EXECcode=${EXECcode:-$CODEwave/exec} + +# Set COM Paths and GETGES environment +export COMINwave=${COMINwave:-${ROTDIR:?}} +export COMOUTwave=${COMOUTwave:-${ROTDIR:?}} +export COMIN=${COMIN:-${COMINwave}/${COMPONENTwave}.${PDY}/${cyc}} +export COMOUT=${COMOUT:-${COMOUTwave}/${COMPONENTwave}.${PDY}/${cyc}} + +export COMINice=${COMINice:-${COMROOTp2}/omb/prod} +export COMINwnd=${COMINwnd:-${COMROOT}/gfs/prod} +export COMIN_WAV_CUR=${COMIN_WAV_CUR:-${COMROOTp2}/rtofs/prod} + +mkdir -p $COMOUT/gridded +mkdir -p $COMOUT/station +mkdir -p $COMOUT/stats + +export wavelog=${COMOUTwave}/wave.log + +# Set mpi serial command +export wavempexec=${wavempexec:-"mpirun -n"} +export wave_mpmd=${wave_mpmd:-"cfp"} + +env | sort + +# Set wave model ID tag to include member number +# if ensemble; waveMEMB var empty in deterministic +# Set wave model ID tag to include member number +# if ensemble; waveMEMB var empty in 
deterministic +membTAG='p' +if [ "${waveMEMB}" == "00" ]; then membTAG='c'; fi +export membTAG +export WAV_MOD_TAG=${COMPONENTwave}${waveMEMB} + +export CFP_VERBOSE=1 + +# Execute the Script +$HOMEwave/scripts/exwave_post_sbs.sh +err=$? +if [ $err -ne 0 ]; then + msg="FATAL ERROR: ex-script of GWES_POST failed!" +else + msg="$job completed normally!" +fi +postmsg "$jlogfile" "$msg" + +# Remove temp directories +if [ "$KEEPDATA" != "YES" ]; then + cd $DATAROOT + rm -rf $DATA +fi +date + diff --git a/jobs/JWAVE_PREP b/jobs/JWAVE_PREP new file mode 100755 index 0000000000..dcb7b0df06 --- /dev/null +++ b/jobs/JWAVE_PREP @@ -0,0 +1,126 @@ +#!/bin/bash + +date +export PS4=' $SECONDS + ' +set -x -e + +############################# +# Source relevant config files +############################# +configs="base wave waveprep" +export EXPDIR=${EXPDIR:-$HOMEgfs/parm/config} +config_path=${EXPDIR:-$NWROOT/gfs.${gfs_ver}/parm/config} +for config in $configs; do + . $config_path/config.$config + status=$? + [[ $status -ne 0 ]] && exit $status +done + +########################################## +# Source machine runtime environment +########################################## +. $HOMEgfs/env/${machine}.env waveprep +status=$? 
+[[ $status -ne 0 ]] && exit $status + +# PATH for working directory +export NET=${NET:-gfs} +export RUN=${RUN:-gfs} +export COMPONENTwave=${COMPONENTwave:-${RUN}wave} + +export HOMEgfs=${HOMEgfs:-$NWROOT/gfs.${gfs_ver}} + +# Add default errchk = err_chk +export errchk=${errchk:-err_chk} + +# Set HOMEwave to HOMEgfs +HOMEwave=${HOMEwave:-${HOMEgfs}} + +# Set resources to propagate NTASKS across cfp call +NTASKS=${NTASKS:-${npe_node_waveprep}} +export NTASKS=${NTASKS:?NTASKS required to be set} + +# Create and go to DATA directory +export DATA=${DATA:-${DATAROOT:?}/${jobid}} +mkdir -p $DATA +cd $DATA + +cyc=${cyc:-00} +export cycle=${cycle:-t${cyc}z} + +# Set PDY +setpdy.sh +sh ./PDY +# Set rtofs PDY +export RPDY=$PDY + +export pgmout=OUTPUT.$$ + +export MP_PULSE=0 + +# CDO required for processing RTOFS currents +# export CDO=${COMROOTp2}/nwprod/rtofs_glo.v1.2.0/bin/cdo +export CDO=/gpfs/dell2/emc/verification/noscrub/Todd.Spindler/CDO/bin/cdo + +# Path to HOME Directory +export CODEwave=${CODEwave:-${HOMEfv3gfs}/WW3} +export EXECwave=${EXECwave:-$HOMEwave/exec} +export FIXwave=${FIXwave:-$HOMEwave/fix/fix_wave_${NET}} +export PARMwave=${PARMwave:-$HOMEwave/parm/wave} +export USHwave=${USHwave:-$HOMEwave/ush} +export EXECcode=${EXECcode:-$CODEwave/exec} + +################################### +# Set COM Paths and GETGES environment +################################### +# Set COM Paths and GETGES environment +export COMINwave=${COMINwave:-${ROTDIR:?}} +export COMOUTwave=${COMOUTwave:-${ROTDIR:?}} +export COMIN=${COMIN:-${COMINwave}/${COMPONENTwave}.${PDY}/${cyc}} +export COMOUT=${COMOUT:-${COMOUTwave}/${COMPONENTwave}.${PDY}/${cyc}} +[[ ! 
-d $COMOUT ]] && mkdir -m 775 -p $COMOUT + +if [ $RUN_ENVIR = "nco" ]; then + export COMIN_WAV_ICE=${COMIN_WAV_ICE:-$(compath.py gfs/prod)}/${CDUMP}.${PDY}/${cyc} + export COMIN_WAV_WND=${COMIN_WAV_WND:-$(compath.py gfs/prod)}/${CDUMP}.${PDY}/${cyc} + export COMIN_WAV_CUR=${COMIN_WAV_CUR:-$(compath.py ${WAVECUR_DID}/prod)}/${WAVECUR_DID}.${RPDY} + if [ ! -d $COMIN_WAV_CUR ]; then + export RPDY=`$NDATE -24 ${PDY}00 | cut -c1-8` + export COMIN_WAV_CUR=$(compath.py ${WAVECUR_DID}/prod)/${WAVECUR_DID}.${RPDY} + fi +else + if [ ! -d $DMPDIR/${WAVECUR_DID}.${RPDY} ]; then export RPDY=`$NDATE -24 ${PDY}00 | cut -c1-8`; fi + if [ ! -L $ROTDIR/${WAVECUR_DID}.${RPDY} ]; then # Check if symlink already exists in ROTDIR + $NLN $DMPDIR/${WAVECUR_DID}.${RPDY} $ROTDIR/${WAVECUR_DID}.${RPDY} + fi + $NLN $DMPDIR/$CDUMP.${PDY}/$cyc/${WAVICEFILE} $ROTDIR/$CDUMP.${PDY}/$cyc/${WAVICEFILE} + export COMIN_OBS=${COMIN_OBS:-$ROTDIR/$RUN.$PDY/$cyc} + export COMIN_WAV_ICE=${COMIN_OBS} + export COMIN_WAV_WND=${COMIN_OBS} + export COMIN_WAV_CUR=${ROTDIR}/${WAVECUR_DID}.${RPDY} +fi + +if [ $SENDCOM = YES ]; then + mkdir -p $COMOUT +fi + +export wavelog=${COMOUTwave}/wave.log + +# Set mpi serial command +export wavempexec=${wavempexec:-"mpirun -n"} +export wave_mpmd=${wave_mpmd:-"cfp"} + +# Set wave model ID tag to include member number +# if ensemble; waveMEMB var empty in deterministic +export WAV_MOD_TAG=${COMPONENTwave}${waveMEMB} + +# Execute the Script +$HOMEwave/scripts/exwave_prep.sh + +# Remove temp directories +if [ "$KEEPDATA" != "YES" ]; then + cd $DATAROOT + rm -rf $DATA +fi +date + diff --git a/jobs/rocoto/analcalc.sh b/jobs/rocoto/analcalc.sh new file mode 100755 index 0000000000..fd7d8f8916 --- /dev/null +++ b/jobs/rocoto/analcalc.sh @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? 
+[[ $status -ne 0 ]] && exit $status + +############################################################### +# Execute the JJOB +$HOMEgfs/jobs/JGLOBAL_ANALCALC +status=$? +exit $status diff --git a/jobs/rocoto/analdiag.sh b/jobs/rocoto/analdiag.sh new file mode 100755 index 0000000000..9d5101c67e --- /dev/null +++ b/jobs/rocoto/analdiag.sh @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Execute the JJOB +$HOMEgfs/jobs/JGLOBAL_ANALDIAG +status=$? +exit $status diff --git a/jobs/rocoto/arch.sh.orig b/jobs/rocoto/arch.sh.orig new file mode 100755 index 0000000000..6689d30755 --- /dev/null +++ b/jobs/rocoto/arch.sh.orig @@ -0,0 +1,351 @@ +#!/bin/ksh -x + +############################################################### +## Abstract: +## Archive driver script +## RUN_ENVIR : runtime environment (emc | nco) +## HOMEgfs : /full/path/to/workflow +## EXPDIR : /full/path/to/config/files +## CDATE : current analysis date (YYYYMMDDHH) +## CDUMP : cycle name (gdas / gfs) +## PDY : current date (YYYYMMDD) +## cyc : current cycle (HH) +############################################################### + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Source relevant configs +configs="base arch" +for config in $configs; do + . $EXPDIR/config.${config} + status=$? + [[ $status -ne 0 ]] && exit $status +done + +# ICS are restarts and always lag INC by $assim_freq hours +ARCHINC_CYC=$ARCH_CYC +ARCHICS_CYC=$((ARCH_CYC-assim_freq)) +if [ $ARCHICS_CYC -lt 0 ]; then + ARCHICS_CYC=$((ARCHICS_CYC+24)) +fi + +# CURRENT CYCLE +APREFIX="${CDUMP}.t${cyc}z." 
+ASUFFIX=${ASUFFIX:-$SUFFIX} + +if [ $ASUFFIX = ".nc" ]; then + format="netcdf" +else + format="nemsio" +fi + +# Realtime parallels run GFS MOS on 1 day delay +# If realtime parallel, back up CDATE_MOS one day +CDATE_MOS=$CDATE +if [ $REALTIME = "YES" ]; then + CDATE_MOS=$($NDATE -24 $CDATE) +fi +PDY_MOS=$(echo $CDATE_MOS | cut -c1-8) + +############################################################### +# Archive online for verification and diagnostics +############################################################### + +COMIN="$ROTDIR/$CDUMP.$PDY/$cyc" +cd $COMIN + +[[ ! -d $ARCDIR ]] && mkdir -p $ARCDIR +$NCP ${APREFIX}gsistat $ARCDIR/gsistat.${CDUMP}.${CDATE} +$NCP ${APREFIX}pgrb2.1p00.anl $ARCDIR/pgbanl.${CDUMP}.${CDATE}.grib2 + +# Archive 1 degree forecast GRIB2 files for verification +if [ $CDUMP = "gfs" ]; then + fhmax=$FHMAX_GFS + fhr=0 + while [ $fhr -le $fhmax ]; do + fhr2=$(printf %02i $fhr) + fhr3=$(printf %03i $fhr) + $NCP ${APREFIX}pgrb2.1p00.f$fhr3 $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + (( fhr = $fhr + $FHOUT_GFS )) + done +fi +if [ $CDUMP = "gdas" ]; then + flist="000 003 006 009" + for fhr in $flist; do + fname=${APREFIX}pgrb2.1p00.f${fhr} + fhr2=$(printf %02i $fhr) + $NCP $fname $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + done +fi + +if [ -s avno.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat avno.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat avnop.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gdas" -a -s gdas.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat gdas.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat gdasp.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gfs" ]; then + $NCP 
storms.gfso.atcf_gen.$CDATE ${ARCDIR}/. + $NCP storms.gfso.atcf_gen.altg.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.altg.$CDATE ${ARCDIR}/. + + cp -rp tracker ${ARCDIR}/tracker.$CDATE +fi + +# Archive atmospheric gaussian gfs forecast files for fit2obs +VFYARC=$ROTDIR/vrfyarch +[[ ! -d $VFYARC ]] && mkdir -p $VFYARC +if [ $CDUMP = "gfs" -a $FITSARC = "YES" ]; then + mkdir -p $VFYARC/${CDUMP}.$PDY/$cyc + fhmax=${FHMAX_FITS:-$FHMAX_GFS} + fhr=0 + while [[ $fhr -le $fhmax ]]; do + fhr3=$(printf %03i $fhr) + sfcfile=${CDUMP}.t${cyc}z.sfcf${fhr3}${ASUFFIX} + sigfile=${CDUMP}.t${cyc}z.atmf${fhr3}${ASUFFIX} + $NCP $sfcfile $VFYARC/${CDUMP}.$PDY/$cyc/ + $NCP $sigfile $VFYARC/${CDUMP}.$PDY/$cyc/ + (( fhr = $fhr + 6 )) + done +fi + + +############################################################### +# Archive data to HPSS +if [ $HPSSARCH = "YES" ]; then +############################################################### + +#--determine when to save ICs for warm start and forecast-only runs +SAVEWARMICA="NO" +SAVEWARMICB="NO" +SAVEFCSTIC="NO" +firstday=$($NDATE +24 $SDATE) +mm=`echo $CDATE|cut -c 5-6` +dd=`echo $CDATE|cut -c 7-8` +nday=$(( (mm-1)*30+dd )) +mod=$(($nday % $ARCH_WARMICFREQ)) +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi + +if [ $ARCHICS_CYC -eq 18 ]; then + nday1=$((nday+1)) + mod1=$(($nday1 % $ARCH_WARMICFREQ)) + if [ $mod1 -eq 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi + if [ $mod1 -ne 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="NO" ; fi + if [ $CDATE -eq $SDATE -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi +fi + +mod=$(($nday % $ARCH_FCSTICFREQ)) +if [ $mod -eq 0 -o $CDATE -eq $firstday ]; then SAVEFCSTIC="YES" ; fi + + 
+ARCH_LIST="$COMIN/archlist" +[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST +mkdir -p $ARCH_LIST +cd $ARCH_LIST + +$HOMEgfs/ush/hpssarch_gen.sh $CDUMP +status=$? +if [ $status -ne 0 ]; then + echo "$HOMEgfs/ush/hpssarch_gen.sh $CDUMP failed, ABORT!" + exit $status +fi + +cd $ROTDIR + +if [ $CDUMP = "gfs" ]; then + + #for targrp in gfsa gfsb - NOTE - do not check htar error status + for targrp in gfsa gfsb; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + done + + #for targrp in gfs_flux gfs_netcdf/nemsio gfs_pgrb2b; do + if [ ${SAVEFCSTNEMSIO:-"YES"} = "YES" ]; then + for targrp in gfs_flux gfs_${format}a gfs_${format}b gfs_pgrb2b; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for targrp in gfswave + if [ $DO_WAVE = "YES" ]; then + for targrp in gfswave; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for restarts + if [ $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gfs_restarta.tar `cat $ARCH_LIST/gfs_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfs_restarta.tar failed" + exit $status + fi + fi + + #--save mdl gfsmos output from all cycles in the 18Z archive directory + if [ -d gfsmos.$PDY_MOS -a $cyc -eq 18 ]; then + htar -P -cvf $ATARDIR/$CDATE_MOS/gfsmos.tar ./gfsmos.$PDY_MOS + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfsmos.tar failed" + exit $status + fi + fi + +fi + + +if [ $CDUMP = "gdas" ]; then + + htar -P -cvf $ATARDIR/$CDATE/gdas.tar `cat $ARCH_LIST/gdas.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas.tar failed" + exit $status + fi + + #gdaswave + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave.tar `cat $ARCH_LIST/gdaswave.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave.tar failed" + exit $status + fi + fi + + if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restarta.tar `cat $ARCH_LIST/gdas_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restarta.tar failed" + exit $status + fi + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave_restart.tar `cat $ARCH_LIST/gdaswave_restart.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave_restart.tar failed" + exit $status + fi + fi + fi + + if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restartb.tar `cat $ARCH_LIST/gdas_restartb.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restartb.tar failed" + exit $status + fi + fi + +fi + +############################################################### +fi ##end of HPSS archive +############################################################### + + + +############################################################### +# Clean up previous cycles; various depths +# PRIOR CYCLE: Leave the prior cycle alone +GDATE=$($NDATE -$assim_freq $CDATE) + +# PREVIOUS to the PRIOR CYCLE +GDATE=$($NDATE -$assim_freq $GDATE) +gPDY=$(echo $GDATE | cut -c1-8) +gcyc=$(echo $GDATE | cut -c9-10) + +# Remove the TMPDIR directory +COMIN="$RUNDIR/$GDATE" +[[ -d $COMIN ]] && rm -rf $COMIN + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +# Step back every assim_freq hours +# and remove old rotating directories for successful cycles +# defaults from 24h to 120h +DO_GLDAS=${DO_GLDAS:-"NO"} +GDATEEND=$($NDATE -${RMOLDEND:-24} $CDATE) +GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) +GLDAS_DATE=$($NDATE -96 $CDATE) +while [ $GDATE -le $GDATEEND ]; do + gPDY=$(echo $GDATE | cut -c1-8) + gcyc=$(echo $GDATE | cut -c9-10) + COMIN="$ROTDIR/$CDUMP.$gPDY/$gcyc" + if [ -d $COMIN ]; then + rocotolog="$EXPDIR/logs/${GDATE}.log" + if [ -f $rocotolog ]; then + testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success") + rc=$? + if [ $rc -eq 0 ]; then + if [ $CDUMP != "gdas" -o $DO_GLDAS = "NO" -o $GDATE -lt $GLDAS_DATE ]; then + rm -rf $COMIN + else + for file in `ls $COMIN |grep -v sflux |grep -v RESTART`; do + rm -rf $COMIN/$file + done + for file in `ls $COMIN/RESTART |grep -v sfcanl `; do + rm -rf $COMIN/$file + done + fi + fi + fi + fi + + # Remove any empty directories + COMIN="$ROTDIR/$CDUMP.$gPDY" + if [ -d $COMIN ]; then + [[ ! 
"$(ls -A $COMIN)" ]] && rm -rf $COMIN + fi + + # Remove mdl gfsmos directory + if [ $CDUMP = "gfs" ]; then + COMIN="$ROTDIR/gfsmos.$gPDY" + if [ -d $COMIN -a $GDATE -lt $CDATE_MOS ]; then rm -rf $COMIN ; fi + fi + + GDATE=$($NDATE +$assim_freq $GDATE) +done + +# Remove archived stuff in $VFYARC that are (48+$FHMAX_GFS) hrs behind +# 1. atmospheric gaussian files used for fit2obs +if [ $CDUMP = "gfs" ]; then + GDATE=$($NDATE -$FHMAX_GFS $GDATE) + gPDY=$(echo $GDATE | cut -c1-8) + COMIN="$VFYARC/$CDUMP.$gPDY" + [[ -d $COMIN ]] && rm -rf $COMIN +fi + +############################################################### +exit 0 diff --git a/jobs/rocoto/arch_BACKUP_240511.sh b/jobs/rocoto/arch_BACKUP_240511.sh new file mode 100755 index 0000000000..6689d30755 --- /dev/null +++ b/jobs/rocoto/arch_BACKUP_240511.sh @@ -0,0 +1,351 @@ +#!/bin/ksh -x + +############################################################### +## Abstract: +## Archive driver script +## RUN_ENVIR : runtime environment (emc | nco) +## HOMEgfs : /full/path/to/workflow +## EXPDIR : /full/path/to/config/files +## CDATE : current analysis date (YYYYMMDDHH) +## CDUMP : cycle name (gdas / gfs) +## PDY : current date (YYYYMMDD) +## cyc : current cycle (HH) +############################################################### + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Source relevant configs +configs="base arch" +for config in $configs; do + . $EXPDIR/config.${config} + status=$? + [[ $status -ne 0 ]] && exit $status +done + +# ICS are restarts and always lag INC by $assim_freq hours +ARCHINC_CYC=$ARCH_CYC +ARCHICS_CYC=$((ARCH_CYC-assim_freq)) +if [ $ARCHICS_CYC -lt 0 ]; then + ARCHICS_CYC=$((ARCHICS_CYC+24)) +fi + +# CURRENT CYCLE +APREFIX="${CDUMP}.t${cyc}z." 
+ASUFFIX=${ASUFFIX:-$SUFFIX} + +if [ $ASUFFIX = ".nc" ]; then + format="netcdf" +else + format="nemsio" +fi + +# Realtime parallels run GFS MOS on 1 day delay +# If realtime parallel, back up CDATE_MOS one day +CDATE_MOS=$CDATE +if [ $REALTIME = "YES" ]; then + CDATE_MOS=$($NDATE -24 $CDATE) +fi +PDY_MOS=$(echo $CDATE_MOS | cut -c1-8) + +############################################################### +# Archive online for verification and diagnostics +############################################################### + +COMIN="$ROTDIR/$CDUMP.$PDY/$cyc" +cd $COMIN + +[[ ! -d $ARCDIR ]] && mkdir -p $ARCDIR +$NCP ${APREFIX}gsistat $ARCDIR/gsistat.${CDUMP}.${CDATE} +$NCP ${APREFIX}pgrb2.1p00.anl $ARCDIR/pgbanl.${CDUMP}.${CDATE}.grib2 + +# Archive 1 degree forecast GRIB2 files for verification +if [ $CDUMP = "gfs" ]; then + fhmax=$FHMAX_GFS + fhr=0 + while [ $fhr -le $fhmax ]; do + fhr2=$(printf %02i $fhr) + fhr3=$(printf %03i $fhr) + $NCP ${APREFIX}pgrb2.1p00.f$fhr3 $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + (( fhr = $fhr + $FHOUT_GFS )) + done +fi +if [ $CDUMP = "gdas" ]; then + flist="000 003 006 009" + for fhr in $flist; do + fname=${APREFIX}pgrb2.1p00.f${fhr} + fhr2=$(printf %02i $fhr) + $NCP $fname $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + done +fi + +if [ -s avno.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat avno.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat avnop.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gdas" -a -s gdas.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat gdas.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat gdasp.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gfs" ]; then + $NCP 
storms.gfso.atcf_gen.$CDATE ${ARCDIR}/. + $NCP storms.gfso.atcf_gen.altg.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.altg.$CDATE ${ARCDIR}/. + + cp -rp tracker ${ARCDIR}/tracker.$CDATE +fi + +# Archive atmospheric gaussian gfs forecast files for fit2obs +VFYARC=$ROTDIR/vrfyarch +[[ ! -d $VFYARC ]] && mkdir -p $VFYARC +if [ $CDUMP = "gfs" -a $FITSARC = "YES" ]; then + mkdir -p $VFYARC/${CDUMP}.$PDY/$cyc + fhmax=${FHMAX_FITS:-$FHMAX_GFS} + fhr=0 + while [[ $fhr -le $fhmax ]]; do + fhr3=$(printf %03i $fhr) + sfcfile=${CDUMP}.t${cyc}z.sfcf${fhr3}${ASUFFIX} + sigfile=${CDUMP}.t${cyc}z.atmf${fhr3}${ASUFFIX} + $NCP $sfcfile $VFYARC/${CDUMP}.$PDY/$cyc/ + $NCP $sigfile $VFYARC/${CDUMP}.$PDY/$cyc/ + (( fhr = $fhr + 6 )) + done +fi + + +############################################################### +# Archive data to HPSS +if [ $HPSSARCH = "YES" ]; then +############################################################### + +#--determine when to save ICs for warm start and forecast-only runs +SAVEWARMICA="NO" +SAVEWARMICB="NO" +SAVEFCSTIC="NO" +firstday=$($NDATE +24 $SDATE) +mm=`echo $CDATE|cut -c 5-6` +dd=`echo $CDATE|cut -c 7-8` +nday=$(( (mm-1)*30+dd )) +mod=$(($nday % $ARCH_WARMICFREQ)) +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi + +if [ $ARCHICS_CYC -eq 18 ]; then + nday1=$((nday+1)) + mod1=$(($nday1 % $ARCH_WARMICFREQ)) + if [ $mod1 -eq 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi + if [ $mod1 -ne 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="NO" ; fi + if [ $CDATE -eq $SDATE -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi +fi + +mod=$(($nday % $ARCH_FCSTICFREQ)) +if [ $mod -eq 0 -o $CDATE -eq $firstday ]; then SAVEFCSTIC="YES" ; fi + + 
+ARCH_LIST="$COMIN/archlist" +[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST +mkdir -p $ARCH_LIST +cd $ARCH_LIST + +$HOMEgfs/ush/hpssarch_gen.sh $CDUMP +status=$? +if [ $status -ne 0 ]; then + echo "$HOMEgfs/ush/hpssarch_gen.sh $CDUMP failed, ABORT!" + exit $status +fi + +cd $ROTDIR + +if [ $CDUMP = "gfs" ]; then + + #for targrp in gfsa gfsb - NOTE - do not check htar error status + for targrp in gfsa gfsb; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + done + + #for targrp in gfs_flux gfs_netcdf/nemsio gfs_pgrb2b; do + if [ ${SAVEFCSTNEMSIO:-"YES"} = "YES" ]; then + for targrp in gfs_flux gfs_${format}a gfs_${format}b gfs_pgrb2b; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for targrp in gfswave + if [ $DO_WAVE = "YES" ]; then + for targrp in gfswave; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for restarts + if [ $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gfs_restarta.tar `cat $ARCH_LIST/gfs_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfs_restarta.tar failed" + exit $status + fi + fi + + #--save mdl gfsmos output from all cycles in the 18Z archive directory + if [ -d gfsmos.$PDY_MOS -a $cyc -eq 18 ]; then + htar -P -cvf $ATARDIR/$CDATE_MOS/gfsmos.tar ./gfsmos.$PDY_MOS + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfsmos.tar failed" + exit $status + fi + fi + +fi + + +if [ $CDUMP = "gdas" ]; then + + htar -P -cvf $ATARDIR/$CDATE/gdas.tar `cat $ARCH_LIST/gdas.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas.tar failed" + exit $status + fi + + #gdaswave + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave.tar `cat $ARCH_LIST/gdaswave.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave.tar failed" + exit $status + fi + fi + + if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restarta.tar `cat $ARCH_LIST/gdas_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restarta.tar failed" + exit $status + fi + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave_restart.tar `cat $ARCH_LIST/gdaswave_restart.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave_restart.tar failed" + exit $status + fi + fi + fi + + if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restartb.tar `cat $ARCH_LIST/gdas_restartb.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restartb.tar failed" + exit $status + fi + fi + +fi + +############################################################### +fi ##end of HPSS archive +############################################################### + + + +############################################################### +# Clean up previous cycles; various depths +# PRIOR CYCLE: Leave the prior cycle alone +GDATE=$($NDATE -$assim_freq $CDATE) + +# PREVIOUS to the PRIOR CYCLE +GDATE=$($NDATE -$assim_freq $GDATE) +gPDY=$(echo $GDATE | cut -c1-8) +gcyc=$(echo $GDATE | cut -c9-10) + +# Remove the TMPDIR directory +COMIN="$RUNDIR/$GDATE" +[[ -d $COMIN ]] && rm -rf $COMIN + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +# Step back every assim_freq hours +# and remove old rotating directories for successful cycles +# defaults from 24h to 120h +DO_GLDAS=${DO_GLDAS:-"NO"} +GDATEEND=$($NDATE -${RMOLDEND:-24} $CDATE) +GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) +GLDAS_DATE=$($NDATE -96 $CDATE) +while [ $GDATE -le $GDATEEND ]; do + gPDY=$(echo $GDATE | cut -c1-8) + gcyc=$(echo $GDATE | cut -c9-10) + COMIN="$ROTDIR/$CDUMP.$gPDY/$gcyc" + if [ -d $COMIN ]; then + rocotolog="$EXPDIR/logs/${GDATE}.log" + if [ -f $rocotolog ]; then + testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success") + rc=$? + if [ $rc -eq 0 ]; then + if [ $CDUMP != "gdas" -o $DO_GLDAS = "NO" -o $GDATE -lt $GLDAS_DATE ]; then + rm -rf $COMIN + else + for file in `ls $COMIN |grep -v sflux |grep -v RESTART`; do + rm -rf $COMIN/$file + done + for file in `ls $COMIN/RESTART |grep -v sfcanl `; do + rm -rf $COMIN/$file + done + fi + fi + fi + fi + + # Remove any empty directories + COMIN="$ROTDIR/$CDUMP.$gPDY" + if [ -d $COMIN ]; then + [[ ! 
"$(ls -A $COMIN)" ]] && rm -rf $COMIN + fi + + # Remove mdl gfsmos directory + if [ $CDUMP = "gfs" ]; then + COMIN="$ROTDIR/gfsmos.$gPDY" + if [ -d $COMIN -a $GDATE -lt $CDATE_MOS ]; then rm -rf $COMIN ; fi + fi + + GDATE=$($NDATE +$assim_freq $GDATE) +done + +# Remove archived stuff in $VFYARC that are (48+$FHMAX_GFS) hrs behind +# 1. atmospheric gaussian files used for fit2obs +if [ $CDUMP = "gfs" ]; then + GDATE=$($NDATE -$FHMAX_GFS $GDATE) + gPDY=$(echo $GDATE | cut -c1-8) + COMIN="$VFYARC/$CDUMP.$gPDY" + [[ -d $COMIN ]] && rm -rf $COMIN +fi + +############################################################### +exit 0 diff --git a/jobs/rocoto/arch_BASE_240511.sh b/jobs/rocoto/arch_BASE_240511.sh new file mode 100644 index 0000000000..02465a50a0 --- /dev/null +++ b/jobs/rocoto/arch_BASE_240511.sh @@ -0,0 +1,318 @@ +#!/bin/ksh -x + +############################################################### +## Abstract: +## Archive driver script +## RUN_ENVIR : runtime environment (emc | nco) +## HOMEgfs : /full/path/to/workflow +## EXPDIR : /full/path/to/config/files +## CDATE : current analysis date (YYYYMMDDHH) +## CDUMP : cycle name (gdas / gfs) +## PDY : current date (YYYYMMDD) +## cyc : current cycle (HH) +############################################################### + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Source relevant configs +configs="base arch" +for config in $configs; do + . $EXPDIR/config.${config} + status=$? + [[ $status -ne 0 ]] && exit $status +done + +# ICS are restarts and always lag INC by $assim_freq hours +ARCHINC_CYC=$ARCH_CYC +ARCHICS_CYC=$((ARCH_CYC-assim_freq)) +if [ $ARCHICS_CYC -lt 0 ]; then + ARCHICS_CYC=$((ARCHICS_CYC+24)) +fi + +# CURRENT CYCLE +APREFIX="${CDUMP}.t${cyc}z." 
+ASUFFIX=${ASUFFIX:-$SUFFIX} + +if [ $ASUFFIX = ".nc" ]; then + format="netcdf" +else + format="nemsio" +fi + +# Realtime parallels run GFS MOS on 1 day delay +# If realtime parallel, back up CDATE_MOS one day +CDATE_MOS=$CDATE +if [ $REALTIME = "YES" ]; then + CDATE_MOS=$($NDATE -24 $CDATE) +fi +PDY_MOS=$(echo $CDATE_MOS | cut -c1-8) + +############################################################### +# Archive online for verification and diagnostics +############################################################### + +COMIN="$ROTDIR/$CDUMP.$PDY/$cyc" +cd $COMIN + +[[ ! -d $ARCDIR ]] && mkdir -p $ARCDIR +$NCP ${APREFIX}gsistat $ARCDIR/gsistat.${CDUMP}.${CDATE} +$NCP ${APREFIX}pgrb2.1p00.anl $ARCDIR/pgbanl.${CDUMP}.${CDATE}.grib2 + +# Archive 1 degree forecast GRIB2 files for verification +if [ $CDUMP = "gfs" ]; then + fhmax=$FHMAX_GFS + fhr=0 + while [ $fhr -le $fhmax ]; do + fhr2=$(printf %02i $fhr) + fhr3=$(printf %03i $fhr) + $NCP ${APREFIX}pgrb2.1p00.f$fhr3 $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + (( fhr = $fhr + $FHOUT_GFS )) + done +fi +if [ $CDUMP = "gdas" ]; then + flist="000 003 006 009" + for fhr in $flist; do + fname=${APREFIX}pgrb2.1p00.f${fhr} + fhr2=$(printf %02i $fhr) + $NCP $fname $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + done +fi + +if [ -s avno.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat avno.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat avnop.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gdas" -a -s gdas.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat gdas.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat gdasp.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gfs" ]; then + $NCP 
storms.gfso.atcf_gen.$CDATE ${ARCDIR}/. + $NCP storms.gfso.atcf_gen.altg.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.altg.$CDATE ${ARCDIR}/. + + cp -rp tracker ${ARCDIR}/tracker.$CDATE +fi + +# Archive atmospheric gaussian gfs forecast files for fit2obs +VFYARC=$ROTDIR/vrfyarch +[[ ! -d $VFYARC ]] && mkdir -p $VFYARC +if [ $CDUMP = "gfs" -a $FITSARC = "YES" ]; then + mkdir -p $VFYARC/${CDUMP}.$PDY/$cyc + fhmax=${FHMAX_FITS:-$FHMAX_GFS} + fhr=0 + while [[ $fhr -le $fhmax ]]; do + fhr3=$(printf %03i $fhr) + sfcfile=${CDUMP}.t${cyc}z.sfcf${fhr3}${ASUFFIX} + sigfile=${CDUMP}.t${cyc}z.atmf${fhr3}${ASUFFIX} + $NCP $sfcfile $VFYARC/${CDUMP}.$PDY/$cyc/ + $NCP $sigfile $VFYARC/${CDUMP}.$PDY/$cyc/ + (( fhr = $fhr + 6 )) + done +fi + + +############################################################### +# Archive data to HPSS +if [ $HPSSARCH = "YES" ]; then +############################################################### + +#--determine when to save ICs for warm start and forecast-only runs +SAVEWARMICA="NO" +SAVEWARMICB="NO" +SAVEFCSTIC="NO" +firstday=$($NDATE +24 $SDATE) +mm=`echo $CDATE|cut -c 5-6` +dd=`echo $CDATE|cut -c 7-8` +nday=$(( (mm-1)*30+dd )) +mod=$(($nday % $ARCH_WARMICFREQ)) +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi + +if [ $ARCHICS_CYC -eq 18 ]; then + nday1=$((nday+1)) + mod1=$(($nday1 % $ARCH_WARMICFREQ)) + if [ $mod1 -eq 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi + if [ $mod1 -ne 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="NO" ; fi + if [ $CDATE -eq $SDATE -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi +fi + +mod=$(($nday % $ARCH_FCSTICFREQ)) +if [ $mod -eq 0 -o $CDATE -eq $firstday ]; then SAVEFCSTIC="YES" ; fi + + 
+ARCH_LIST="$COMIN/archlist" +[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST +mkdir -p $ARCH_LIST +cd $ARCH_LIST + +$HOMEgfs/ush/hpssarch_gen.sh $CDUMP +status=$? +if [ $status -ne 0 ]; then + echo "$HOMEgfs/ush/hpssarch_gen.sh $CDUMP failed, ABORT!" + exit $status +fi + +cd $ROTDIR + +if [ $CDUMP = "gfs" ]; then + + #for targrp in gfsa gfsb - NOTE - do not check htar error status + for targrp in gfsa gfsb; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + done + + #for targrp in gfs_flux gfs_netcdf/nemsio gfs_pgrb2b; do + if [ ${SAVEFCSTNEMSIO:-"YES"} = "YES" ]; then + for targrp in gfs_flux gfs_${format}a gfs_${format}b gfs_pgrb2b; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + if [ $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gfs_restarta.tar `cat $ARCH_LIST/gfs_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfs_restarta.tar failed" + exit $status + fi + fi + + #--save mdl gfsmos output from all cycles in the 18Z archive directory + if [ -d gfsmos.$PDY_MOS -a $cyc -eq 18 ]; then + htar -P -cvf $ATARDIR/$CDATE_MOS/gfsmos.tar ./gfsmos.$PDY_MOS + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfsmos.tar failed" + exit $status + fi + fi + +fi + + +if [ $CDUMP = "gdas" ]; then + + htar -P -cvf $ATARDIR/$CDATE/gdas.tar `cat $ARCH_LIST/gdas.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas.tar failed" + exit $status + fi + + if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restarta.tar `cat $ARCH_LIST/gdas_restarta.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restarta.tar failed" + exit $status + fi + fi + if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restartb.tar `cat $ARCH_LIST/gdas_restartb.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restartb.tar failed" + exit $status + fi + fi +fi + +############################################################### +fi ##end of HPSS archive +############################################################### + + + +############################################################### +# Clean up previous cycles; various depths +# PRIOR CYCLE: Leave the prior cycle alone +GDATE=$($NDATE -$assim_freq $CDATE) + +# PREVIOUS to the PRIOR CYCLE +GDATE=$($NDATE -$assim_freq $GDATE) +gPDY=$(echo $GDATE | cut -c1-8) +gcyc=$(echo $GDATE | cut -c9-10) + +# Remove the TMPDIR directory +COMIN="$RUNDIR/$GDATE" +[[ -d $COMIN ]] && rm -rf $COMIN + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +# Step back every assim_freq hours +# and remove old rotating directories for successful cycles +# defaults from 24h to 120h +DO_GLDAS=${DO_GLDAS:-"NO"} +GDATEEND=$($NDATE -${RMOLDEND:-24} $CDATE) +GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) +GLDAS_DATE=$($NDATE -96 $CDATE) +while [ $GDATE -le $GDATEEND ]; do + gPDY=$(echo $GDATE | cut -c1-8) + gcyc=$(echo $GDATE | cut -c9-10) + COMIN="$ROTDIR/$CDUMP.$gPDY/$gcyc" + if [ -d $COMIN ]; then + rocotolog="$EXPDIR/logs/${GDATE}.log" + if [ -f $rocotolog ]; then + testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success") + rc=$? 
+      if [ $rc -eq 0 ]; then
+        if [ $CDUMP != "gdas" -o $DO_GLDAS = "NO" -o $GDATE -lt $GLDAS_DATE ]; then
+          rm -rf $COMIN
+        else
+          for file in `ls $COMIN |grep -v sflux |grep -v RESTART`; do
+            rm -rf $COMIN/$file
+          done
+          for file in `ls $COMIN/RESTART |grep -v sfcanl `; do
+            rm -rf $COMIN/RESTART/$file
+          done
+        fi
+      fi
+    fi
+  fi
+
+  # Remove any empty directories
+  COMIN="$ROTDIR/$CDUMP.$gPDY"
+  if [ -d $COMIN ]; then
+    [[ ! "$(ls -A $COMIN)" ]] && rm -rf $COMIN
+  fi
+
+  # Remove mdl gfsmos directory
+  if [ $CDUMP = "gfs" ]; then
+    COMIN="$ROTDIR/gfsmos.$gPDY"
+    if [ -d $COMIN -a $GDATE -lt $CDATE_MOS ]; then rm -rf $COMIN ; fi
+  fi
+
+  GDATE=$($NDATE +$assim_freq $GDATE)
+done
+
+# Remove archived files in $VFYARC that are (48+$FHMAX_GFS) hrs behind
+# 1. atmospheric gaussian files used for fit2obs
+if [ $CDUMP = "gfs" ]; then
+  GDATE=$($NDATE -$FHMAX_GFS $GDATE)
+  gPDY=$(echo $GDATE | cut -c1-8)
+  COMIN="$VFYARC/$CDUMP.$gPDY"
+  [[ -d $COMIN ]] && rm -rf $COMIN
+fi
+
+###############################################################
+exit 0
diff --git a/jobs/rocoto/arch_LOCAL_240511.sh b/jobs/rocoto/arch_LOCAL_240511.sh
new file mode 100644
index 0000000000..b0a8430f22
--- /dev/null
+++ b/jobs/rocoto/arch_LOCAL_240511.sh
@@ -0,0 +1 @@
+arch_gmtb.sh
\ No newline at end of file
diff --git a/jobs/rocoto/arch_REMOTE_240511.sh b/jobs/rocoto/arch_REMOTE_240511.sh
new file mode 100644
index 0000000000..6689d30755
--- /dev/null
+++ b/jobs/rocoto/arch_REMOTE_240511.sh
@@ -0,0 +1,351 @@
+#!/bin/ksh -x
+
+###############################################################
+## Abstract:
+## Archive driver script
+## RUN_ENVIR : runtime environment (emc | nco)
+## HOMEgfs : /full/path/to/workflow
+## EXPDIR : /full/path/to/config/files
+## CDATE : current analysis date (YYYYMMDDHH)
+## CDUMP : cycle name (gdas / gfs)
+## PDY : current date (YYYYMMDD)
+## cyc : current cycle (HH)
+###############################################################
+
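The driver script below leans on one error-handling idiom throughout: capture `$?` into `status` immediately after each command and abort the job with that status — except around HTAR, where a failure is tolerated until `$CDATE` reaches `$firstday`. A minimal standalone sketch of both guards; `run_step` and the date values are illustrative stand-ins, not part of the workflow:

```shell
#!/bin/sh
# run_step stands in for a real command such as hpssarch_gen.sh or htar;
# its argument becomes the exit status, purely for demonstration.
run_step() { return "$1"; }

run_step 0
status=$?                      # capture $? before any other command runs
if [ $status -ne 0 ]; then
  echo "step failed, ABORT!"
  exit $status
fi
echo "step ok"

# HTAR-style guard: only abort once CDATE has passed firstday, so a
# failure during the experiment's first day does not kill the run.
CDATE=2020032000
firstday=2020032100
run_step 1
status=$?
if [ $status -ne 0 -a $CDATE -ge $firstday ]; then
  echo "HTAR $CDATE failed"
  exit $status
fi
echo "htar failure tolerated before firstday"
```

The same capture-then-test pattern appears around every `htar` call in both the gfs and gdas branches of the scripts.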
+############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Source relevant configs +configs="base arch" +for config in $configs; do + . $EXPDIR/config.${config} + status=$? + [[ $status -ne 0 ]] && exit $status +done + +# ICS are restarts and always lag INC by $assim_freq hours +ARCHINC_CYC=$ARCH_CYC +ARCHICS_CYC=$((ARCH_CYC-assim_freq)) +if [ $ARCHICS_CYC -lt 0 ]; then + ARCHICS_CYC=$((ARCHICS_CYC+24)) +fi + +# CURRENT CYCLE +APREFIX="${CDUMP}.t${cyc}z." +ASUFFIX=${ASUFFIX:-$SUFFIX} + +if [ $ASUFFIX = ".nc" ]; then + format="netcdf" +else + format="nemsio" +fi + +# Realtime parallels run GFS MOS on 1 day delay +# If realtime parallel, back up CDATE_MOS one day +CDATE_MOS=$CDATE +if [ $REALTIME = "YES" ]; then + CDATE_MOS=$($NDATE -24 $CDATE) +fi +PDY_MOS=$(echo $CDATE_MOS | cut -c1-8) + +############################################################### +# Archive online for verification and diagnostics +############################################################### + +COMIN="$ROTDIR/$CDUMP.$PDY/$cyc" +cd $COMIN + +[[ ! 
-d $ARCDIR ]] && mkdir -p $ARCDIR +$NCP ${APREFIX}gsistat $ARCDIR/gsistat.${CDUMP}.${CDATE} +$NCP ${APREFIX}pgrb2.1p00.anl $ARCDIR/pgbanl.${CDUMP}.${CDATE}.grib2 + +# Archive 1 degree forecast GRIB2 files for verification +if [ $CDUMP = "gfs" ]; then + fhmax=$FHMAX_GFS + fhr=0 + while [ $fhr -le $fhmax ]; do + fhr2=$(printf %02i $fhr) + fhr3=$(printf %03i $fhr) + $NCP ${APREFIX}pgrb2.1p00.f$fhr3 $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + (( fhr = $fhr + $FHOUT_GFS )) + done +fi +if [ $CDUMP = "gdas" ]; then + flist="000 003 006 009" + for fhr in $flist; do + fname=${APREFIX}pgrb2.1p00.f${fhr} + fhr2=$(printf %02i $fhr) + $NCP $fname $ARCDIR/pgbf${fhr2}.${CDUMP}.${CDATE}.grib2 + done +fi + +if [ -s avno.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat avno.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat avnop.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gdas" -a -s gdas.t${cyc}z.cyclone.trackatcfunix ]; then + PLSOT4=`echo $PSLOT|cut -c 1-4 |tr '[a-z]' '[A-Z]'` + cat gdas.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunix.${CDUMP}.$CDATE + cat gdasp.t${cyc}z.cyclone.trackatcfunix | sed s:AVNO:${PLSOT4}:g > ${ARCDIR}/atcfunixp.${CDUMP}.$CDATE +fi + +if [ $CDUMP = "gfs" ]; then + $NCP storms.gfso.atcf_gen.$CDATE ${ARCDIR}/. + $NCP storms.gfso.atcf_gen.altg.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.$CDATE ${ARCDIR}/. + $NCP trak.gfso.atcfunix.altg.$CDATE ${ARCDIR}/. + + cp -rp tracker ${ARCDIR}/tracker.$CDATE +fi + +# Archive atmospheric gaussian gfs forecast files for fit2obs +VFYARC=$ROTDIR/vrfyarch +[[ ! 
-d $VFYARC ]] && mkdir -p $VFYARC +if [ $CDUMP = "gfs" -a $FITSARC = "YES" ]; then + mkdir -p $VFYARC/${CDUMP}.$PDY/$cyc + fhmax=${FHMAX_FITS:-$FHMAX_GFS} + fhr=0 + while [[ $fhr -le $fhmax ]]; do + fhr3=$(printf %03i $fhr) + sfcfile=${CDUMP}.t${cyc}z.sfcf${fhr3}${ASUFFIX} + sigfile=${CDUMP}.t${cyc}z.atmf${fhr3}${ASUFFIX} + $NCP $sfcfile $VFYARC/${CDUMP}.$PDY/$cyc/ + $NCP $sigfile $VFYARC/${CDUMP}.$PDY/$cyc/ + (( fhr = $fhr + 6 )) + done +fi + + +############################################################### +# Archive data to HPSS +if [ $HPSSARCH = "YES" ]; then +############################################################### + +#--determine when to save ICs for warm start and forecast-only runs +SAVEWARMICA="NO" +SAVEWARMICB="NO" +SAVEFCSTIC="NO" +firstday=$($NDATE +24 $SDATE) +mm=`echo $CDATE|cut -c 5-6` +dd=`echo $CDATE|cut -c 7-8` +nday=$(( (mm-1)*30+dd )) +mod=$(($nday % $ARCH_WARMICFREQ)) +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $CDATE -eq $firstday -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHINC_CYC ]; then SAVEWARMICA="YES" ; fi +if [ $mod -eq 0 -a $cyc -eq $ARCHICS_CYC ]; then SAVEWARMICB="YES" ; fi + +if [ $ARCHICS_CYC -eq 18 ]; then + nday1=$((nday+1)) + mod1=$(($nday1 % $ARCH_WARMICFREQ)) + if [ $mod1 -eq 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi + if [ $mod1 -ne 0 -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="NO" ; fi + if [ $CDATE -eq $SDATE -a $cyc -eq $ARCHICS_CYC ] ; then SAVEWARMICB="YES" ; fi +fi + +mod=$(($nday % $ARCH_FCSTICFREQ)) +if [ $mod -eq 0 -o $CDATE -eq $firstday ]; then SAVEFCSTIC="YES" ; fi + + +ARCH_LIST="$COMIN/archlist" +[[ -d $ARCH_LIST ]] && rm -rf $ARCH_LIST +mkdir -p $ARCH_LIST +cd $ARCH_LIST + +$HOMEgfs/ush/hpssarch_gen.sh $CDUMP +status=$? +if [ $status -ne 0 ]; then + echo "$HOMEgfs/ush/hpssarch_gen.sh $CDUMP failed, ABORT!" 
+ exit $status +fi + +cd $ROTDIR + +if [ $CDUMP = "gfs" ]; then + + #for targrp in gfsa gfsb - NOTE - do not check htar error status + for targrp in gfsa gfsb; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + done + + #for targrp in gfs_flux gfs_netcdf/nemsio gfs_pgrb2b; do + if [ ${SAVEFCSTNEMSIO:-"YES"} = "YES" ]; then + for targrp in gfs_flux gfs_${format}a gfs_${format}b gfs_pgrb2b; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for targrp in gfswave + if [ $DO_WAVE = "YES" ]; then + for targrp in gfswave; do + htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for restarts + if [ $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gfs_restarta.tar `cat $ARCH_LIST/gfs_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfs_restarta.tar failed" + exit $status + fi + fi + + #--save mdl gfsmos output from all cycles in the 18Z archive directory + if [ -d gfsmos.$PDY_MOS -a $cyc -eq 18 ]; then + htar -P -cvf $ATARDIR/$CDATE_MOS/gfsmos.tar ./gfsmos.$PDY_MOS + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gfsmos.tar failed" + exit $status + fi + fi + +fi + + +if [ $CDUMP = "gdas" ]; then + + htar -P -cvf $ATARDIR/$CDATE/gdas.tar `cat $ARCH_LIST/gdas.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas.tar failed" + exit $status + fi + + #gdaswave + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave.tar `cat $ARCH_LIST/gdaswave.txt` + status=$? 
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave.tar failed" + exit $status + fi + fi + + if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restarta.tar `cat $ARCH_LIST/gdas_restarta.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restarta.tar failed" + exit $status + fi + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave_restart.tar `cat $ARCH_LIST/gdaswave_restart.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave_restart.tar failed" + exit $status + fi + fi + fi + + if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdas_restartb.tar `cat $ARCH_LIST/gdas_restartb.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdas_restartb.tar failed" + exit $status + fi + fi + +fi + +############################################################### +fi ##end of HPSS archive +############################################################### + + + +############################################################### +# Clean up previous cycles; various depths +# PRIOR CYCLE: Leave the prior cycle alone +GDATE=$($NDATE -$assim_freq $CDATE) + +# PREVIOUS to the PRIOR CYCLE +GDATE=$($NDATE -$assim_freq $GDATE) +gPDY=$(echo $GDATE | cut -c1-8) +gcyc=$(echo $GDATE | cut -c9-10) + +# Remove the TMPDIR directory +COMIN="$RUNDIR/$GDATE" +[[ -d $COMIN ]] && rm -rf $COMIN + +if [[ "${DELETE_COM_IN_ARCHIVE_JOB:-YES}" == NO ]] ; then + exit 0 +fi + +# Step back every assim_freq hours +# and remove old rotating directories for successful cycles +# defaults from 24h to 120h +DO_GLDAS=${DO_GLDAS:-"NO"} +GDATEEND=$($NDATE -${RMOLDEND:-24} $CDATE) +GDATE=$($NDATE -${RMOLDSTD:-120} $CDATE) +GLDAS_DATE=$($NDATE -96 $CDATE) +while [ $GDATE -le $GDATEEND ]; do + gPDY=$(echo $GDATE | cut -c1-8) + gcyc=$(echo $GDATE | cut -c9-10) + 
COMIN="$ROTDIR/$CDUMP.$gPDY/$gcyc"
+  if [ -d $COMIN ]; then
+    rocotolog="$EXPDIR/logs/${GDATE}.log"
+    if [ -f $rocotolog ]; then
+      testend=$(tail -n 1 $rocotolog | grep "This cycle is complete: Success")
+      rc=$?
+      if [ $rc -eq 0 ]; then
+        if [ $CDUMP != "gdas" -o $DO_GLDAS = "NO" -o $GDATE -lt $GLDAS_DATE ]; then
+          rm -rf $COMIN
+        else
+          for file in `ls $COMIN |grep -v sflux |grep -v RESTART`; do
+            rm -rf $COMIN/$file
+          done
+          for file in `ls $COMIN/RESTART |grep -v sfcanl `; do
+            rm -rf $COMIN/RESTART/$file
+          done
+        fi
+      fi
+    fi
+  fi
+
+  # Remove any empty directories
+  COMIN="$ROTDIR/$CDUMP.$gPDY"
+  if [ -d $COMIN ]; then
+    [[ ! "$(ls -A $COMIN)" ]] && rm -rf $COMIN
+  fi
+
+  # Remove mdl gfsmos directory
+  if [ $CDUMP = "gfs" ]; then
+    COMIN="$ROTDIR/gfsmos.$gPDY"
+    if [ -d $COMIN -a $GDATE -lt $CDATE_MOS ]; then rm -rf $COMIN ; fi
+  fi
+
+  GDATE=$($NDATE +$assim_freq $GDATE)
+done
+
+# Remove archived files in $VFYARC that are (48+$FHMAX_GFS) hrs behind
+# 1. atmospheric gaussian files used for fit2obs
+if [ $CDUMP = "gfs" ]; then
+  GDATE=$($NDATE -$FHMAX_GFS $GDATE)
+  gPDY=$(echo $GDATE | cut -c1-8)
+  COMIN="$VFYARC/$CDUMP.$gPDY"
+  [[ -d $COMIN ]] && rm -rf $COMIN
+fi
+
+###############################################################
+exit 0
diff --git a/jobs/rocoto/arch_emc.sh b/jobs/rocoto/arch_emc.sh
index 02465a50a0..6689d30755 100755
--- a/jobs/rocoto/arch_emc.sh
+++ b/jobs/rocoto/arch_emc.sh
@@ -127,7 +127,7 @@ fi
 if [ $HPSSARCH = "YES" ]; then
 ###############################################################
 
-#--determine when to save ICs for warm start and forecat-only runs
+#--determine when to save ICs for warm start and forecast-only runs
 SAVEWARMICA="NO"
 SAVEWARMICB="NO"
 SAVEFCSTIC="NO"
@@ -185,7 +185,20 @@ if [ $CDUMP = "gfs" ]; then
 fi
 done
 fi
-
+
+  #for targrp in gfswave
+  if [ $DO_WAVE = "YES" ]; then
+    for targrp in gfswave; do
+      htar -P -cvf $ATARDIR/$CDATE/${targrp}.tar `cat $ARCH_LIST/${targrp}.txt`
+      status=$?
+ if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE ${targrp}.tar failed" + exit $status + fi + done + fi + + #for restarts if [ $SAVEFCSTIC = "YES" ]; then htar -P -cvf $ATARDIR/$CDATE/gfs_restarta.tar `cat $ARCH_LIST/gfs_restarta.txt` status=$? @@ -217,6 +230,16 @@ if [ $CDUMP = "gdas" ]; then exit $status fi + #gdaswave + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave.tar `cat $ARCH_LIST/gdaswave.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave.tar failed" + exit $status + fi + fi + if [ $SAVEWARMICA = "YES" -o $SAVEFCSTIC = "YES" ]; then htar -P -cvf $ATARDIR/$CDATE/gdas_restarta.tar `cat $ARCH_LIST/gdas_restarta.txt` status=$? @@ -224,7 +247,16 @@ if [ $CDUMP = "gdas" ]; then echo "HTAR $CDATE gdas_restarta.tar failed" exit $status fi + if [ $DO_WAVE = "YES" ]; then + htar -P -cvf $ATARDIR/$CDATE/gdaswave_restart.tar `cat $ARCH_LIST/gdaswave_restart.txt` + status=$? + if [ $status -ne 0 -a $CDATE -ge $firstday ]; then + echo "HTAR $CDATE gdaswave_restart.tar failed" + exit $status + fi + fi fi + if [ $SAVEWARMICB = "YES" -o $SAVEFCSTIC = "YES" ]; then htar -P -cvf $ATARDIR/$CDATE/gdas_restartb.tar `cat $ARCH_LIST/gdas_restartb.txt` status=$? @@ -233,6 +265,7 @@ if [ $CDUMP = "gdas" ]; then exit $status fi fi + fi ############################################################### diff --git a/jobs/rocoto/ediag.sh b/jobs/rocoto/ediag.sh new file mode 100755 index 0000000000..f53aa6a34a --- /dev/null +++ b/jobs/rocoto/ediag.sh @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +############################################################### +# Source FV3GFS workflow modules +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +# Execute the JJOB +$HOMEgfs/jobs/JGLOBAL_ENKF_ANALDIAG +status=$? 
+exit $status diff --git a/jobs/rocoto/prep.sh b/jobs/rocoto/prep.sh index b392e71066..8e8037ebd6 100755 --- a/jobs/rocoto/prep.sh +++ b/jobs/rocoto/prep.sh @@ -30,22 +30,22 @@ export COMOUT="$ROTDIR/$CDUMP.$PDY/$cyc" ############################################################### # If ROTDIR_DUMP=YES, copy dump files to rotdir if [ $ROTDIR_DUMP = "YES" ]; then - $HOMEgfs/ush/getdump.sh $CDATE $CDUMP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc} $COMOUT - status=$? - [[ $status -ne 0 ]] && exit $status - -# Ensure previous cycle gdas dumps are available (used by cycle & downstream) - GDATE=$($NDATE -$assim_freq $CDATE) - gPDY=$(echo $GDATE | cut -c1-8) - gcyc=$(echo $GDATE | cut -c9-10) - GDUMP=gdas - gCOMOUT="$ROTDIR/$GDUMP.$gPDY/$gcyc" - if [ ! -s $gCOMOUT/$GDUMP.t${gcyc}z.updated.status.tm00.bufr_d ]; then + $HOMEgfs/ush/getdump.sh $CDATE $CDUMP $DMPDIR/${CDUMP}${DUMP_SUFFIX}.${PDY}/${cyc} $COMOUT + status=$? + [[ $status -ne 0 ]] && exit $status + +# Ensure previous cycle gdas dumps are available (used by cycle & downstream) + GDATE=$($NDATE -$assim_freq $CDATE) + gPDY=$(echo $GDATE | cut -c1-8) + gcyc=$(echo $GDATE | cut -c9-10) + GDUMP=gdas + gCOMOUT="$ROTDIR/$GDUMP.$gPDY/$gcyc" + if [ ! -s $gCOMOUT/$GDUMP.t${gcyc}z.updated.status.tm00.bufr_d ]; then $HOMEgfs/ush/getdump.sh $GDATE $GDUMP $DMPDIR/${GDUMP}${DUMP_SUFFIX}.${gPDY}/${gcyc} $gCOMOUT status=$? [[ $status -ne 0 ]] && exit $status - fi - + fi + fi ############################################################### diff --git a/jobs/rocoto/waveinit.sh b/jobs/rocoto/waveinit.sh new file mode 100755 index 0000000000..ce7397c3e7 --- /dev/null +++ b/jobs/rocoto/waveinit.sh @@ -0,0 +1,21 @@ +#!/bin/ksh -x + +############################################################### +echo +echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? 
+[[ $status -ne 0 ]] && exit $status + +############################################################### +echo +echo "=============== START TO RUN WAVE INIT ===============" +# Execute the JJOB +$HOMEgfs/jobs/JWAVE_INIT +status=$? +exit $status + +############################################################### +# Force Exit out cleanly +if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi +exit 0 diff --git a/jobs/rocoto/wavepostsbs.sh b/jobs/rocoto/wavepostsbs.sh new file mode 100755 index 0000000000..751bb9e8c4 --- /dev/null +++ b/jobs/rocoto/wavepostsbs.sh @@ -0,0 +1,21 @@ +#!/bin/ksh -x + +############################################################### +echo +echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +echo +echo "=============== START TO RUN WAVE POST_SBS ===============" +# Execute the JJOB +$HOMEgfs/jobs/JWAVE_POST_SBS +status=$? +exit $status + +############################################################### +# Force Exit out cleanly +if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi +exit 0 diff --git a/jobs/rocoto/waveprep.sh b/jobs/rocoto/waveprep.sh new file mode 100755 index 0000000000..faef5533d9 --- /dev/null +++ b/jobs/rocoto/waveprep.sh @@ -0,0 +1,21 @@ +#!/bin/ksh -x + +############################################################### +echo +echo "=============== START TO SOURCE FV3GFS WORKFLOW MODULES ===============" +. $HOMEgfs/ush/load_fv3gfs_modules.sh +status=$? +[[ $status -ne 0 ]] && exit $status + +############################################################### +echo +echo "=============== START TO RUN WAVE PREP ===============" +# Execute the JJOB +$HOMEgfs/jobs/JWAVE_PREP +status=$? 
+exit $status + +############################################################### +# Force Exit out cleanly +if [ ${KEEPDATA:-"NO"} = "NO" ] ; then rm -rf $DATAROOT ; fi +exit 0 diff --git a/modulefiles/module_nemsutil.hera b/modulefiles/module_nemsutil.hera deleted file mode 100644 index f1908fdf6e..0000000000 --- a/modulefiles/module_nemsutil.hera +++ /dev/null @@ -1,10 +0,0 @@ -#%Module##################################################### -## Module file for nemsutil -############################################################# - -module use -a /scratch2/NCEPDEV/nwprod/NCEPLIBS/modulefiles -module load w3nco/2.0.6 -module load bacio/2.0.3 -module load nemsio/2.2.3 - -export FCMP=ifort diff --git a/modulefiles/module_nemsutil.wcoss b/modulefiles/module_nemsutil.wcoss deleted file mode 100644 index f421c1a88b..0000000000 --- a/modulefiles/module_nemsutil.wcoss +++ /dev/null @@ -1,13 +0,0 @@ -#%Module##################################################### -## Module file for nemsutil -############################################################# - -# Loading Intel Compiler Suite -module load ics/14.0.1 - -# Loding nceplibs modules -module load w3nco/v2.0.6 -module load bacio/v2.0.1 -module load nemsio/v2.2.1 - -export FCMP=ifort diff --git a/modulefiles/module_nemsutil.wcoss_cray b/modulefiles/module_nemsutil.wcoss_cray deleted file mode 100644 index 371c8e0245..0000000000 --- a/modulefiles/module_nemsutil.wcoss_cray +++ /dev/null @@ -1,17 +0,0 @@ -#%Module##################################################### -## Module file for nemsutil -############################################################# - -module purge -module load modules -module load PrgEnv-intel -module load cray-mpich -module load craype-sandybridge - -module load w3nco-intel/2.0.6 -module load bacio-intel/2.0.1 - -export NEMSIO_INC=/usrx/local/nceplibs/nemsio/nemsio_v2.2.3/incmod -export NEMSIO_LIB=/usrx/local/nceplibs/nemsio/nemsio_v2.2.3/libnemsio_v2.2.3.a - -export FCMP=ftn diff --git 
a/modulefiles/module_nemsutil.wcoss_cray_userlib b/modulefiles/module_nemsutil.wcoss_cray_userlib deleted file mode 100644 index 53fad475a5..0000000000 --- a/modulefiles/module_nemsutil.wcoss_cray_userlib +++ /dev/null @@ -1,19 +0,0 @@ -#%Module##################################################### -## Module file for nemsutil -############################################################# - -# Load Intel environment -module purge -module load modules -module load PrgEnv-intel -module load cray-mpich -module load craype-sandybridge - -# Load NCEPLIBS modules -module unuse /gpfs/hps/nco/ops/nwprod/lib/modulefiles -module use $MOD_PATH -module load w3nco/v2.0.6 -module load bacio/v2.0.2 -module load nemsio/v2.2.3 - -export FCMP=ftn diff --git a/modulefiles/module_nemsutil.wcoss_dell_p3 b/modulefiles/module_nemsutil.wcoss_dell_p3 deleted file mode 100644 index e93d581651..0000000000 --- a/modulefiles/module_nemsutil.wcoss_dell_p3 +++ /dev/null @@ -1,12 +0,0 @@ -#%Module##################################################### -## Module file for nemsutil -############################################################# - -module load ips/18.0.1.163 -module load impi/18.0.1 - -module load bacio/2.0.2 -module load w3nco/2.0.6 -module load nemsio/2.2.3 - -export FCMP=ifort diff --git a/modulefiles/modulefile.grib_util.wcoss b/modulefiles/modulefile.grib_util.wcoss deleted file mode 100644 index 0ae2e8f49e..0000000000 --- a/modulefiles/modulefile.grib_util.wcoss +++ /dev/null @@ -1,32 +0,0 @@ -#%Module###################################################################### -proc ModulesHelp { } { - puts stderr "Load modules for building GRIB utilities" -} -module-whatis "This module loads the modules and libraries for building\ - the GRIB utilities, including jasper, png, zlib, bacio, g2,\ - w3emc, w3nco, ip, sp, and iobuf." 
- -conflict build_grib_util - -# -# Loading required system modules -# - module load ics - module switch ics/15.0.6 - module load jasper/v1.900.1 - module load png/v1.2.44 - module load z/v1.2.6 - -# Loading Intel-Compiled NCEP Libraries - module load bacio/v2.0.1 - module load w3emc/v2.2.0 - module load w3nco/v2.0.6 - module load ip/v3.0.0 - module load sp/v2.0.2 - - # pre-implemented g2 v3.1.0 - module use /nwtest2/lib/modulefiles - module load g2/v3.1.0 - -setenv FCMP ifort -setenv CCMP icc diff --git a/modulefiles/modulefile.grib_util.wcoss_cray b/modulefiles/modulefile.grib_util.wcoss_cray deleted file mode 100644 index 191baa15df..0000000000 --- a/modulefiles/modulefile.grib_util.wcoss_cray +++ /dev/null @@ -1,22 +0,0 @@ -#%Module###################################################################### -module unload craype-haswell -module load craype-sandybridge -module unload PrgEnv-cray -module load PrgEnv-intel/5.2.56 -module switch intel/15.0.6.233 -module load iobuf/2.0.7 - -module load bacio-intel/2.0.1 -module load w3emc-intel/2.2.0 -module load w3nco-intel/2.0.6 -module load ip-intel/3.0.0 -module load sp-intel/2.0.2 -module load jasper-gnu-sandybridge/1.900.1 -module load png-intel-sandybridge/1.2.49 -module load zlib-intel-sandybridge/1.2.7 - -module use /gpfs/hps/nco/ops/nwtest/lib/modulefiles -module load g2-intel/3.1.0 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.grib_util.wcoss_cray_userlib b/modulefiles/modulefile.grib_util.wcoss_cray_userlib deleted file mode 100644 index 56ebe0c336..0000000000 --- a/modulefiles/modulefile.grib_util.wcoss_cray_userlib +++ /dev/null @@ -1,22 +0,0 @@ -#%Module###################################################################### -module unload craype-haswell -module load craype-sandybridge -module unload PrgEnv-cray -module load PrgEnv-intel/5.2.56 -module switch intel/15.0.6.233 -module load iobuf/2.0.7 - -module unuse /gpfs/hps/nco/ops/nwprod/lib/modulefiles -module use -a $MOD_PATH 
-module load bacio/v2.0.2 -module load w3emc/v2.2.0 -module load w3nco/v2.0.6 -module load ip/v3.0.0 -module load sp/v2.0.2 -module load jasper/v1.900.1 -module load png/v1.2.44 -module load z/v1.2.6 -module load g2/v3.1.0 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.grib_util.wcoss_dell_p3 b/modulefiles/modulefile.grib_util.wcoss_dell_p3 deleted file mode 100644 index bbebec9cbf..0000000000 --- a/modulefiles/modulefile.grib_util.wcoss_dell_p3 +++ /dev/null @@ -1,15 +0,0 @@ -#%Module###################################################################### - -module load bacio/2.0.2 -module load w3emc/2.3.0 -module load w3nco/2.0.6 -module load ip/3.0.1 -module load sp/2.0.2 - -module load jasper/1.900.1 -module load libpng/1.2.59 -module load zlib/1.2.11 -module load g2/3.1.0 - -export FCMP=ifort -export CCMP=icc diff --git a/modulefiles/modulefile.prod_util.wcoss_cray b/modulefiles/modulefile.prod_util.wcoss_cray deleted file mode 100644 index 51031c6bdd..0000000000 --- a/modulefiles/modulefile.prod_util.wcoss_cray +++ /dev/null @@ -1,11 +0,0 @@ -#%Module##################################################### -module purge -module load modules -module load PrgEnv-intel -module load cray-mpich -module load craype-sandybridge - -module load w3nco-intel/2.0.6 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.prod_util.wcoss_cray_userlib b/modulefiles/modulefile.prod_util.wcoss_cray_userlib deleted file mode 100644 index dd5209fcf0..0000000000 --- a/modulefiles/modulefile.prod_util.wcoss_cray_userlib +++ /dev/null @@ -1,13 +0,0 @@ -#%Module##################################################### -module purge -module load modules -module load PrgEnv-intel -module load cray-mpich -module load craype-sandybridge - -module unuse /gpfs/hps/nco/ops/nwprod/lib/modulefiles -module use -a $MOD_PATH -module load w3nco/v2.0.6 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.prod_util.wcoss_dell_p3 
b/modulefiles/modulefile.prod_util.wcoss_dell_p3 deleted file mode 100644 index 9186e13eb8..0000000000 --- a/modulefiles/modulefile.prod_util.wcoss_dell_p3 +++ /dev/null @@ -1,6 +0,0 @@ -#%Module##################################################### - -module load w3nco/2.0.6 - -export FCMP=ifort -export CCMP=icc diff --git a/modulefiles/modulefile.wgrib2.wcoss b/modulefiles/modulefile.wgrib2.wcoss deleted file mode 100644 index 0eea72e391..0000000000 --- a/modulefiles/modulefile.wgrib2.wcoss +++ /dev/null @@ -1,24 +0,0 @@ -#%Module###################################################################### -############################################################# -## Lin.Gan@noaa.gov -## EMC -## wgrib2 v2.0.5 -############################################################# -proc ModulesHelp { } { -puts stderr "Set environment veriables for wgrib2" -puts stderr "This module initializes the users environment" -puts stderr "to build the wgrib2 for WCOSS production.\n" -} -module-whatis "wgrib2" - -set ver v2.0.5 - -module load ics/15.0.6 -module load NetCDF/4.2/serial -module load jasper/v1.900.1 -module load png/v1.2.44 -module load z/v1.2.6 -module load ip/v3.0.0 -module load sp/v2.0.2 -module load g2c/v1.5.0 - diff --git a/modulefiles/modulefile.wgrib2.wcoss_cray b/modulefiles/modulefile.wgrib2.wcoss_cray deleted file mode 100644 index 4d933141ad..0000000000 --- a/modulefiles/modulefile.wgrib2.wcoss_cray +++ /dev/null @@ -1,13 +0,0 @@ -#%Module###################################################################### -module load PrgEnv-gnu/5.2.56 -module load cray-netcdf/4.3.2 -module load craype/2.3.0 -module load craype-sandybridge -module load /gpfs/hps/nco/ops/nwtest/lib/modulefiles/g2c-gnu/1.5.0 - -module load jasper-gnu-sandybridge/1.900.1 -module load png-gnu-sandybridge/1.2.49 -module load zlib-gnu-sandybridge/1.2.7 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.wgrib2.wcoss_cray_userlib b/modulefiles/modulefile.wgrib2.wcoss_cray_userlib 
deleted file mode 100644 index 7b41a17f9f..0000000000 --- a/modulefiles/modulefile.wgrib2.wcoss_cray_userlib +++ /dev/null @@ -1,15 +0,0 @@ -#%Module###################################################################### -module load PrgEnv-gnu/5.2.56 -module load cray-netcdf/4.3.2 -module load craype/2.3.0 -module load craype-sandybridge - -module unuse /gpfs/hps/nco/ops/nwprod/lib/modulefiles -module use -a $MOD_PATH -module load jasper/v1.900.1 -module load png/v1.2.44 -module load z/v1.2.6 -module load g2c/v1.5.0 - -export FCMP=ftn -export CCMP=cc diff --git a/modulefiles/modulefile.wgrib2.wcoss_dell_p3 b/modulefiles/modulefile.wgrib2.wcoss_dell_p3 deleted file mode 100644 index 9a43e3ffd9..0000000000 --- a/modulefiles/modulefile.wgrib2.wcoss_dell_p3 +++ /dev/null @@ -1,11 +0,0 @@ -#%Module###################################################################### - -module load ips/18.0.1.163 - -module load g2c/1.5.0 -module load jasper/1.900.1 -module load libpng/1.2.59 -module load zlib/1.2.11 - -export FCMP=ifort -export CCMP=icc diff --git a/parm/config/config.anal b/parm/config/config.anal index 54ef8a0347..bc66f7f541 100755 --- a/parm/config/config.anal +++ b/parm/config/config.anal @@ -15,7 +15,7 @@ fi if [[ "$CDUMP" = "gfs" ]] ; then export USE_RADSTAT="NO" # This can be only used when bias correction is not-zero. export GENDIAG="NO" - export SETUP='diag_rad=.false.,diag_pcp=.false.,diag_conv=.false.,diag_ozone=.false.,write_diag(3)=.false.,' + export SETUP='diag_rad=.false.,diag_pcp=.false.,diag_conv=.false.,diag_ozone=.false.,write_diag(3)=.false.,niter(2)=100,' export DIAG_TARBALL="NO" fi diff --git a/parm/config/config.analcalc b/parm/config/config.analcalc new file mode 100755 index 0000000000..075ece1de8 --- /dev/null +++ b/parm/config/config.analcalc @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +########## config.analcalc ########## +# GFS post-anal specific (non-diag) + +echo "BEGIN: config.analcalc" + +# Get task specific resources +. 
$EXPDIR/config.resources analcalc + +export ANALCALCSH=$HOMEgfs/jobs/JGLOBAL_ANALCALC + +echo "END: config.analcalc" diff --git a/parm/config/config.analdiag b/parm/config/config.analdiag new file mode 100755 index 0000000000..023d703f5c --- /dev/null +++ b/parm/config/config.analdiag @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +########## config.analdiag ########## +# GFS post-anal specific (diag) + +echo "BEGIN: config.analdiag" + +# Get task specific resources +. $EXPDIR/config.resources analdiag + +export ANALDIAGSH=$HOMEgfs/jobs/JGLOBAL_ANALDIAG + +echo "END: config.analdiag" diff --git a/parm/config/config.base.emc.dyn b/parm/config/config.base.emc.dyn index 3f91d943ca..6f25989097 100755 --- a/parm/config/config.base.emc.dyn +++ b/parm/config/config.base.emc.dyn @@ -192,6 +192,9 @@ export FHMIN=0 export FHMAX=9 export FHOUT=3 +# Cycle to run EnKF (set to BOTH for both gfs and gdas) +export EUPD_CYC="gdas" + # GFS cycle info export gfs_cyc=@gfs_cyc@ # 0: no GFS cycle, 1: 00Z only, 2: 00Z and 12Z only, 4: all 4 cycles. @@ -231,20 +234,27 @@ fi # IAU related parameters export DOIAU="YES" # Enable 4DIAU for control with 3 increments export IAUFHRS="3,6,9" +export IAU_FHROT=`echo $IAUFHRS | cut -c1` export IAU_DELTHRS=6 export IAU_OFFSET=6 export DOIAU_ENKF="YES" # Enable 4DIAU for EnKF ensemble export IAUFHRS_ENKF="3,6,9" -if [[ "$SDATE" = "$CDATE" ]]; then export IAU_OFFSET=0 ;fi +export IAU_DELTHRS_ENKF=6 +if [[ "$SDATE" = "$CDATE" ]]; then + export IAU_OFFSET=0 + export IAU_FHROT=0 +fi # Use Jacobians in eupd and thereby remove need to run eomg export lobsdiag_forenkf=".true." - # run GLDAS to spin up land ICs export DO_GLDAS=YES export gldas_cyc=00 +# run wave component +export DO_WAVE=YES +export WAVE_CDUMP="gdas" # Microphysics Options: 99-ZhaoCarr, 8-Thompson; 6-WSM6, 10-MG, 11-GFDL export imp_physics=11 @@ -294,6 +304,8 @@ export INCVARS_EFOLD="5" export netcdf_diag=".true." export binary_diag=".false." 
+# Verification options +export DO_METP="YES" # Run MET+ jobs # Archiving options export HPSSARCH="YES" # save data to HPSS archive diff --git a/parm/config/config.base.nco.static b/parm/config/config.base.nco.static index a01c862df6..183bd9d0b5 100755 --- a/parm/config/config.base.nco.static +++ b/parm/config/config.base.nco.static @@ -170,22 +170,22 @@ export WRITE_DOPOST=".true." # IAU related parameters export DOIAU="NO" export IAUFHRS=6 +export IAU_FHROT=`echo $IAUFHRS | cut -c1` export IAU_DELTHRS=6 export DOIAU_ENKF="NO" export IAUFHRS_ENKF=6 export IAU_DELTHRS_ENKF=6 +if [[ "$SDATE" = "$CDATE" ]]; then + export IAU_OFFSET=0 + export IAU_FHROT=0 +fi # run GLDAS to spin up land ICs export DO_GLDAS=YES export gldas_cyc=00 -# IAU related parameters -export DOIAU="NO" -export IAUFHRS=6 -export IAU_DELTHRS=6 -export DOIAU_ENKF="NO" -export IAUFHRS_ENKF=6 -export IAU_DELTHRS_ENKF=6 +# run wave component +export DO_WAVE=YES # Microphysics Options: 99-ZhaoCarr, 8-Thompson; 6-WSM6, 10-MG, 11-GFDL export imp_physics=11 diff --git a/parm/config/config.ediag b/parm/config/config.ediag new file mode 100755 index 0000000000..8456839e4b --- /dev/null +++ b/parm/config/config.ediag @@ -0,0 +1,13 @@ +#!/bin/ksh -x + +########## config.ediag ########## +# GFS ensemble post-eobs specific + +echo "BEGIN: config.ediag" + +# Get task specific resources +. 
$EXPDIR/config.resources ediag + +export ANALDIAGSH="$HOMEgfs/scripts/exglobal_analdiag_fv3gfs.sh.ecf" + +echo "END: config.ediag" diff --git a/parm/config/config.efcs b/parm/config/config.efcs index 558a568469..ad6051dd66 100755 --- a/parm/config/config.efcs +++ b/parm/config/config.efcs @@ -54,8 +54,10 @@ fi export restart_interval=${restart_interval:-6} # For IAU, write restarts at beginning of window also -if [ $DOIAU_ENKF = "YES" ]; then export restart_interval="6 -1"; fi - +if [ $DOIAU_ENKF = "YES" ]; then + export restart_interval="6 -1" + if [[ "$SDATE" = "$CDATE" ]]; then export restart_interval="3 -1"; fi +fi export OUTPUT_FILETYPES="$OUTPUT_FILE" if [[ "$OUTPUT_FILE" == "netcdf" ]]; then @@ -82,4 +84,7 @@ if [[ "$OUTPUT_FILE" == "netcdf" ]]; then fi fi +# wave model +export cplwav=.false. + echo "END: config.efcs" diff --git a/parm/config/config.epos b/parm/config/config.epos index cb3c0c7b3d..87ee7d4e35 100755 --- a/parm/config/config.epos +++ b/parm/config/config.epos @@ -16,4 +16,7 @@ if [ $l4densvar = ".false." ]; then export NEPOSGRP=3 fi +# Generate ensemble spread files +export ENKF_SPREAD="YES" + echo "END: config.epos" diff --git a/parm/config/config.fcst b/parm/config/config.fcst index e0640785b3..2759d6baf1 100755 --- a/parm/config/config.fcst +++ b/parm/config/config.fcst @@ -35,6 +35,13 @@ if [ $QUILTING = ".true." 
]; then export npe_fcst_gfs=$(echo " $npe_fcst_gfs + $WRITE_GROUP_GFS * $WRTTASK_PER_GROUP_GFS" | bc) fi +if [ $DO_WAVE = "YES" ] ; then + export npe_fcst=$((npe_fcst + npe_wav)) + if [ "$WAVE_CDUMP" = "gfs" -o "$WAVE_CDUMP" = "both" ]; then + export npe_fcst_gfs=$((npe_fcst_gfs + npe_wav)) + fi +fi + # Model configuration export TYPE="nh" export MONO="non-mono" @@ -191,7 +198,16 @@ if [[ "$CDUMP" == "gdas" ]] ; then # GDAS cycle specific parameters export restart_interval=${restart_interval:-6} # For IAU, write restarts at beginning of window also - if [ $DOIAU = "YES" ]; then export restart_interval="6 9"; fi + if [ $DOIAU = "YES" ]; then + export restart_interval="6 9" + if [[ "$SDATE" = "$CDATE" ]]; then export restart_interval="3 6"; fi + fi + + # Choose coupling with wave + if [ $DO_WAVE = YES ]; then export cplwav=.true.; fi + + # Turn on dry mass adjustment in GDAS + export adjust_dry_mass=".true." elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters @@ -204,6 +220,13 @@ elif [[ "$CDUMP" == "gfs" ]] ; then # GFS cycle specific parameters # Write gfs restart files to rerun fcst from any break point export restart_interval=${restart_interval_gfs:-0} + + # Choose coupling with wave + if [ $DO_WAVE = YES ]; then export cplwav=.true.; fi + + # Turn off dry mass adjustment in GFS + export adjust_dry_mass=".false." 
+ fi diff --git a/parm/config/config.fv3 b/parm/config/config.fv3 index ada4ba4c3e..e9e4dce8b7 100755 --- a/parm/config/config.fv3 +++ b/parm/config/config.fv3 @@ -41,8 +41,8 @@ case $case_in in export layout_y=4 export layout_x_gfs=2 export layout_y_gfs=4 + export npe_wav=70 export nth_fv3=1 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="0.071,2.1,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=1 export WRTTASK_PER_GROUP=$npe_node_max @@ -56,8 +56,8 @@ case $case_in in export layout_y=4 export layout_x_gfs=4 export layout_y_gfs=4 + export npe_wav=70 export nth_fv3=1 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="0.14,1.8,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=1 export WRTTASK_PER_GROUP=$npe_node_max @@ -71,8 +71,8 @@ case $case_in in export layout_y=6 export layout_x_gfs=4 export layout_y_gfs=6 + export npe_wav=70 export nth_fv3=2 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="0.23,1.5,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=1 export WRTTASK_PER_GROUP=$npe_node_max @@ -86,8 +86,8 @@ case $case_in in export layout_y=8 export layout_x_gfs=6 export layout_y_gfs=6 + export npe_wav=270 export nth_fv3=1 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="1.1,0.72,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=1 export WRTTASK_PER_GROUP=$npe_node_max @@ -101,10 +101,9 @@ case $case_in in export layout_y=8 #JKH export layout_x_gfs=16 #JKH export layout_y_gfs=12 - export layout_x_gfs=12 #JKH + export npe_wav=270 export layout_y_gfs=8 #JKH export nth_fv3=4 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="4.0,0.15,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=2 export WRTTASK_PER_GROUP=$(echo "2*$npe_node_max" |bc) @@ -118,8 +117,8 @@ case $case_in in export layout_y=16 
export layout_x_gfs=8 export layout_y_gfs=16 + export npe_wav=270 export nth_fv3=4 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="4.0,0.10,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=4 export WRTTASK_PER_GROUP=$(echo "2*$npe_node_max" |bc) @@ -133,8 +132,8 @@ case $case_in in export layout_y=32 export layout_x_gfs=16 export layout_y_gfs=32 + export npe_wav=270 export nth_fv3=4 - export npe_node_fcst=$(echo "$npe_node_max/$nth_fv3" |bc) export cdmbgwd="4.0,0.05,1.0,1.0" # mountain blocking, ogwd, cgwd, cgwd src scaling export WRITE_GROUP=4 export WRTTASK_PER_GROUP=$(echo "3*$npe_node_max" |bc) diff --git a/parm/config/config.resources b/parm/config/config.resources index ab140ab329..42debef85c 100755 --- a/parm/config/config.resources +++ b/parm/config/config.resources @@ -8,8 +8,10 @@ if [ $# -ne 1 ]; then echo "Must specify an input task argument to set resource variables!" echo "argument can be any one of the following:" - echo "anal gldas fcst post vrfy metp arch" - echo "eobs eomg eupd ecen esfc efcs epos earc" + echo "anal analcalc analdiag gldas fcst post vrfy metp arch" + echo "eobs ediag eomg eupd ecen esfc efcs epos earc" + echo "waveinit waveprep wavepostsbs wavegempaksbs waveawipssbs" + echo "wavepost waveawips wavestat" echo "postsnd awips gempak" exit 1 @@ -36,17 +38,103 @@ if [ $step = "prep" -o $step = "prepbufr" ]; then eval "export npe_node_$step=4" eval "export nth_$step=1" +elif [ $step = "waveinit" ]; then + + export wtime_waveinit="00:10:00" + export npe_waveinit=10 + export nth_waveinit=1 + export npe_node_waveinit=$(echo "$npe_node_max / $nth_waveinit" | bc) + export NTASKS=${npe_waveinit} + +elif [ $step = "waveprep" ]; then + + export wtime_waveprep="00:30:00" + export npe_waveprep=115 + export nth_waveprep=1 + export npe_node_waveprep=$(echo "$npe_node_max / $nth_waveprep" | bc) + export NTASKS=${npe_waveprep} + +elif [ $step = "wavepostsbs" ]; then + + export 
wtime_wavepostsbs="06:00:00" + export npe_wavepostsbs=280 + export nth_wavepostsbs=1 + export npe_node_wavepostsbs=$(echo "$npe_node_max / $nth_wavepostsbs" | bc) + export NTASKS=${npe_wavepostsbs} + +elif [ $step = "wavegempaksbs" ]; then + + export wtime_wavegempaksbs="06:00:00" + export npe_wavegempaksbs=$npe_node_max + export nth_wavegempaksbs=1 + export npe_node_wavegempaksbs=$(echo "$npe_node_max / $nth_wavegempaksbs" | bc) + export NTASKS=${npe_wavegempaksbs} + +elif [ $step = "waveawipssbs" ]; then + + export wtime_waveawipssbs="08:00:00" + export npe_waveawipssbs=$npe_node_max + export nth_waveawipssbs=1 + export npe_node_waveawipssbs=$(echo "$npe_node_max / $nth_waveawipssbs" | bc) + export NTASKS=${npe_waveawipssbs} + +elif [ $step = "wavepost" ]; then + + export wtime_wavepost="01:00:00" + export npe_wavepost=560 + export nth_wavepost=1 + export npe_node_wavepost=$(echo "$npe_node_max / $nth_wavepost" | bc) + export NTASKS=${npe_wavepost} + +elif [ $step = "waveawips" ]; then + + export wtime_waveawips="06:00:00" + export npe_waveawips=$npe_node_max + export nth_waveawips=1 + export npe_node_waveawips=$(echo "$npe_node_max / $nth_waveawips" | bc) + export NTASKS=${npe_waveawips} + +elif [ $step = "wavestat" ]; then + + export wtime_wavestat="01:00:00" + export npe_wavestat=$npe_node_max + export nth_wavestat=1 + export npe_node_wavestat=$(echo "$npe_node_max / $nth_wavestat" | bc) + export NTASKS=${npe_wavestat} + elif [ $step = "anal" ]; then export wtime_anal="02:00:00" export npe_anal=800 - if [ $CASE = "C384" -o $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then export npe_anal=84; fi export nth_anal=4 + if [ $CASE = "C384" ]; then + export npe_anal=160 + export nth_anal=10 + fi + if [ $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then export npe_anal=84; fi if [[ "$machine" = "WCOSS_DELL_P3" ]]; then export nth_anal=7; fi export npe_node_anal=$(echo "$npe_node_max / $nth_anal" | bc) export nth_cycle=$npe_node_max if [[ "$machine" == 
"WCOSS_C" ]]; then export memory_anal="3072M"; fi +elif [ $step = "analcalc" ]; then + + export wtime_analcalc="02:00:00" + export npe_analcalc=127 + export nth_analcalc=1 + export npe_node_analcalc=$npe_node_max + if [[ "$machine" = "WCOSS_DELL_P3" ]]; then export npe_analcalc=127 ; fi + if [[ "$machine" == "WCOSS_C" ]]; then export memory_analcalc="3072M"; fi + +elif [ $step = "analdiag" ]; then + + export wtime_analdiag="02:00:00" + export npe_analdiag=60 + export nth_analdiag=1 + export npe_node_analdiag=$npe_node_max + if [[ "$machine" = "WCOSS_DELL_P3" ]]; then export npe_analdiag=56 ; fi + if [[ "$machine" == "WCOSS_C" ]]; then export memory_analdiag="3072M"; fi + elif [ $step = "gldas" ]; then export wtime_gldas="02:00:00" @@ -62,11 +150,11 @@ elif [ $step = "gldas" ]; then elif [ $step = "fcst" ]; then export wtime_fcst="01:00:00" - export wtime_fcst_gfs="8:00:00" + export wtime_fcst_gfs="08:00:00" export npe_fcst=$(echo "$layout_x * $layout_y * 6" | bc) export npe_fcst_gfs=$(echo "$layout_x_gfs * $layout_y_gfs * 6" | bc) export nth_fcst=${nth_fv3:-2} - export npe_node_fcst=${npe_node_fcst:-12} + export npe_node_fcst=$(echo "$npe_node_max / $nth_fcst" | bc) if [[ "$machine" == "WCOSS_C" ]]; then export memory_fcst="1024M"; fi elif [ $step = "post" ]; then @@ -120,6 +208,7 @@ elif [ $step = "arch" -o $step = "earc" -o $step = "getic" ]; then elif [ $step = "eobs" -o $step = "eomg" ]; then + export wtime_eobs="00:30:00" export wtime_eomg="01:00:00" if [ $CASE = "C768" ]; then @@ -136,6 +225,15 @@ elif [ $step = "eobs" -o $step = "eomg" ]; then export npe_node_eobs=$(echo "$npe_node_max / $nth_eobs" | bc) if [[ "$machine" == "WCOSS_C" ]]; then export memory_eobs="3072M"; fi +elif [ $step = "ediag" ]; then + + export wtime_ediag="02:00:00" + export npe_ediag=60 + export nth_ediag=1 + export npe_node_ediag=$npe_node_max + if [[ "$machine" = "WCOSS_DELL_P3" ]]; then export npe_ediag=56 ; fi + if [[ "$machine" == "WCOSS_C" ]]; then export memory_ediag="3072M"; fi 
+ elif [ $step = "eupd" ]; then export wtime_eupd="01:30:00" @@ -146,8 +244,15 @@ elif [ $step = "eupd" ]; then export nth_eupd=9 fi elif [ $CASE = "C384" ]; then - export npe_eupd=84 + export npe_eupd=270 export nth_eupd=2 + if [[ "$machine" = "WCOSS_DELL_P3" ]]; then + export nth_eupd=9 + fi + if [[ "$machine" = "HERA" ]]; then + export npe_eupd=84 + export nth_eupd=10 + fi elif [ $CASE = "C192" -o $CASE = "C96" -o $CASE = "C48" ]; then export npe_eupd=42 export nth_eupd=2 @@ -198,14 +303,14 @@ elif [ $step = "postsnd" ]; then export wtime_postsnd="02:00:00" export npe_postsnd=40 - export nth_postsnd=1 + export nth_postsnd=1 export npe_node_postsnd=4 - export npe_postsndcfp=10 + export npe_postsndcfp=9 export npe_node_postsndcfp=3 - if [ $OUTPUT_FILE == "nemsio" ]; then - export npe_postsnd=13 - export npe_node_postsnd=4 - fi + if [ $OUTPUT_FILE == "nemsio" ]; then + export npe_postsnd=13 + export npe_node_postsnd=4 + fi if [[ "$machine" == "WCOSS_C" ]]; then export memory_postsnd="254M"; fi elif [ $step = "awips" ]; then diff --git a/parm/config/config.wave b/parm/config/config.wave new file mode 100755 index 0000000000..0307a73af5 --- /dev/null +++ b/parm/config/config.wave @@ -0,0 +1,131 @@ +#!/bin/ksh -x + +########## config.wave ########## +# Wave steps specific + +# Parameters that are common to all wave model steps + +# System and version +export wave_sys_ver=v1.0.0 + +# General runtime labels +# export WAV_MOD_ID=${WAV_MOD_ID:-wave} # generic modID=wave valid for GFSv16 and beyond +# COMPONENTwave stands for model component, in addition to NET/RUN for coupled systems +export COMPONENTwave=${COMPONENTwave:-${RUN}wave} + +# In GFS/GDAS, restart files are generated/read from gdas runs +# Can I use rCDUMP here???? 
+export COMPONENTRSTwave=${COMPONENTRSTwave:-gdaswave} + +# Grids for wave model +export waveGRD='glo_10m aoc_9km ant_9km' +export waveGRDN='1 2 3' # gridnumber for ww3_multi +export waveGRDG='10 20 30' # gridgroup for ww3_multi + +# ESMF input grid +export waveesmfGRD='glox_10m' # input grid + +# Grids for input fields +export WAVEICE_DID=sice +export WAVEICE_FID=icean_5m +export WAVECUR_DID=rtofs +export WAVECUR_FID=rtofs_5m +export WAVEWND_DID= +export WAVEWND_FID= + +# Grids for output fields +export waveuoutpGRD=points +export waveinterpGRD='glo_15mxt' # Grids that need to be interpolated from native + # in POST will generate grib unless gribOK not set +export wavesbsGRD='' # side-by-side grids generated as wave model runs, writes to com +export wavepostGRD='glo_10m aoc_9km ant_9km' # Native grids that will be post-processed (grib2) + +# CDATE +export CDATE=${PDY}${cyc} + +# The start time reflects the number of hindcast hours prior to the cycle initial time +if [ "$CDUMP" = "gdas" ] +then + export FHMAX_WAV=${FHMAX_WAV:-9} +fi +export WAVHINDH=${WAVHINDH:-0} +export FHMIN_WAV=${FHMIN_WAV:-0} +export FHOUT_WAV=${FHOUT_WAV:-3} +export FHMAX_WAV=${FHMAX_WAV:-384} +export FHMAX_HF_WAV=${FHMAX_HF_WAV:-120} +export FHOUT_HF_WAV=${FHOUT_HF_WAV:-1} + +# Output stride +export WAV_WND_HOUR_INC=1 # This value should match with the one used in + # the wind update script +# gridded and point output rate +export DTFLD_WAV=`expr $FHOUT_HF_WAV \* 3600` +export DTPNT_WAV=3600 +export FHINCP_WAV=`expr $DTPNT_WAV / 3600` + +# Selected output parameters (gridded) +export OUTPARS_WAV="WND CUR ICE HS T01 T02 DIR FP DP PHS PTP PDIR CHAR" +# GFS # export OUTPARS_WAV='WND CUR ICE HS T01 T02 DIR FP DP PHS PTP PDIR CHAR' + +# Options for point output (switch on/off boundary point output) +export DOIBP_WAV='NO' + +# Intake currents settings +export WAV_CUR_DT=${WAV_CUR_DT:-3} +export WAV_CUR_HF_DT=${WAV_CUR_HF_DT:-1} +export WAV_CUR_HF_FH=${WAV_CUR_HF_FH:-72} +export 
WAV_CUR_CDO_SMOOTH="NO" + +# Number of cycles to look back for restart files +export nback= + +# Restart file config +if [ "$CDUMP" = "gdas" ] +then + WAVNCYC=4 + WAVHCYC=6 + FHMAX_WAV_CUR=${FHMAX_WAV_CUR:-48} # RTOFS forecasts only out to 8 days +elif [ ${gfs_cyc} -ne 0 ] +then + FHMAX_WAV_CUR=${FHMAX_WAV_CUR:-192} # RTOFS forecasts only out to 8 days + WAVHCYC=`expr 24 / ${gfs_cyc}` +else + WAVHCYC=0 + FHMAX_WAV_CUR=${FHMAX_WAV_CUR:-192} # RTOFS forecasts only out to 8 days +fi +export FHMAX_WAV_CUR WAVHCYC WAVNCYC + +# Restart timing business +export RSTTYPE_WAV='T' # generate second tier of restart files +export DT_1_RST_WAV=10800 # time between restart files, set to DTRST=1 for a single restart file +export DT_2_RST_WAV=43200 # restart stride for checkpointing restart +export RSTIOFF_WAV=0 # first restart file offset relative to model start +# +# Set runmember to default value if not GEFS cpl run +# (for a GFS coupled run, RUNMEM would be unset, this should default to -1) +export RUNMEM=${RUNMEM:--1} +# Set wave model member tags if ensemble run +# -1: no suffix, deterministic; xxxNN: extract two last digits to make ofilename prefix=gwesNN +if [ $RUNMEM = -1 ]; then +# No suffix added to model ID in case of deterministic run + export waveMEMB= +else +# Extract member number only + export waveMEMB=`echo $RUNMEM | grep -o '..$'` +fi + +# Determine if wave component needs input and/or is coupled +export WW3ATMINP='CPL' +export WW3ICEINP='YES' +export WW3CURINP='YES' + +if [ "${WW3ICEINP}" = "YES" ]; then + export WAVICEFILE=${CDUMP}.t${cyc}z.seaice.5min.grib2 +fi + +# Determine if input is from perturbed ensemble (T) or single input file (F) for all members +export WW3ATMIENS='F' +export WW3ICEIENS='F' +export WW3CURIENS='F' + +echo "END: config.wave" diff --git a/parm/config/config.waveinit b/parm/config/config.waveinit new file mode 100755 index 0000000000..93960e5e25 --- /dev/null +++ b/parm/config/config.waveinit @@ -0,0 +1,14 @@ +#!/bin/ksh -x + 
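The restart-stride branch added to config.wave reduces to WAVHCYC = 24 / gfs_cyc hours for gfs runs and a fixed 6-hourly stride for gdas. A standalone sketch of that branch (the CDUMP and gfs_cyc values are hypothetical):

```shell
#!/bin/bash
# Wave cycling stride as derived in config.wave: gdas restarts 6-hourly;
# gfs divides the day by the number of daily gfs cycles.
CDUMP=gfs
gfs_cyc=4          # 0: no gfs cycle, 1: 00Z only, 2: 00Z+12Z, 4: all cycles
if [ "$CDUMP" = "gdas" ]; then
  WAVHCYC=6
elif [ "$gfs_cyc" -ne 0 ]; then
  WAVHCYC=$(expr 24 / ${gfs_cyc})   # e.g. 4 cycles/day -> 6 h between restarts
else
  WAVHCYC=0                          # no gfs cycle, no stride
fi
echo "WAVHCYC=$WAVHCYC"
```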
+########## config.waveinit ########## +# Wave steps specific + +echo "BEGIN: config.waveinit" + +# Get task specific resources +. $EXPDIR/config.resources waveinit + +# Step label +export sigMODE=${sigMODE:-init} + +echo "END: config.waveinit" diff --git a/parm/config/config.wavepostsbs b/parm/config/config.wavepostsbs new file mode 100755 index 0000000000..9bfe255f4b --- /dev/null +++ b/parm/config/config.wavepostsbs @@ -0,0 +1,12 @@ +#!/bin/ksh -x + +########## config.wavepostsbs ########## +# Wave steps specific + +echo "BEGIN: config.wavepostsbs" + +# Get task specific resources +. $EXPDIR/config.resources wavepostsbs + + +echo "END: config.wavepostsbs" diff --git a/parm/config/config.waveprep b/parm/config/config.waveprep new file mode 100755 index 0000000000..50a03969a0 --- /dev/null +++ b/parm/config/config.waveprep @@ -0,0 +1,29 @@ +#!/bin/ksh -x + +########## config.waveprep ########## +# Wave steps specific + +echo "BEGIN: config.waveprep" + +# Get task specific resources +. $EXPDIR/config.resources waveprep + +# Step label +export sigMODE=${sigMODE:-prep} + +export HOUR_INC=3 # This value should match with the one used in + # the wind update script +export GOFILETYPE=1 # GOFILETYPE=1 one gridded file per output step +export POFILETYPE=1 # POFILETYPE=1 one point file per output step + +# Parameters for ww3_multi.inp +# Unified output T or F +export FUNIPNT='T' +# Unified output server type (see ww3_multi.inp in WW3 repo) +export PNTSRV='1' +# Flag for dedicated output process for unified points +export FPNTPROC='T' +# Flag for grids sharing dedicated output processes +export FGRDPROC='F' + +echo "END: config.waveprep" diff --git a/scripts/exgfs_postsnd.sh.ecf b/scripts/exgfs_postsnd.sh.ecf index 4015c4764f..2fc8d91188 100755 --- a/scripts/exgfs_postsnd.sh.ecf +++ b/scripts/exgfs_postsnd.sh.ecf @@ -138,7 +138,7 @@ fi # add appropriate WMO Headers. 
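The config.wave block earlier in this patch tags ensemble members by keeping the last two characters of RUNMEM, with -1 meaning a deterministic run. A minimal sketch of that extraction (the RUNMEM value below is a hypothetical member label):

```shell
#!/bin/bash
# Member suffix handling as in config.wave: RUNMEM=-1 means deterministic
# (empty suffix); otherwise the trailing two digits become the filename tag.
RUNMEM="gwes03"    # hypothetical ensemble member label
if [ "$RUNMEM" = "-1" ]; then
  waveMEMB=                                     # deterministic: no suffix
else
  waveMEMB=$(echo "$RUNMEM" | grep -o '..$')    # last two characters
fi
echo "waveMEMB=$waveMEMB"
```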
######################################## collect=' 1 2 3 4 5 6 7 8 9' -if [ $machine == "JET" ]; then +if [ $machine == "HERA" -o $machine == "JET" ]; then for m in ${collect} do sh $USHbufrsnd/gfs_sndp.sh $m @@ -151,7 +151,6 @@ sh $USHbufrsnd/gfs_bfr2gpk.sh else rm -rf poe_col -echo "sh $USHbufrsnd/gfs_bfr2gpk.sh " >> poe_col for m in ${collect} do echo "sh $USHbufrsnd/gfs_sndp.sh $m " >> poe_col @@ -162,8 +161,9 @@ mv poe_col cmdfile cat cmdfile chmod +x cmdfile -##mpirun -n 10 cfp cmdfile ${APRUN_POSTSNDCFP} cmdfile + +sh $USHbufrsnd/gfs_bfr2gpk.sh fi ################################################ # Convert the bufr soundings into GEMPAK files diff --git a/scripts/exglobal_fcst_nemsfv3gfs.sh b/scripts/exglobal_fcst_nemsfv3gfs.sh index e65a1976aa..f67e4c19db 100755 --- a/scripts/exglobal_fcst_nemsfv3gfs.sh +++ b/scripts/exglobal_fcst_nemsfv3gfs.sh @@ -18,6 +18,8 @@ # 2017-09-13 Fanglin Yang Updated for using GFDL MP and Write Component # 2019-03-05 Rahul Mahajan Implemented IAU # 2019-03-21 Fanglin Yang Add restart capability for running gfs fcst from a break point. +# 2019-12-12 Henrique Alves Added wave model blocks for coupled run +# 2020-01-31 Henrique Alves Added IAU capability for wave component # # $Id$ # @@ -36,7 +38,7 @@ fi machine=${machine:-"WCOSS_C"} machine=$(echo $machine | tr '[a-z]' '[A-Z]') -# Cycling and forecast hour specific parameters +# Cycling and forecast hour specific parameters CASE=${CASE:-C768} CDATE=${CDATE:-2017032500} CDUMP=${CDUMP:-gdas} @@ -99,6 +101,9 @@ FCSTEXEC=${FCSTEXEC:-fv3_gfs.x} PARM_FV3DIAG=${PARM_FV3DIAG:-$HOMEgfs/parm/parm_fv3diag} PARM_POST=${PARM_POST:-$HOMEgfs/parm/post} +# Wave coupling parameter defaults to false +cplwav=${cplwav:-.false.} + # Model config options APRUN_FV3=${APRUN_FV3:-${APRUN_FCST:-${APRUN:-""}}} NTHREADS_FV3=${NTHREADS_FV3:-${NTHREADS_FCST:-${nth_fv3:-1}}} @@ -239,71 +244,71 @@ if [ $warm_start = ".true." -o $RERUN = "YES" ]; then #............................. 
# Link all (except sfc_data) restart files from $gmemdir - for file in $gmemdir/RESTART/${sPDY}.${scyc}0000.*.nc; do - file2=$(echo $(basename $file)) - file2=$(echo $file2 | cut -d. -f3-) # remove the date from file - fsuf=$(echo $file2 | cut -d. -f1) - if [ $fsuf != "sfc_data" ]; then - $NLN $file $DATA/INPUT/$file2 - fi - done + for file in $(ls $gmemdir/RESTART/${sPDY}.${scyc}0000.*.nc); do + file2=$(echo $(basename $file)) + file2=$(echo $file2 | cut -d. -f3-) # remove the date from file + fsuf=$(echo $file2 | cut -d. -f1) + if [ $fsuf != "sfc_data" ]; then + $NLN $file $DATA/INPUT/$file2 + fi + done # Link sfcanl_data restart files from $memdir - for file in $memdir/RESTART/${sPDY}.${scyc}0000.*.nc; do - file2=$(echo $(basename $file)) - file2=$(echo $file2 | cut -d. -f3-) # remove the date from file - fsufanl=$(echo $file2 | cut -d. -f1) - if [ $fsufanl = "sfcanl_data" ]; then - file2=$(echo $file2 | sed -e "s/sfcanl_data/sfc_data/g") - $NLN $file $DATA/INPUT/$file2 - fi - done + for file in $(ls $memdir/RESTART/${sPDY}.${scyc}0000.*.nc); do + file2=$(echo $(basename $file)) + file2=$(echo $file2 | cut -d. -f3-) # remove the date from file + fsufanl=$(echo $file2 | cut -d. 
-f1) + if [ $fsufanl = "sfcanl_data" ]; then + file2=$(echo $file2 | sed -e "s/sfcanl_data/sfc_data/g") + $NLN $file $DATA/INPUT/$file2 + fi + done # Need a coupler.res when doing IAU - if [ $DOIAU = "YES" ]; then - rm -f $DATA/INPUT/coupler.res - cat >> $DATA/INPUT/coupler.res << EOF + if [ $DOIAU = "YES" ]; then + rm -f $DATA/INPUT/coupler.res + cat >> $DATA/INPUT/coupler.res << EOF 2 (Calendar: no_calendar=0, thirty_day_months=1, julian=2, gregorian=3, noleap=4) ${gPDY:0:4} ${gPDY:4:2} ${gPDY:6:2} ${gcyc} 0 0 Model start time: year, month, day, hour, minute, second ${sPDY:0:4} ${sPDY:4:2} ${sPDY:6:2} ${scyc} 0 0 Current model time: year, month, day, hour, minute, second EOF - fi + fi # Link increments - if [ $DOIAU = "YES" ]; then - for i in $(echo $IAUFHRS | sed "s/,/ /g" | rev); do - incfhr=$(printf %03i $i) - if [ $incfhr = "006" ]; then - increment_file=$memdir/${CDUMP}.t${cyc}z.atminc.nc - else - increment_file=$memdir/${CDUMP}.t${cyc}z.atmi${incfhr}.nc + if [ $DOIAU = "YES" ]; then + for i in $(echo $IAUFHRS | sed "s/,/ /g" | rev); do + incfhr=$(printf %03i $i) + if [ $incfhr = "006" ]; then + increment_file=$memdir/${CDUMP}.t${cyc}z.atminc.nc + else + increment_file=$memdir/${CDUMP}.t${cyc}z.atmi${incfhr}.nc + fi + if [ ! -f $increment_file ]; then + echo "ERROR: DOIAU = $DOIAU, but missing increment file for fhr $incfhr at $increment_file" + echo "Abort!" + exit 1 + fi + $NLN $increment_file $DATA/INPUT/fv_increment$i.nc + IAU_INC_FILES="'fv_increment$i.nc',$IAU_INC_FILES" + done + read_increment=".false." + res_latlon_dynamics="" + else + increment_file=$memdir/${CDUMP}.t${cyc}z.atminc.nc + if [ -f $increment_file ]; then + $NLN $increment_file $DATA/INPUT/fv_increment.nc + read_increment=".true." + res_latlon_dynamics="fv_increment.nc" fi - if [ ! -f $increment_file ]; then - echo "ERROR: DOIAU = $DOIAU, but missing increment file for fhr $incfhr at $increment_file" - echo "Abort!" 
- exit 1 - fi - $NLN $increment_file $DATA/INPUT/fv_increment$i.nc - IAU_INC_FILES="'fv_increment$i.nc',$IAU_INC_FILES" - done - read_increment=".false." - res_latlon_dynamics="" - else - increment_file=$memdir/${CDUMP}.t${cyc}z.atminc.nc - if [ -f $increment_file ]; then - $NLN $increment_file $DATA/INPUT/fv_increment.nc - read_increment=".true." - res_latlon_dynamics="fv_increment.nc" fi - fi - + #............................. else ##RERUN export warm_start=".true." PDYT=$(echo $CDATE_RST | cut -c1-8) cyct=$(echo $CDATE_RST | cut -c9-10) - for file in $RSTDIR_TMP/${PDYT}.${cyct}0000.*; do + for file in $(ls $RSTDIR_TMP/${PDYT}.${cyct}0000.*); do file2=$(echo $(basename $file)) file2=$(echo $file2 | cut -d. -f3-) $NLN $file $DATA/INPUT/$file2 @@ -314,7 +319,7 @@ EOF else ## cold start - for file in $memdir/INPUT/*.nc; do + for file in $(ls $memdir/INPUT/*.nc); do file2=$(echo $(basename $file)) fsuf=$(echo $file2 | cut -c1-3) if [ $fsuf = "gfs" -o $fsuf = "sfc" ]; then @@ -380,6 +385,71 @@ if [ $IAER -gt 0 ] ; then done fi +#### Copy over WW3 inputs +if [ $cplwav = ".true." 
]; then +# Link WW3 files + for file in $(ls $COMINWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/rmp_src_to_dst_conserv_*) ; do + $NLN $file $DATA/ + done + $NLN $COMINWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/ww3_multi.${COMPONENTwave}${WAV_MEMBER}.${cycle}.inp $DATA/ww3_multi.inp + # Check for expected wave grids for this run + array=($WAVECUR_FID $WAVEICE_FID $WAVEWND_FID $waveuoutpGRD $waveGRD $waveesmfGRD $wavesbsGRD $wavepostGRD $waveinterpGRD) + grdALL=`printf "%s\n" "${array[@]}" | sort -u | tr '\n' ' '` + for wavGRD in ${grdALL}; do + # Wave IC (restart) file must exist for warm start on this cycle, if not wave model starts from flat ocean + # For IAU needs to use sPDY for adding IAU backup of 3h + $NLN $COMINWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/${COMPONENTwave}.mod_def.$wavGRD $DATA/mod_def.$wavGRD + done + # Wave IC (restart) interval assumes 4 daily cycles (restarts only written by gdas cycle) + # WAVCYCH needs to be consistent with restart write interval in ww3_multi.inp or will FAIL + WAVCYCH=${WAVCYCH:-6} + WRDATE=`$NDATE -${WAVCYCH} $CDATE` + WRPDY=`echo $WRDATE | cut -c1-8` + WRcyc=`echo $WRDATE | cut -c9-10` + WRDIR=$COMINWW3/${COMPONENTRSTwave}.${WRPDY}/${WRcyc}/restart + datwave=$COMOUTWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/ + wavprfx=${COMPONENTwave}${WAV_MEMBER} + for wavGRD in $waveGRD ; do + # Link wave IC for current cycle + $NLN ${WRDIR}/${sPDY}.${scyc}0000.restart.${wavGRD} $DATA/restart.${wavGRD} + eval $NLN $datwave/${wavprfx}.log.${wavGRD}.${PDY}${cyc} log.${wavGRD} + done + if [ "$WW3ICEINP" = "YES" ]; then + $NLN $COMINWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/${COMPONENTwave}.${WAVEICE_FID}.${cycle}.ice $DATA/ice.${WAVEICE_FID} + fi + if [ "$WW3CURINP" = "YES" ]; then + $NLN $COMINWW3/${COMPONENTwave}.${PDY}/${cyc}/rundata/${COMPONENTwave}.${WAVECUR_FID}.${cycle}.cur $DATA/current.${WAVECUR_FID} + fi +# Link output files + cd $DATA + eval $NLN $datwave/${wavprfx}.log.mww3.${PDY}${cyc} log.mww3 +# Loop for gridded output 
(uses FHINC) + fhr=$FHMIN_WAV + while [ $fhr -le $FHMAX_WAV ]; do + YMDH=`$NDATE $fhr $CDATE` + YMD=$(echo $YMDH | cut -c1-8) + HMS="$(echo $YMDH | cut -c9-10)0000" + for wavGRD in ${waveGRD} ; do + eval $NLN $datwave/${wavprfx}.out_grd.${wavGRD}.${YMD}.${HMS} ${YMD}.${HMS}.out_grd.${wavGRD} + done + FHINC=$FHOUT_WAV + if [ $FHMAX_HF_WAV -gt 0 -a $FHOUT_HF_WAV -gt 0 -a $fhr -lt $FHMAX_HF_WAV ]; then + FHINC=$FHOUT_HF_WAV + fi + fhr=$((fhr+FHINC)) + done +# Loop for point output (uses DTPNT) + fhr=$FHMIN_WAV + while [ $fhr -le $FHMAX_WAV ]; do + YMDH=`$NDATE $fhr $CDATE` + YMD=$(echo $YMDH | cut -c1-8) + HMS="$(echo $YMDH | cut -c9-10)0000" + eval $NLN $datwave/${wavprfx}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} ${YMD}.${HMS}.out_pnt.${waveuoutpGRD} + FHINC=$FHINCP_WAV + fhr=$((fhr+FHINC)) + done +fi + # inline post fix files if [ $WRITE_DOPOST = ".true." ]; then $NLN $PARM_POST/post_tag_gfs${LEVS} $DATA/itag @@ -607,6 +677,51 @@ $NCP $FIELD_TABLE field_table #------------------------------------------------------------------ rm -f nems.configure + +if [ $cplwav = ".true." ]; then +#### ww3 version of nems.configure + +# Switch on cpl flag + cpl=.true. + +NTASKS_FV3m1=$((NTASKS_FV3-1)) +atm_petlist_bounds=" 0 $((NTASKS_FV3-1))" +wav_petlist_bounds=" $((NTASKS_FV3)) $((NTASKS_FV3m1+npe_wav))" +### atm_petlist_bounds=" 0 1511" +### atm_petlist_bounds=$atm_petlist_bounds +### wav_petlist_bounds="1512 1691" +### wav_petlist_bounds=$wav_petlist_bounds + coupling_interval_sec=${coupling_interval_sec:-1800} + rm -f nems.configure +cat > nems.configure < WAV + WAV + @ +:: +EOF +else +#### fv3 standalone version of nems.configure cat > nems.configure < model_configure <> input.nml <> $wavelog + err=2;export err;${errchk} + fi + + [[ ! 
-d $COMOUT/rundata ]] && mkdir -m 775 -p $COMOUT/rundata + echo "$USHwave/wave_grid_moddef.sh $grdID > $grdID.out 2>&1" >> cmdfile + + nmoddef=`expr $nmoddef + 1` + + fi + done + +# 1.a.1 Execute parallel or serial poe + + if [ "$nmoddef" -gt '0' ] + then + + set +x + echo ' ' + echo " Generating $nmoddef mod def files" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# Set number of processes for mpmd + wavenproc=`wc -l cmdfile | awk '{print $1}'` + wavenproc=`echo $((${wavenproc}<${NTASKS}?${wavenproc}:${NTASKS}))` + +# 1.a.2 Execute the serial or parallel cmdfile + + set +x + echo ' ' + echo " Executing the mod_def command file at : `date`" + echo ' ------------------------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ "$NTASKS" -gt '1' ] + then + ${wavempexec} ${wavenproc} ${wave_mpmd} cmdfile + exit=$? + else + ./cmdfile + exit=$? + fi + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** POE FAILURE DURING RAW DATA COPYING ***' + echo '********************************************' + echo ' See Details Below ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + fi + + fi + +# 1.a.3 File check + + for grdID in ${grdALL} + do + if [ -f ${COMOUT}/rundata/${COMPONENTwave}.mod_def.$grdID ] + then + set +x + echo ' ' + echo " mod_def.$grdID successfully created/copied " + echo ' ' + [[ "$LOUD" = YES ]] && set -x + else + msg="ABNORMAL EXIT: NO MODEL DEFINITION FILE" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '********************************************** ' + echo '*** FATAL ERROR : NO MODEL DEFINITION FILE *** ' + echo '********************************************** ' + echo " grdID = $grdID" + echo ' ' + echo $msg + sed "s/^/$grdID.out : /g" $grdID.out + [[ "$LOUD" = YES ]] && set -x + echo "$COMPONENTwave prep $date $cycle : mod_def.$grdID missing." 
>> $wavelog + err=3;export err;${errchk} + fi + done + +# --------------------------------------------------------------------------- # +# 2. Ending + + set +x + echo ' ' + echo "Ending at : `date`" + echo ' ' + echo ' *** End of MWW3 Init Config ***' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + msg="$job completed normally" + postmsg "$jlogfile" "$msg" + +# End of MWW3 init config script ------------------------------------------- # diff --git a/scripts/exwave_post_sbs.sh b/scripts/exwave_post_sbs.sh new file mode 100755 index 0000000000..6f1a1dec38 --- /dev/null +++ b/scripts/exwave_post_sbs.sh @@ -0,0 +1,771 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: exwave_post_sbs.sh +# Script description: Creates output products from binary WW3 data +# +# Author: Jose-Henrique Alves Org: NCEP/EMC Date: 2019-12-06 +# Abstract: This script is the postprocessor for the wave component in GFS. +# This version runs side-by-side with the GFS fcst step. +# It executes several scripts for preparing and creating output data +# as follows: +# +# wave_grib2_sbs.sh : generates GRIB2 files. +# wave_outp_spec.sh : generates spectral data for output locations. +# wave_outp_bull.sh : generates bulletins for output locations. +# wave_grid_interp_ush.sh : interpolates data from new grids to old grids +# wave_tar.sh : tars the spectral and bulletin multiple files +# +# Script history log: +# 2019-12-06 J-Henrique Alves First Version adapted from HTolman post.sh 2007 +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (Bash) Shell +# Machine: WCOSS-DELL-P3 +# +############################################################################### +# +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + set -x + # Use LOUD variable to turn on/off trace. Defaults to YES (on). 
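The wave init script above sizes its MPMD launch as the smaller of the cmdfile line count and NTASKS, using an arithmetic ternary. A self-contained sketch of that cap (the NTASKS value and example commands are illustrative):

```shell
#!/bin/bash
# Cap MPMD width at min(number of commands, allotted tasks), as in the
# wave_grid_moddef cmdfile logic. Values here are illustrative only.
NTASKS=10
printf 'echo a\necho b\necho c\n' > cmdfile   # three example commands
wavenproc=$(wc -l < cmdfile)
wavenproc=$(( wavenproc < NTASKS ? wavenproc : NTASKS ))
echo "wavenproc=$wavenproc"
rm -f cmdfile
```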
+ export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $DATA + + postmsg "$jlogfile" "HAS BEGUN on `hostname`" + + msg="Starting WAVE POSTPROCESSOR SCRIPT for $WAV_MOD_TAG" + postmsg "$jlogfile" "$msg" + + set +x + echo ' ' + echo ' *********************************' + echo ' *** WAVE POSTPROCESSOR SCRIPT ***' + echo ' *********************************' + echo ' ' + echo "Starting at : `date`" + echo '-------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# Script will run only if NTASKS is pre-defined +# The actual work is distributed over these tasks. + if [ -z "${NTASKS}" ] + then + echo "FATAL ERROR: requires NTASKS to be set " + err=1; export err;${errchk} + exit $err + fi + +# 0.c Defining model grids + + waveuoutpGRD=${waveuoutpGRD:?buoyNotSet} + +# 0.c.1 Grids + + export waveGRD=${waveGRD?Var waveGRD Not Set} + export wavesbsGRD=${wavesbsGRD?Var wavesbsGRD Not Set} + +# 0.c.3 extended global grid and rtma transfer grid + export waveinterpGRD=${waveinterpGRD?Var waveinterpGRD Not Set} + export wavepostGRD=${wavepostGRD?Var wavepostGRD Not Set} + +# 0.c.4 Define a temporary directory for storing ascii point output files +# and flush it + + export STA_DIR=$DATA/station_ascii_files + if [ -d $STA_DIR ] + then + rm -rf ${STA_DIR} + fi + mkdir -p ${STA_DIR} + mkdir -p ${STA_DIR}/spec + mkdir -p ${STA_DIR}/ibp + mkdir -p ${STA_DIR}/bull + mkdir -p ${STA_DIR}/cbull + + set +x + echo ' ' + echo 'Grid information :' + echo '-------------------' + echo " Native wave grids : $waveGRD" + echo " Side-by-side grids : $wavesbsGRD" + echo " Interpolated grids : $waveinterpGRD" + echo " Post-process grids : $wavepostGRD" + echo " Output points : $waveuoutpGRD" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + +# --------------------------------------------------------------------------- # +# 1.
Get files that are used by most child scripts + + fieldOK='yes' + pointOK='yes' + gribOK='yes' + grintOK='yes' + specOK='yes' + bullOK='yes' + + exit_code=0 + + set +x + echo ' ' + echo 'Preparing input files :' + echo '-----------------------' + [[ "$LOUD" = YES ]] && set -x + +# 1.a Model definition files and output files (set up using poe) + +# 1.a.1 Set up the parallel command tasks + + rm -f cmdfile + touch cmdfile + chmod 744 cmdfile + + [[ "$LOUD" = YES ]] && set -x + + iloop=1 + +# Copy model definition files + for grdID in $waveGRD $wavesbsGRD $wavepostGRD $waveinterpGRD $waveuoutpGRD + do + if [ -f "$COMIN/rundata/${COMPONENTwave}.mod_def.${grdID}" ] + then + set +x + echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." + [[ "$LOUD" = YES ]] && set -x + + cp -f $COMIN/rundata/${COMPONENTwave}.mod_def.${grdID} mod_def.$grdID + iloop=`expr $iloop + 1` + + fi + + done + + for grdID in $waveGRD $wavesbsGRD $wavepostGRD $waveinterpGRD $waveuoutpGRD + do + if [ ! -f mod_def.$grdID ] + then + set +x + echo ' ' + echo '*************************************************** ' + echo " FATAL ERROR : NO MOD_DEF FILE mod_def.$grdID " + echo '*************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $grdID $date $cycle : mod_def file missing." >> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO MOD_DEF file mod_def.$grdID" + fieldOK='no' + gribOK='no' + err=2; export err;${errchk} + exit $err + else + set +x + echo "File mod_def.$grdID found. Syncing to all nodes ..."
+ [[ "$LOUD" = YES ]] && set -x + $FSYNC mod_def.$grdID + fi + done + +# 1.c Output locations file + + rm -f buoy.loc + + if [ -f $FIXwave/wave_${NET}.buoys ] + then + cp -f $FIXwave/wave_${NET}.buoys buoy.loc.temp +# Reverse grep to exclude IBP points + sed -n '/^\$.*/!p' buoy.loc.temp | grep -v IBP > buoy.loc +# Grep to include IBP points + sed -n '/^\$.*/!p' buoy.loc.temp | grep IBP > buoy.ibp + rm -f buoy.loc.temp + fi + + if [ -s buoy.loc ] && [ -s buoy.ibp ] + then + set +x + echo " buoy.loc and buoy.ibp copied and processed ($FIXwave/wave_${NET}.buoys)." + [[ "$LOUD" = YES ]] && set -x + else + set +x + echo ' ' + echo '************************************* ' + echo ' FATAL ERROR : NO BUOY LOCATION FILE ' + echo '************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : buoy location file missing." >> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO BUOY LOCATION FILE" + pointOK='no' + specOK='no' + bullOK='no' + err=3; export err;${errchk} + exit $err + fi + +# 1.d Input template files + + if [ "$grintOK" = 'yes' ] + then + for intGRD in $waveinterpGRD + do + if [ -f $FIXwave/${intGRD}_interp.inp.tmpl ] + then + cp -f $FIXwave/${intGRD}_interp.inp.tmpl ${intGRD}_interp.inp.tmpl + fi + + if [ -f ${intGRD}_interp.inp.tmpl ] + then + set +x + echo " ${intGRD}_interp.inp.tmpl copied. Syncing to all nodes ..." + [[ "$LOUD" = YES ]] && set -x + $FSYNC ${intGRD}_interp.inp.tmpl + else + set +x + echo ' ' + echo '*********************************************** ' + echo '*** ERROR : NO TEMPLATE FOR GRINT INPUT FILE *** ' + echo '*********************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : GRINT template file missing."
>> $wavelog + postmsg "$jlogfile" "NON-FATAL ERROR : NO TEMPLATE FOR GRINT INPUT FILE" + exit_code=1 + grintOK='no' + fi + done + fi + + if [ "$gribOK" = 'yes' ] + then + for grbGRD in $waveinterpGRD $wavepostGRD + do + if [ -f $FIXwave/ww3_grib2.${grbGRD}.inp.tmpl ] + then + cp -f $FIXwave/ww3_grib2.${grbGRD}.inp.tmpl ww3_grib2.${grbGRD}.inp.tmpl + fi + + if [ -f ww3_grib2.${grbGRD}.inp.tmpl ] + then + set +x + echo " ww3_grib2.${grbGRD}.inp.tmpl copied. Syncing to all nodes ..." + [[ "$LOUD" = YES ]] && set -x + $FSYNC ww3_grib2.${grbGRD}.inp.tmpl + else + set +x + echo ' ' + echo '*********************************************** ' + echo "*** ERROR : NO TEMPLATE FOR ${grbGRD} GRIB INPUT FILE *** " + echo '*********************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : GRIB2 template file missing." >> $wavelog + postmsg "$jlogfile" "NON-FATAL ERROR : NO TEMPLATE FOR GRIB2 INPUT FILE" + exit_code=2 + gribOK='no' + fi + done + fi + + if [ -f $FIXwave/ww3_outp_spec.inp.tmpl ] + then + cp -f $FIXwave/ww3_outp_spec.inp.tmpl ww3_outp_spec.inp.tmpl + fi + + if [ -f ww3_outp_spec.inp.tmpl ] + then + set +x + echo " ww3_outp_spec.inp.tmpl copied. Syncing to all nodes ..." + [[ "$LOUD" = YES ]] && set -x + $FSYNC ww3_outp_spec.inp.tmpl + else + set +x + echo ' ' + echo '*********************************************** ' + echo '*** ERROR : NO TEMPLATE FOR SPEC INPUT FILE *** ' + echo '*********************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : ww3_outp_spec.inp.tmpl file missing." >> $wavelog + postmsg "$jlogfile" "NON-FATAL ERROR : NO TEMPLATE FOR SPEC INPUT FILE" + exit_code=3 + specOK='no' + bullOK='no' + fi + + if [ -f $FIXwave/ww3_outp_bull.inp.tmpl ] + then + cp -f $FIXwave/ww3_outp_bull.inp.tmpl ww3_outp_bull.inp.tmpl + fi + + if [ -f ww3_outp_bull.inp.tmpl ] + then + set +x + echo " ww3_outp_bull.inp.tmpl copied.
Syncing to all nodes ..." + [[ "$LOUD" = YES ]] && set -x + $FSYNC ww3_outp_bull.inp.tmpl + else + set +x + echo ' ' + echo '*************************************************** ' + echo '*** ERROR : NO TEMPLATE FOR BULLETIN INPUT FILE *** ' + echo '*************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : bulletin template file missing." >> $wavelog + postmsg "$jlogfile" "NON-FATAL ERROR : NO TEMPLATE FOR BULLETIN INPUT FILE" + exit_code=4 + bullOK='no' + fi + +# 1.e Getting buoy information for points + + if [ "$specOK" = 'yes' ] || [ "$bullOK" = 'yes' ] + then + ymdh=`$NDATE -${WAVHINDH} $CDATE` + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + dtspec=3600. # default time step (not used here) + sed -e "s/TIME/$tstart/g" \ + -e "s/DT/$dtspec/g" \ + -e "s/POINT/1/g" \ + -e "s/ITYPE/0/g" \ + -e "s/FORMAT/F/g" \ + ww3_outp_spec.inp.tmpl > ww3_outp.inp + + ln -s mod_def.$waveuoutpGRD mod_def.ww3 + fhr=$FHMIN_WAV + YMD=$(echo $CDATE | cut -c1-8) + HMS="$(echo $CDATE | cut -c9-10)0000" + tloop=0 + tloopmax=600 + tsleep=10 + while [ ${tloop} -le ${tloopmax} ] + do + if [ -f $COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} ] + then + ln -s $COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} ./out_pnt.${waveuoutpGRD} + break + else + sleep ${tsleep} + tloop=$(($tloop + $tsleep)) + fi + done + + rm -f buoy_tmp.loc buoy_log.ww3 ww3_oup.inp + ln -fs ./out_pnt.${waveuoutpGRD} ./out_pnt.ww3 + ln -fs ./mod_def.${waveuoutpGRD} ./mod_def.ww3 + $EXECcode/ww3_outp > buoy_lst.loc 2>&1 + err=$? + + if [ "$err" != '0' ] && [ ! 
-f buoy_log.ww3 ] + then + pgm=wave_post + msg="ABNORMAL EXIT: ERROR IN ww3_outp" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_outp *** ' + echo '******************************************** ' + echo ' ' + cat buoy_lst.loc + echo "$WAV_MOD_TAG post $date $cycle : buoy log file failed to be created." >> $wavelog + echo $msg + [[ "$LOUD" = YES ]] && set -x + err=4;export err;${errchk} + specOK='no' + bullOK='no' + exit $err + fi + +# Create buoy_log.dat from buoy_log.ww3, excluding all IBP points + cat buoy.loc | awk '{print $3}' | sed 's/'\''//g' > ibp_tags + grep -F -f ibp_tags buoy_log.ww3 > buoy_log.tmp + rm -f buoy_log.dat + mv buoy_log.tmp buoy_log.dat + + grep -F -f ibp_tags buoy_lst.loc > buoy_tmp1.loc + sed '$d' buoy_tmp1.loc > buoy_tmp2.loc + buoys=`awk '{ print $1 }' buoy_tmp2.loc` + Nb=`wc buoy_tmp2.loc | awk '{ print $1 }'` + rm -f buoy_tmp1.loc buoy_tmp2.loc + + if [ -s buoy_log.dat ] + then + set +x + echo 'Buoy log file created. Syncing to all nodes ...' + $FSYNC buoy_log.dat + [[ "$LOUD" = YES ]] && set -x + else + set +x + echo ' ' + echo '**************************************** ' + echo '*** ERROR : NO BUOY LOG FILE CREATED *** ' + echo '**************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : buoy log file missing."
>> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO BUOY LOG FILE GENERATED FOR SPEC AND BULLETIN FILES" + err=5;export err;${errchk} + specOK='no' + bullOK='no' + OspecOK='no' + ObullOK='no' + fi + +# Create buoy_log.ibp from buoy_log.ww3, keeping only the IBP points + ibspecOK='yes' + cat buoy.ibp | awk '{print $3}' | sed 's/'\''//g' > ibp_tags + grep -F -f ibp_tags buoy_log.ww3 > buoy_log.tmp + rm -f buoy_log.ibp + mv buoy_log.tmp buoy_log.ibp + + grep -F -f ibp_tags buoy_lst.loc > buoy_tmp1.loc + sed '$d' buoy_tmp1.loc > buoy_tmp2.loc + ibpoints=`awk '{ print $1 }' buoy_tmp2.loc` + Nibp=`wc buoy_tmp2.loc | awk '{ print $1 }'` + rm -f buoy_tmp1.loc buoy_tmp2.loc + + if [ -s buoy_log.ibp ] + then + set +x + echo 'IBP log file created. Syncing to all nodes ...' + $FSYNC buoy_log.ibp + [[ "$LOUD" = YES ]] && set -x + else + set +x + echo ' ' + echo '**************************************** ' + echo '*** ERROR : NO IBP LOG FILE CREATED *** ' + echo '**************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $date $cycle : ibp log file missing." >> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO IBP LOG FILE GENERATED FOR SPEC AND BULLETIN FILES" + err=6;export err;${errchk} + ibspecOK='no' + fi + + fi + +# 1.f Data summary + + set +x + echo ' ' + echo " Input files read and processed at : `date`" + echo ' ' + echo ' Data summary : ' + echo ' ---------------------------------------------' + echo " Sufficient data for GRID interpolation : $grintOK" + echo " Sufficient data for GRIB files : $gribOK" + echo " Sufficient data for spectral files : $specOK ($Nb points)" + echo " Sufficient data for bulletins : $bullOK ($Nb points)" + echo " Sufficient data for Input Boundary Points : $ibspecOK ($Nibp points)" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# --------------------------------------------------------------------------- # +# 2.
Make consolidated grib2 file for side-by-side grids and interpolate +# onto extended grids +# +# 2.a Command file set-up + + set +x + echo ' Making command file for sbs grib2 and GRID Interpolation ' + [[ "$LOUD" = YES ]] && set -x + + rm -f cmdfile + touch cmdfile + chmod 744 cmdfile + +# 1.a.2 Loop over forecast time to generate post files +# When executed side-by-side, serial mode (cfp when run after the fcst step) + fhr=$FHMIN_WAV + fhrp=$fhr + fhrg=$fhr + iwaitmax=120 # Maximum loop cycles for waiting until wave component output file is ready (fails after max) + while [ $fhr -le $FHMAX_WAV ]; do + + ymdh=`$NDATE $fhr $CDATE` + YMD=$(echo $ymdh | cut -c1-8) + HMS="$(echo $ymdh | cut -c9-10)0000" + YMDHMS=${YMD}${HMS} + FH3=$(printf %03i $fhr) + + fcmdnow=cmdfile.${FH3} + fcmdigrd=icmdfile.${FH3} + fcmdpnt=pcmdfile.${FH3} + fcmdibp=ibpcmdfile.${FH3} + rm -f ${fcmdnow} ${fcmdigrd} ${fcmdpnt} ${fcmdibp} + touch ${fcmdnow} ${fcmdigrd} ${fcmdpnt} ${fcmdibp} +# echo "mkdir output_$YMDHMS" >> ${fcmdnow} + mkdir output_$YMDHMS +# echo "cd output_$YMDHMS" >> ${fcmdnow} + cd output_$YMDHMS +# Create instances of directories for spec and gridded output + export SPECDATA=${DATA}/output_$YMDHMS + export BULLDATA=${DATA}/output_$YMDHMS + export GRIBDATA=${DATA}/output_$YMDHMS + export GRDIDATA=${DATA}/output_$YMDHMS + ln -fs $DATA/mod_def.${waveuoutpGRD} mod_def.ww3 + + if [ $fhr = $fhrp ] + then + iwait=0 + pfile=$COMIN/rundata/${WAV_MOD_TAG}.out_pnt.${waveuoutpGRD}.${YMD}.${HMS} + while [ ! -s ${pfile} ]; do sleep 10; ((iwait++)) && ((iwait==$iwaitmax)) && break ; echo $iwait; done + if [ $iwait -eq $iwaitmax ]; then + echo " FATAL ERROR : NO RAW POINT OUTPUT FILE out_pnt.$waveuoutpGRD " + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $waveuoutpGRD $date $cycle : point output missing."
>> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO RAW POINT OUTPUT FILE out_pnt.$waveuoutpGRD" + err=6; export err;${errchk} + exit $err + fi + ln -fs ${pfile} ./out_pnt.${waveuoutpGRD} + + if [ "$specOK" = 'yes' ] + then + export dtspec=3600. + for buoy in $buoys + do + echo "$USHwave/wave_outp_spec.sh $buoy $ymdh spec > spec_$buoy.out 2>&1" >> ${fcmdnow} + done + fi + + if [ "$ibspecOK" = 'yes' ] && [ "$DOIBP_WAV" = "YES" ] + then + export dtspec=3600. + for buoy in $ibpoints + do + echo "$USHwave/wave_outp_spec.sh $buoy $ymdh ibp > ibp_$buoy.out 2>&1" >> ${fcmdnow} + done + fi + + if [ "$bullOK" = 'yes' ] + then + export dtspec=3600. + for buoy in $buoys + do + echo "$USHwave/wave_outp_spec.sh $buoy $ymdh bull > bull_$buoy.out 2>&1" >> ${fcmdnow} + done + fi + + fi + + if [ $fhr = $fhrg ] + then + for wavGRD in ${waveGRD} ; do + gfile=$COMIN/rundata/${WAV_MOD_TAG}.out_grd.${wavGRD}.${YMD}.${HMS} + iwait=0 + while [ ! -s ${gfile} ]; do sleep 10; ((iwait++)) && ((iwait==$iwaitmax)) && break ; done + if [ $iwait -eq $iwaitmax ]; then + echo '*************************************************** ' + echo " FATAL ERROR : NO RAW FIELD OUTPUT FILE out_grd.$wavGRD " + echo '*************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG post $wavGRD $date $cycle : field output missing."
>> $wavelog + postmsg "$jlogfile" "FATAL ERROR : NO RAW FIELD OUTPUT FILE out_grd.$wavGRD" + fieldOK='no' + err=7; export err;${errchk} + exit $err + fi + ln -s ${gfile} ./out_grd.${wavGRD} + done + + if [ "$grintOK" = 'yes' ] + then + nigrd=1 + for grdID in $waveinterpGRD + do + case $grdID in + glo_15mxt) ymdh_int=`$NDATE -${WAVHINDH} $ymdh`; dt_int=3600.; n_int=9999 ;; + glo_30mxt) ymdh_int=`$NDATE -${WAVHINDH} $ymdh`; dt_int=3600.; n_int=9999 ;; + esac + echo "$USHwave/wave_grid_interp_sbs.sh $grdID $ymdh_int $dt_int $n_int > grint_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} + if [ "$gribOK" = 'yes' ] + then + gribFL=\'`echo ${OUTPARS_WAV}`\' + case $grdID in + glo_15mxt) GRDNAME='global' ; GRDRES=0p25 ; GRIDNR=255 ; MODNR=11 ;; + glo_30mxt) GRDNAME='global' ; GRDRES=0p50 ; GRIDNR=255 ; MODNR=11 ;; + esac + echo "$USHwave/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME $GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdigrd}.${nigrd} + fi + echo "${fcmdigrd}.${nigrd}" >> ${fcmdnow} + chmod 744 ${fcmdigrd}.${nigrd} + nigrd=$((nigrd+1)) + done + fi + + if [ "$gribOK" = 'yes' ] + then + for grdID in ${wavepostGRD} # First concatenate grib files for sbs grids + do + gribFL=\'`echo ${OUTPARS_WAV}`\' + case $grdID in + aoc_9km) GRDNAME='arctic' ; GRDRES=9km ; GRIDNR=255 ; MODNR=11 ;; + ant_9km) GRDNAME='antarc' ; GRDRES=9km ; GRIDNR=255 ; MODNR=11 ;; + glo_10m) GRDNAME='global' ; GRDRES=0p16 ; GRIDNR=255 ; MODNR=11 ;; + glo_15m) GRDNAME='global' ; GRDRES=0p25 ; GRIDNR=255 ; MODNR=11 ;; + ao_20m) GRDNAME='arctic' ; GRDRES=0p33 ; GRIDNR=255 ; MODNR=11 ;; + so_20m) GRDNAME='antarc' ; GRDRES=0p33 ; GRIDNR=255 ; MODNR=11 ;; + glo_15mxt) GRDNAME='global' ; GRDRES=0p25 ; GRIDNR=255 ; MODNR=11 ;; + esac + echo "$USHwave/wave_grib2_sbs.sh $grdID $GRIDNR $MODNR $ymdh $fhr $GRDNAME $GRDRES $gribFL > grib_$grdID.out 2>&1" >> ${fcmdnow} + done + fi + + fi + + wavenproc=`wc -l ${fcmdnow} | awk '{print $1}'` + wavenproc=`echo
$((${wavenproc}<${NTASKS}?${wavenproc}:${NTASKS}))` + + set +x + echo ' ' + echo " Executing the post command file at : `date`" + echo ' ------------------------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ "$wavenproc" -gt '1' ] + then + ${wavempexec} ${wavenproc} ${wave_mpmd} ${fcmdnow} + exit=$? + else + chmod 744 ${fcmdnow} + ./${fcmdnow} + exit=$? + fi + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** CMDFILE FAILED ***' + echo '********************************************' + echo ' See Details Below ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + err=8; export err;${errchk} + exit $err + fi + + rm -f out_grd.* # Remove large binary grid output files + + cd $DATA + + FHINCP=$(( DTPNT_WAV / 3600 )) + FHINCG=$(( DTFLD_WAV / 3600 )) + if [ $fhr = $fhrg ] + then + if [ $FHMAX_HF_WAV -gt 0 ] && [ $FHOUT_HF_WAV -gt 0 ] && [ $fhr -lt $FHMAX_HF_WAV ]; then + FHINCG=$FHOUT_HF_WAV + else + FHINCG=$FHOUT_WAV + fi + fhrg=$((fhr+FHINCG)) + fi + if [ $fhr = $fhrp ] + then + fhrp=$((fhr+FHINCP)) + fi + echo $fhrg $fhrp + fhr=$([ $fhrg -le $fhrp ] && echo "$fhrg" || echo "$fhrp") # next fhr is the lesser of the grid and point strides + done + +# --------------------------------------------------------------------------- # +# 3. Compress point output data into tar files + +# 3.a Set up cmdfile + + rm -f cmdtarfile + touch cmdtarfile + chmod 744 cmdtarfile + + set +x + echo ' ' + echo ' Making command file for tarring all point output files.'
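The task-count logic above recurs throughout these scripts: build a cmdfile with one command per line, then run it under mpmd with at most min(number of commands, NTASKS) workers. A minimal standalone sketch of that sizing (the NTASKS value and the cmdfile contents here are illustrative, not the job-card settings):

```shell
# Hedged sketch of the cmdfile/mpmd sizing pattern used above: the worker
# count is capped at min(number of command lines, NTASKS).
NTASKS=4                           # stand-in for the job-card task count
tmp=$(mktemp -d); cd "$tmp"
rm -f cmdfile; touch cmdfile; chmod 744 cmdfile
for i in 1 2 3 4 5 6; do
  echo "echo task $i" >> cmdfile   # one command per line, as in the script
done
wavenproc=$(wc -l < cmdfile)
wavenproc=$(( wavenproc < NTASKS ? wavenproc : NTASKS ))
echo "$wavenproc"                  # 4
```

With six commands and four tasks, the serial/parallel branch above would take the mpmd path since `wavenproc` exceeds 1.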
+ + [[ "$LOUD" = YES ]] && set -x + +# 3.b Spectral data files + + if [ "$specOK" = 'yes' ] + then + echo "$USHwave/wave_tar.sh $WAV_MOD_TAG ibp $Nibp > ${WAV_MOD_TAG}_ibp_tar.out 2>&1 " >> cmdtarfile + echo "$USHwave/wave_tar.sh $WAV_MOD_TAG spec $Nb > ${WAV_MOD_TAG}_spec_tar.out 2>&1 " >> cmdtarfile + echo "$USHwave/wave_tar.sh $WAV_MOD_TAG bull $Nb > ${WAV_MOD_TAG}_bull_tar.out 2>&1 " >> cmdtarfile + echo "$USHwave/wave_tar.sh $WAV_MOD_TAG cbull $Nb > ${WAV_MOD_TAG}_cbull_tar.out 2>&1 " >> cmdtarfile + fi + + wavenproc=`wc -l cmdtarfile | awk '{print $1}'` + wavenproc=`echo $((${wavenproc}<${NTASKS}?${wavenproc}:${NTASKS}))` + + set +x + echo ' ' + echo " Executing the tar command file at : `date`" + echo ' ------------------------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ "$wavenproc" -gt '1' ] + then + ${wavempexec} ${wavenproc} ${wave_mpmd} cmdtarfile + exit=$? + else + chmod 744 cmdtarfile + ./cmdtarfile + exit=$? + fi + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** CMDFILE FAILED ***' + echo '********************************************' + echo ' See Details Below ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + err=8; export err;${errchk} + exit $err + fi + +# --------------------------------------------------------------------------- # +# 4.
Ending output + + set +x + echo ' ' + echo "Ending at : `date`" + echo '-----------' + echo ' ' + echo ' *** End of MWW3 postprocessor ***' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ "$exit_code" -ne '0' ] + then + msg="ABNORMAL EXIT: Problem in MWW3 POST" + postmsg "$jlogfile" "$msg" + echo $msg + err=16; export err;${errchk} + exit $err + else + echo " Side-by-Side Wave Post Completed Normally " + msg="$job completed normally" + postmsg "$jlogfile" "$msg" + exit 0 + fi + +# End of MWW3 postprocessor script ----------------------------------------- # diff --git a/scripts/exwave_prep.sh b/scripts/exwave_prep.sh new file mode 100755 index 0000000000..1cd88fd41a --- /dev/null +++ b/scripts/exwave_prep.sh @@ -0,0 +1,1007 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: exwave_prep.sh +# Script description: Prepares input data for the wave component in GFS +# +# Author: Hendrik Tolman Org: NCEP/EMC Date: 2007-03-01 +# Abstract: This is the preprocessor for the wave component in GFS. +# It executes several scripts for preparing and creating input data +# as follows: +# +# wave_prnc_ice.sh : preprocess ice fields. # +# wave_prnc_wnd.sh : preprocess wind fields (uncoupled run, not active) # +# wave_prnc_cur.sh : preprocess current fields. # +# wave_g2ges.sh : find and copy wind grib2 files. # +# # +# Remarks : # +# - For non-fatal errors output is written to the wave.log file. # +# # +# Update record : # +# # +# - Origination: 01-Mar-2007 # +# # +# Update log # +# Mar2007 HTolman - Added NCO note on resources on mist/dew # +# Apr2007 HTolman - Renaming mod_def files in $FIX_wave. # +# Mar2011 AChawla - Migrating to a vertical structure # +# Nov2012 JHAlves - Transitioning to WCOSS # +# Apr2019 JHAlves - Transitioning to GEFS workflow # +# Nov2019 JHAlves - Merging wave scripts to global workflow # +# # +# WAV_MOD_ID and WAV_MOD_TAG replace modID.
WAV_MOD_TAG # +# is used for ensemble-specific I/O. For deterministic # +# WAV_MOD_ID=WAV_MOD_TAG # +# # +############################################################################### +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + set -x + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $DATA + mkdir outtmp + + msg="HAS BEGUN on `hostname`" + postmsg "$jlogfile" "$msg" + msg="Starting MWW3 PREPROCESSOR SCRIPT for $WAV_MOD_TAG" + postmsg "$jlogfile" "$msg" + + set +x + echo ' ' + echo ' ********************************' + echo ' *** MWW3 PREPROCESSOR SCRIPT ***' + echo ' ********************************' + echo ' PREP for wave component of NCEP coupled system' + echo " Wave component identifier : $WAV_MOD_TAG " + echo ' ' + echo "Starting at : `date`" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# export MP_PGMMODEL=mpmd +# export MP_CMDFILE=./cmdfile + + if [ "$INDRUN" = 'no' ] + then + FHMAX_WAV=${FHMAX_WAV:-3} + else + FHMAX_WAV=${FHMAX_WAV:-384} + fi + +# 0.b Date and time stuff + +# Beginning time for output may differ from SDATE if DOIAU=YES + export date=$PDY + export YMDH=${PDY}${cyc} +# Roll back $IAU_FHROT hours if DOIAU=YES + IAU_FHROT=3 + if [ "$DOIAU" = "YES" ] + then + WAVHINDH=$(( WAVHINDH + IAU_FHROT )) + fi +# Set time stamps for model start and output +# For special case when IAU is on but this is an initial half cycle + if [ $IAU_OFFSET = 0 ]; then + ymdh_beg=$YMDH + else + ymdh_beg=`$NDATE -$WAVHINDH $YMDH` + fi + time_beg="`echo $ymdh_beg | cut -c1-8` `echo $ymdh_beg | cut -c9-10`0000" + ymdh_end=`$NDATE $FHMAX_WAV $YMDH` + time_end="`echo $ymdh_end | cut -c1-8` `echo $ymdh_end | cut -c9-10`0000" + ymdh_beg_out=$YMDH + time_beg_out="`echo $ymdh_beg_out | cut -c1-8` `echo $ymdh_beg_out | cut -c9-10`0000" + +# Restart file times (already
has IAU_FHROT in WAVHINDH) + RSTOFFSET=$(( ${WAVHCYC} - ${WAVHINDH} )) +# Update restart time is added offset relative to model start + RSTOFFSET=$(( ${RSTOFFSET} + ${RSTIOFF_WAV} )) + ymdh_rst_ini=`$NDATE ${RSTOFFSET} $YMDH` + RST2OFFSET=$(( DT_2_RST_WAV / 3600 )) + ymdh_rst2_ini=`$NDATE ${RST2OFFSET} $ymdh_rst_ini` # DT2 relative to first-cycle restart file +# First restart file for cycling + time_rst_ini="`echo $ymdh_rst_ini | cut -c1-8` `echo $ymdh_rst_ini | cut -c9-10`0000" + if [ ${DT_1_RST_WAV} = 1 ]; then + time_rst1_end=${time_rst_ini} + else + RST1OFFSET=$(( DT_1_RST_WAV / 3600 )) + ymdh_rst1_end=`$NDATE $RST1OFFSET $ymdh_rst_ini` + time_rst1_end="`echo $ymdh_rst1_end | cut -c1-8` `echo $ymdh_rst1_end | cut -c9-10`0000" + fi +# Second restart file for checkpointing + time_rst2_ini="`echo $ymdh_rst2_ini | cut -c1-8` `echo $ymdh_rst2_ini | cut -c9-10`0000" + time_rst2_end=$time_end +# Condition for gdas run or any other run when checkpoint stamp is > ymdh_end + if [ $ymdh_rst2_ini -ge $ymdh_end ]; then + ymdh_rst2_ini=`$NDATE 3 $ymdh_end` + time_rst2_ini="`echo $ymdh_rst2_ini | cut -c1-8` `echo $ymdh_rst2_ini | cut -c9-10`0000" + time_rst2_end=$time_rst2_ini + fi + + set +x + echo ' ' + echo 'Times in wave model format :' + echo '----------------------------' + echo " date / cycle : $date $cycle" + echo " starting time : $time_beg" + echo " ending time : $time_end" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# Script will run only if NTASKS is pre-defined +# The actual work is distributed over these tasks. + if [ -z "${NTASKS}" ] + then + echo "FATAL ERROR: Requires NTASKS to be set " + err=1; export err;${errchk} + fi + +# --------------------------------------------------------------------------- # +# 1.
Get files that are used by most child scripts + + set +x + echo 'Preparing input files :' + echo '-----------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 1.a Model definition files + + rm -f cmdfile + touch cmdfile + + grdINP='' + if [ "${WW3ATMINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEWND_FID" ; fi + if [ "${WW3ICEINP}" = 'YES' ]; then grdINP="${grdINP} $WAVEICE_FID" ; fi + if [ "${WW3CURINP}" = 'YES' ]; then grdINP="${grdINP} $WAVECUR_FID" ; fi + + ifile=1 + + for grdID in $grdINP $waveGRD + do + if [ -f "$COMIN/rundata/${COMPONENTwave}.mod_def.${grdID}" ] + then + set +x + echo " Mod def file for $grdID found in ${COMIN}/rundata. copying ...." + [[ "$LOUD" = YES ]] && set -x + cp $COMIN/rundata/${COMPONENTwave}.mod_def.${grdID} mod_def.$grdID + + else + msg="FATAL ERROR: NO MODEL DEFINITION FILE" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '*********************************************************** ' + echo '*** FATAL ERROR : WAVE MODEL DEFINITION FILE NOT FOUND *** ' + echo '*********************************************************** ' + echo " grdID = $grdID" + echo ' ' + echo $msg + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : ${COMPONENTwave}.mod_def.${grdID} missing." >> $wavelog + err=2;export err;${errchk} + fi + done + +# 1.b Netcdf Preprocessor template files + + for grdID in $grdINP + do + + case $grdID in + $WAVECUR_FID ) + type='cur' + ;; + $WAVEWND_FID ) + type='wind' + ;; + $WAVEICE_FID ) + type='ice' + ;; + * ) + echo 'Input type not yet implemented' + err=3; export err;${errchk} + ;; + esac + + if [ -f $FIXwave/ww3_prnc.${type}.$grdID.inp.tmpl ] + then + cp $FIXwave/ww3_prnc.${type}.$grdID.inp.tmpl . + fi + + if [ -f ww3_prnc.${type}.$grdID.inp.tmpl ] + then + set +x + echo ' ' + echo " ww3_prnc.${type}.$grdID.inp.tmpl copied ($FIXwave)."
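The `time_beg`/`time_end` stamps assembled in section 0.b use the same cut-based conversion everywhere: a 10-digit `YYYYMMDDHH` value becomes the WW3 `YYYYMMDD HHMMSS` form. A standalone sketch of that conversion (the sample date is illustrative, not from a real cycle):

```shell
# Hedged sketch of the YMDH -> WW3 time stamp conversion used in 0.b:
# the first 8 characters give the date, characters 9-10 the hour.
ymdh=2020032012   # illustrative YYYYMMDDHH value
time_beg="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000"
echo "$time_beg"  # 20200320 120000
```

The same two `cut` calls build `time_end`, `time_rst_ini`, and the checkpoint stamps, so a change to the stamp format touches every one of these lines.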
+ echo ' ' + [[ "$LOUD" = YES ]] && set -x + else + msg="ABNORMAL EXIT: NO FILE ww3_prnc.${type}.$grdID.inp.tmpl" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '************************************** ' + echo '*** FATAL ERROR : NO TEMPLATE FILE *** ' + echo '************************************** ' + echo " ww3_prnc.${type}.$grdID.inp.tmpl" + echo ' ' + echo $msg + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : ww3_prnc.${type}.$grdID.inp.tmpl missing." >> $wavelog + err=4;export err;${errchk} + fi + done + +# --------------------------------------------------------------------------- # +# ICEC processing + + if [ "${WW3ICEINP}" = 'YES' ]; then + +# --------------------------------------------------------------------------- # +# 2. Ice pre-processing + +# 2.a Check if ice input is perturbed (number of inputs equal to number of wave +# ensemble members) + if [ "${RUNMEM}" = "-1" ] || [ "${WW3ICEIENS}" = "T" ] || [ "$waveMEMB" = "00" ] + then + + $USHwave/wave_prnc_ice.sh > wave_prnc_ice.out + ERR=$? + + if [ -d ice ] + then + postmsg "$jlogfile" "FATAL ERROR ice field not generated." + set +x + echo ' ' + echo ' FATAL ERROR: ice field not generated ' + echo ' ' + sed "s/^/ice.out : /g" ice.out + echo ' ' + [[ "$LOUD" = YES ]] && set -x + err=5;export err;${errchk} + else + mv -f ice.out $DATA/outtmp + rm -f ww3_prep.$WAVEICE_FID.tmpl mod_def.$WAVEICE_FID + set +x + echo ' ' + echo ' Ice field unpacking successful.'
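The `[ -d ice ]` test above relies on a convention used throughout this script family: a failing child leaves behind a directory named after its data type (the wind section checks `[ -d grb_${ymdh} ]` the same way), and the parent tests for that directory instead of an exit code. A toy illustration of the convention (`fake_prnc` is a hypothetical stand-in for wave_prnc_ice.sh, not a real script):

```shell
# Hedged sketch of the error-directory convention: on failure the child
# creates a directory named after its data type; the parent tests for it.
tmp=$(mktemp -d); cd "$tmp"
fake_prnc() { mkdir ice; return 1; }   # hypothetical failing preprocessor
fake_prnc || true
if [ -d ice ]; then status=failed; else status=ok; fi
echo "$status"  # failed
```

This is why the captured `ERR=$?` above can go unused: the directory check, not the return code, drives the error branch.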
+ echo ' ' + [[ "$LOUD" = YES ]] && set -x + fi + else + echo ' ' + echo " Ice input is not perturbed, single ice file generated, skipping ${WAV_MOD_TAG}" + echo ' ' + fi + else + echo ' ' + echo ' No input ice file generated, this run did not request pre-processed ice data ' + echo ' ' + fi + +# --------------------------------------------------------------------------- # +# WIND processing (not functional, TBD for uncoupled cases) + + if [ "${WW3ATMINP}" = 'YES' ]; then + +# --------------------------------------------------------------------------- # +# 3. Wind pre-processing + + if [ "${RUNMEM}" = "-1" ] || [ "${WW3ATMIENS}" = "T" ] || [ "$waveMEMB" = "00" ] + then + + rm -f cmdfile + touch cmdfile + chmod 744 cmdfile + +# 3.a Gather and pre-process grib2 files + ymdh=$ymdh_beg + + while [ "$ymdh" -le "$ymdh_end" ] + do + echo "$USHwave/wave_g2ges.sh $ymdh > grb_$ymdh.out 2>&1" >> cmdfile + ymdh=`$NDATE $WAV_WND_HOUR_INC $ymdh` + done + +# 3.b Execute the serial or parallel cmdfile + +# Set number of processes for mpmd + cat cmdfile + + wavenproc=`wc -l cmdfile | awk '{print $1}'` + wavenproc=`echo $((${wavenproc}<${NTASKS}?${wavenproc}:${NTASKS}))` + + set +x + echo ' ' + echo " Executing the copy command file at : `date`" + echo ' ------------------------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ "$wavenproc" -gt '1' ] + then + ${wavempexec} ${wavenproc} ${wave_mpmd} cmdfile + exit=$? + else + ./cmdfile + exit=$? + fi + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** CMDFILE FAILED IN WIND GENERATION ***' + echo '********************************************' + echo ' See Details Below ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + fi + +# 3.c Check for errors + + set +x + echo ' ' + echo ' Checking for errors.' 
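The gather loop in 3.a steps `ymdh` forward with `$NDATE`, the NCEP production date utility. A self-contained sketch of the same loop, emulating NDATE with GNU date (the `ndate` helper and the sample window are assumptions of this sketch, not the production environment):

```shell
# Hedged sketch of the $NDATE-driven hour loop in 3.a. 'ndate' emulates
# the NCEP NDATE utility (add N hours to a YYYYMMDDHH stamp) via GNU date.
ndate() {
  local h=$1 d=$2
  date -u -d "${d:0:4}-${d:4:2}-${d:6:2} ${d:8:2}:00 ${h} hour" +%Y%m%d%H
}
ymdh=2020032000; ymdh_end=2020032006; inc=3   # illustrative window
count=0
while [ "$ymdh" -le "$ymdh_end" ]; do
  count=$((count + 1))          # one wave_g2ges.sh call per time level
  ymdh=$(ndate $inc $ymdh)
done
echo "$count"  # 3
```

Because the 10-digit stamps sort numerically, the plain `-le` comparison above is safe without any date parsing in the loop condition.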
+ echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# We will go on if the number of errors in files is less +# than err_max + + err_max=1 + + + ymdh=$ymdh_beg + nr_err=0 + + set +x + echo ' Sources of grib2 files :' + [[ "$LOUD" = YES ]] && set -x + while [ "$ymdh" -le "$ymdh_end" ] + do + if [ -d grb_${ymdh} ] + then + set +x + echo ' ' + echo " File for $ymdh : error in wave_g2ges.sh" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" " File for $ymdh : error in wave_g2ges.sh" + nr_err=`expr $nr_err + 1` + rm -f gwnd.$ymdh + else + grbfile=`grep 'File for' grb_${ymdh}.out` + if [ -z "$grbfile" ] + then + set +x + echo ' ' + echo " File for $ymdh : cannot identify source" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + nr_err=`expr $nr_err + 1` + rm -f gwnd.$ymdh + else + if [ ! -f gwnd.$ymdh ] + then + set +x + echo ' ' + echo " File for $ymdh : file not found" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + nr_err=`expr $nr_err + 1` + else + set +x + echo ' ' + echo " $grbfile" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + mv -f grb_${ymdh}.out $DATA/outtmp + fi + fi + fi + ymdh=`$NDATE $WAV_WND_HOUR_INC $ymdh` + done + + if ls grb_*.out > /dev/null 2>&1 + then + set +x + echo ' ' + echo '**********************************' + echo '*** ERROR OUTPUT wave_g2ges.sh ***' + echo '**********************************' + echo ' Possibly in multiple calls' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : error in wind grib2 files." >> $wavelog + set +x + for file in grb_*.out + do + echo ' ' + sed "s/^/$file : /g" $file + done + echo ' ' + [[ "$LOUD" = YES ]] && set -x + mv -f grb_*.out $DATA/outtmp + postmsg "$jlogfile" "NON-FATAL ERROR in wave_g2ges.sh, possibly in multiple calls."
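The leftover-log check above tests whether any `grb_*.out` files remain; a glob inside `[ -f ... ]` only works when it matches exactly one file and raises a "too many arguments" error otherwise. A standalone sketch of a glob-safe existence test (the file names are illustrative):

```shell
# Hedged sketch: test whether ANY file matches a glob. [ -f grb_*.out ]
# breaks with multiple matches, so use ls and its exit status instead.
tmp=$(mktemp -d); cd "$tmp"
touch grb_2020032000.out grb_2020032003.out   # illustrative leftovers
if ls grb_*.out > /dev/null 2>&1; then found=yes; else found=no; fi
echo "$found"  # yes
```

With no matching files, `ls` exits nonzero (the unexpanded glob names a nonexistent file), so the branch correctly reports nothing to collect.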
+ fi + + if [ "$nr_err" -gt "$err_max" ] + then + msg="ABNORMAL EXIT: TOO MANY MISSING WIND INPUT GRB2 FILES" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR(S) IN WIND FILES *** ' + echo '********************************************* ' + echo ' ' + echo $msg + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : fatal error in grib2 wind files." >> $wavelog + err=6;export err;${errchk} + fi + + rm -f cmdfile + +# 3.d Get wind data into single file + + set +x + echo ' ' + echo ' Concatenate extracted wind fields ...' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + files=`ls gwnd.* 2> /dev/null` + + if [ -z "$files" ] + then + msg="ABNORMAL EXIT: NO gwnd.* FILES FOUND" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : CANNOT FIND WIND FILES *** ' + echo '******************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : no wind files found." >> $wavelog + err=7;export err;${errchk} + fi + + rm -f gfs.wind + + for file in $files + do + cat $file >> gfs.wind + rm -f $file + done + +# 3.e Run ww3_prnc + +# Convert gfs wind to netcdf + $WGRIB2 gfs.wind -netcdf gfs.nc + + for grdID in $WAVEWND_FID $curvID + do + + set +x + echo ' ' + echo " Running wind fields through preprocessor for grid $grdID" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + sed -e "s/HDRFL/T/g" ww3_prnc.wind.$grdID.tmpl > ww3_prnc.inp + ln -sf mod_def.$grdID mod_def.ww3 + + set +x + echo "Executing $EXECcode/ww3_prnc" + [[ "$LOUD" = YES ]] && set -x + + $EXECcode/ww3_prnc > prnc.out + err=$?
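Section 3.d above guards the gather step with an existence check, then appends each hourly `gwnd.*` segment into a single `gfs.wind` and deletes the pieces. A self-contained sketch of that pattern (file names and contents are stand-ins):

```shell
#!/bin/bash
cd "$(mktemp -d)"

# Stand-ins for the per-hour extracted wind segments.
printf 'segment-A\n' > gwnd.2020032000
printf 'segment-B\n' > gwnd.2020032003

files=$(ls gwnd.* 2> /dev/null)
if [ -z "$files" ]; then
  echo "FATAL: no gwnd.* files found" >&2
  exit 7
fi

rm -f gfs.wind
for file in $files; do
  cat "$file" >> gfs.wind   # append in lexical (time) order
  rm -f "$file"             # segments are no longer needed
done

wc -l gfs.wind
```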
+ + if [ "$err" != '0' ] + then + msg="ABNORMAL EXIT: ERROR IN waveprnc" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '*************************************** ' + echo '*** FATAL ERROR : ERROR IN waveprnc *** ' + echo '*************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $grdID $date $cycle : error in waveprnc." >> $wavelog + err=8;export err;${errchk} + fi + + if [ ! -f wind.ww3 ] + then + msg="ABNORMAL EXIT: FILE wind.ww3 MISSING" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + cat waveprep.out + echo ' ' + echo '****************************************' + echo '*** FATAL ERROR : wind.ww3 NOT FOUND ***' + echo '****************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $grdID $date $cycle : wind.ww3 missing." >> $wavelog + err=9;export err;${errchk} + fi + + rm -f mod_def.ww3 + rm -f ww3_prep.inp + + mv wind.ww3 wind.$grdID + mv times.WND times.$grdID + +# 3.f Check to make sure wind files are properly incremented + + first_pass='yes' + windOK='yes' + while read line + do + date1=`echo $line | cut -d ' ' -f 1` + date2=`echo $line | cut -d ' ' -f 2` + ymdh="$date1`echo $date2 | cut -c1-2`" + if [ "$first_pass" = 'no' ] + then + hr_inc=`$NHOUR $ymdh $ymdh_prev` + if [ "${hr_inc}" -gt "${WAV_WND_HOUR_INC}" ] + then + set +x + echo "Incorrect wind forcing increment at $ymdh" + [[ "$LOUD" = YES ]] && set -x + windOK='no' + fi + fi + ymdh_prev=$ymdh + first_pass='no' + done < times.$grdID + + if [ "$windOK" = 'no' ] + then + set +x + echo ' ' + echo '************************************************' + echo '*** ERROR : WIND DATA INCREMENT INCORRECT !! ***' + echo '************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $grdID $date $cycle : error in wind increment." 
>> $wavelog + err=10;export err;${errchk} + fi + + done + + rm -f gfs.wind + rm -f mod_def.ww3 + rm -f ww3_prnc.inp + else + echo ' ' + echo " Wind input is not perturbed, single wnd file generated, skipping ${WAV_MOD_TAG}" + echo ' ' + + fi + + else + + echo ' ' + echo ' Atmospheric inputs not generated, this run did not request pre-processed winds ' + echo ' ' + + fi + +#------------------------------------------------------------------- +# CURR processing (not functional, TBD for uncoupled and GFSv16 cases) + + if [ "${WW3CURINP}" = 'YES' ]; then + +#------------------------------------------------------------------- +# 4. Process current fields +# 4.a Get into single file + if [ "${RUNMEM}" = "-1" ] || [ "${WW3CURIENS}" = "T" ] || [ "$waveMEMB" = "00" ] + then + + set +x + echo ' ' + echo ' Concatenate binary current fields ...' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# Prepare files for cfp process + rm -f cmdfile + touch cmdfile + chmod 744 cmdfile + + ymdh_rtofs=${PDY}00 # RTOFS runs once daily + ymdh_end=`$NDATE ${FHMAX_WAV_CUR} ${ymdh_rtofs}` + NDATE_DT=${WAV_CUR_HF_DT} + FLGHF='T' + + while [ "$ymdh_rtofs" -le "$ymdh_end" ] + do +# Timing has to be made relative to the single 00z RTOFS cycle for that PDY + fhr_rtofs=`${NHOUR} ${ymdh_rtofs} ${PDY}00` + fext='f' + + if [ ${fhr_rtofs} -lt 0 ] + then +# Data from nowcast phase + fhr_rtofs=`expr 48 + ${fhr_rtofs}` + fext='n' + fi + + fhr_rtofs=`printf "%03d\n" ${fhr_rtofs}` + + curfile1h=${COMIN_WAV_CUR}/rtofs_glo_2ds_${fext}${fhr_rtofs}_1hrly_prog.nc + curfile3h=${COMIN_WAV_CUR}/rtofs_glo_2ds_${fext}${fhr_rtofs}_3hrly_prog.nc + + if [ -s ${curfile1h} ] && [ "${FLGHF}" = "T" ] ; then + curfile=${curfile1h} + elif [ -s ${curfile3h} ]; then + curfile=${curfile3h} + FLGHF='F' + else + echo ' ' + set $setoff + echo ' ' + echo '************************************** ' + echo "*** FATAL ERROR: NO CUR FILE $curfile *** " + echo '************************************** ' + echo ' ' + set $seton + postmsg "$jlogfile" "FATAL ERROR - NO CURRENT FILE (RTOFS)" + err=11;export err;${errchk} + exit 0 + echo ' ' + fi + + echo "$USHwave/wave_prnc_cur.sh $ymdh_rtofs $curfile > cur_$ymdh_rtofs.out 2>&1" >> cmdfile + if [ $fhr_rtofs -ge ${WAV_CUR_HF_FH} ] ; then + NDATE_DT=${WAV_CUR_DT} + fi + ymdh_rtofs=`$NDATE $NDATE_DT $ymdh_rtofs` + done + +# Set number of processes for mpmd + wavenproc=`wc -l cmdfile | awk '{print $1}'` + wavenproc=`echo $((${wavenproc}<${NTASKS}?${wavenproc}:${NTASKS}))` + + set +x + echo ' ' + echo " Executing the copy command file at : `date`" + echo ' ------------------------------------' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + if [ $wavenproc -gt '1' ] + then + ${wavempexec} ${wavenproc} ${wave_mpmd} cmdfile + exit=$? + else + chmod 744 ./cmdfile + ./cmdfile + exit=$? + fi + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** CMDFILE FAILED IN CUR GENERATION ***' + echo '********************************************' + echo ' See Details Below ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + fi + + files=`ls ${WAVECUR_FID}.* 2> /dev/null` + + if [ -z "$files" ] + then + msg="ABNORMAL EXIT: NO ${WAVECUR_FID}.* FILES FOUND" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : CANNOT FIND CURR FILES *** ' + echo '******************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + echo "$WAV_MOD_TAG prep $date $cycle : no current files found."
>> $wavelog + err=11;export err;${errchk} + fi + + rm -f cur.${WAVECUR_FID} + + for file in $files + do + cat $file >> cur.${WAVECUR_FID} + rm -f $file + done + + cp -f cur.${WAVECUR_FID} ${COMOUT}/rundata/${COMPONENTwave}.${WAVECUR_FID}.$cycle.cur + + else + echo ' ' + echo " Current input is not perturbed, single cur file generated, skipping ${WAV_MOD_TAG}" + echo ' ' + fi + + else + + echo ' ' + echo ' Current inputs not generated, this run did not request pre-processed currents ' + echo ' ' + + fi + +# --------------------------------------------------------------------------- # +# 5. Create ww3_multi.inp + +# 5.a ww3_multi template + + if [ -f $FIXwave/ww3_multi.${NET}.inp.tmpl ] + then + cp $FIXwave/ww3_multi.${NET}.inp.tmpl ww3_multi.inp.tmpl + fi + + if [ ! -f ww3_multi.inp.tmpl ] + then + msg="ABNORMAL EXIT: NO TEMPLATE FOR INPUT FILE" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '************************************************ ' + echo '*** FATAL ERROR : NO TEMPLATE FOR INPUT FILE *** ' + echo '************************************************ ' + echo ' ' + echo "${WAV_MOD_TAG} fcst $date $cycle : ww3_multi file missing." >> $wavelog + echo $msg + [[ "$LOUD" = YES ]] && set -x + err=12;export err;${errchk} + fi + +# 5.b Buoy location file + + if [ -f $FIXwave/wave_${NET}.buoys ] + then + cp $FIXwave/wave_${NET}.buoys buoy.loc + fi + + if [ -f buoy.loc ] + then + set +x + echo " buoy.loc copied ($FIXwave/wave_${NET}.buoys)." + [[ "$LOUD" = YES ]] && set -x + else + set +x + echo " buoy.loc not found. **** WARNING **** " + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" " FATAL ERROR : buoy.loc ($FIXwave/wave_${NET}.buoys) NOT FOUND" + touch buoy.loc + echo "$WAV_MOD_TAG fcst $date $cycle : no buoy locations file ($FIXwave/wave_${NET}.buoys)." 
>> $wavelog + err=13;export err;${errchk} + fi + +# Initialize inp file parameters + NFGRIDS=0 + NMGRIDS=0 + CPLILINE='$' + ICELINE='$' + ICEFLAG='no' + CURRLINE='$' + CURRFLAG='no' + WINDLINE='$' + WINDFLAG='no' + UNIPOINTS='$' + +# Check for required inputs and coupling options + if [ $waveuoutpGRD ] + then + UNIPOINTS="'$waveuoutpGRD'" + fi + +# Check if waveesmfGRD is set + if [ ${waveesmfGRD} ] + then + NFGRIDS=`expr $NFGRIDS + 1` + fi + + case ${WW3ATMINP} in + 'YES' ) + NFGRIDS=`expr $NFGRIDS + 1` + WINDLINE=" '$WAVEWND_FID' F F T F F F F" + WINDFLAG="$WAVEWND_FID" + ;; + 'CPL' ) + WINDFLAG="CPL:${waveesmfGRD}" + WNDIFLAG='T' + CPLILINE=" '${waveesmfGRD}' F F T F F F F" + ;; + esac + + case ${WW3ICEINP} in + 'YES' ) + NFGRIDS=`expr $NFGRIDS + 1` + ICELINE=" '$WAVEICE_FID' F F F T F F F" + ICEFLAG="$WAVEICE_FID" + ;; + 'CPL' ) + ICEFLAG="CPL:${waveesmfGRD}" + ICEIFLAG='T' + CPLILINE=" '${waveesmfGRD}' F F ${WNDIFLAG} T F F F" + ;; + esac + + case ${WW3CURINP} in + 'YES' ) + NFGRIDS=`expr $NFGRIDS + 1` + CURRLINE=" '$WAVECUR_FID' F T F F F F F" + CURRFLAG="$WAVECUR_FID" + ;; + 'CPL' ) + CURRFLAG="CPL:${waveesmfGRD}" + CURIFLAG='T' + CPLILINE=" '${waveesmfGRD}' F T ${WNDIFLAG} ${ICEIFLAG} F F F" + ;; + esac + + unset agrid + agrid= + gline= + GRDN=0 +# grdGRP=1 # Single group for now + for grid in ${waveGRD} + do + GRDN=`expr ${GRDN} + 1` + agrid=( ${agrid[*]} ${grid} ) + NMGRIDS=`expr $NMGRIDS + 1` + gridN=`echo $waveGRDN | awk -v i=$GRDN '{print $i}'` + gridG=`echo $waveGRDG | awk -v i=$GRDN '{print $i}'` + gline="${gline}'${grid}' 'no' 'CURRFLAG' 'WINDFLAG' 'ICEFLAG' 'no' 'no' 'no' ${gridN} ${gridG} 0.00 1.00 F\n" + done + gline="${gline}\$" + echo $gline + + sed -e "s/NFGRIDS/$NFGRIDS/g" \ + -e "s/NMGRIDS/${NMGRIDS}/g" \ + -e "s/FUNIPNT/${FUNIPNT}/g" \ + -e "s/PNTSRV/${PNTSRV}/g" \ + -e "s/FPNTPROC/${FPNTPROC}/g" \ + -e "s/FGRDPROC/${FGRDPROC}/g" \ + -e "s/OUTPARS/${OUTPARS_WAV}/g" \ + -e "s/CPLILINE/${CPLILINE}/g" \ + -e "s/UNIPOINTS/${UNIPOINTS}/g" \ + -e 
"s/GRIDLINE/${gline}/g" \ + -e "s/ICELINE/$ICELINE/g" \ + -e "s/CURRLINE/$CURRLINE/g" \ + -e "s/WINDLINE/$WINDLINE/g" \ + -e "s/ICEFLAG/$ICEFLAG/g" \ + -e "s/CURRFLAG/$CURRFLAG/g" \ + -e "s/WINDFLAG/$WINDFLAG/g" \ + -e "s/RUN_BEG/$time_beg/g" \ + -e "s/RUN_END/$time_end/g" \ + -e "s/OUT_BEG/$time_beg_out/g" \ + -e "s/OUT_END/$time_end/g" \ + -e "s/DTFLD/ $DTFLD_WAV/g" \ + -e "s/GOFILETYPE/ $GOFILETYPE/g" \ + -e "s/POFILETYPE/ $POFILETYPE/g" \ + -e "s/FIELDS/$FIELDS/g" \ + -e "s/DTPNT/ $DTPNT_WAV/g" \ + -e "/BUOY_FILE/r buoy.loc" \ + -e "s/BUOY_FILE/DUMMY/g" \ + -e "s/RST_BEG/$time_rst_ini/g" \ + -e "s/RSTTYPE/$RSTTYPE_WAV/g" \ + -e "s/RST_2_BEG/$time_rst2_ini/g" \ + -e "s/DTRST/$DT_1_RST_WAV/g" \ + -e "s/DT_2_RST/$DT_2_RST_WAV/g" \ + -e "s/RST_END/$time_rst1_end/g" \ + -e "s/RST_2_END/$time_rst2_end/g" \ + ww3_multi.inp.tmpl | \ + sed -n "/DUMMY/!p" > ww3_multi.inp + + rm -f ww3_multi.inp.tmpl buoy.loc + + if [ -f ww3_multi.inp ] + then + echo " Copying file ww3_multi.${WAV_MOD_TAG}.inp to $COMOUT " + cp ww3_multi.inp ${COMOUT}/rundata/ww3_multi.${WAV_MOD_TAG}.$cycle.inp + else + echo "FATAL ERROR: file ww3_multi.${WAV_MOD_TAG}.$cycle.inp NOT CREATED, ABORTING" + err=13;export err;${errchk} + fi + +# 6. Copy rmp grid remapping pre-processed coefficients + + if ls $FIXwave/rmp_src_to_dst_conserv_* 2> /dev/null + then + for file in $(ls $FIXwave/rmp_src_to_dst_conserv_*) ; do + cp -f $file ${COMOUT}/rundata + done + else + msg="NO rmp precomputed nc files found, is this OK???" + postmsg "$jlogfile" "$msg" + set +x + echo ' ' + echo '************************************************ ' + echo '*** FATAL ERROR : NO PRECOMPUTED RMP FILES FOUND *** ' + echo '************************************************ ' + echo ' ' + echo "${WAV_MOD_TAG} prep $date $cycle : rmp*.nc not found." >> $wavelog + echo $msg + [[ "$LOUD" = YES ]] && set -x + err=13;export err;${errchk} + fi + + +# --------------------------------------------------------------------------- # +# 7. Output to /com + + if [ "$SENDCOM" = 'YES' ] + then + + if [ "${WW3ATMINP}" = 'YES' ]; then + + for grdID in $WAVEWND_FID $curvID + do + set +x + echo ' ' + echo " Saving wind.$grdID as $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.wind" + echo " Saving times.$grdID file as $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.$grdID.wind.times" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + cp wind.$grdID $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.wind + cp times.$grdID $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.$grdID.wind.times + done + fi + +# if [ "${WW3CURINP}" = 'YES' ]; then +# +# for grdID in $WAVECUR_FID +# do +# set +x +# echo ' ' +# echo " Saving cur.$grdID as $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.cur" +# echo ' ' +# [[ "$LOUD" = YES ]] && set -x +# cp cur.$grdID $COMOUT/rundata/${WAV_MOD_TAG}.$grdID.$PDY$cyc.cur +# done +# fi + fi + + rm -f wind.* + rm -f $WAVEICE_FID.* + rm -f times.* + +# --------------------------------------------------------------------------- # +# 8. Ending output + + set +x + echo ' ' + echo "Ending at : `date`" + echo ' ' + echo ' *** End of MWW3 preprocessor ***' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + msg="$job completed normally" + postmsg "$jlogfile" "$msg" + +# End of MWW3 preprocessor script ------------------------------------------- # diff --git a/sorc/build_all.sh b/sorc/build_all.sh index 4637eee423..87df344a01 100755 --- a/sorc/build_all.sh +++ b/sorc/build_all.sh @@ -126,18 +126,20 @@ fi } #------------------------------------ -# build gfs_wafs -#------------------------------------ -$Build_gfs_wafs && { -echo " .... Building gfs_wafs .... " -./build_gfs_wafs.sh > $logs_dir/build_gfs_wafs.log 2>&1 -rc=$? -if [[ $rc -ne 0 ]] ; then +# build gfs_wafs - optional checkout +#------------------------------------ +if [ -d gfs_wafs.fd ]; then + $Build_gfs_wafs && { + echo " .... Building gfs_wafs .... " + ./build_gfs_wafs.sh > $logs_dir/build_gfs_wafs.log 2>&1 + rc=$?
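The `build_all.sh` hunk above converts gfs_wafs into an optional component: the build block now runs only when the `gfs_wafs.fd` checkout directory exists, and the build's return code is still accumulated into `err`. A sketch of that guard pattern (the stand-in return code replaces the real build step):

```shell
#!/bin/bash
cd "$(mktemp -d)"

err=0
Build_gfs_wafs=true

# Optional checkout: skip the component cleanly when its directory is absent.
if [ -d gfs_wafs.fd ]; then
  $Build_gfs_wafs && {
    echo " .... Building gfs_wafs .... "
    rc=0               # stand-in for ./build_gfs_wafs.sh's exit status
    err=$(( err + rc ))
  }
else
  echo "gfs_wafs.fd not checked out; skipping gfs_wafs build"
fi

echo "accumulated err=$err"
```

Accumulating per-component return codes into one `err` lets the script report every failed build at the end instead of aborting on the first.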
+ if [[ $rc -ne 0 ]] ; then echo "Fatal error in building gfs_wafs." echo "The log file is in $logs_dir/build_gfs_wafs.log" -fi -((err+=$rc)) + fi + ((err+=$rc)) } +fi #------------------------------------ # build gaussian_sfcanl @@ -268,24 +270,6 @@ if [ $target = wcoss -o $target = wcoss_cray -o $target = wcoss_dell_p3 ]; then } fi -#------------------------------------ -# build prod_util -#------------------------------------ -$Build_prod_util && { -echo " .... prod_util build not currently supported .... " -#echo " .... Building prod_util .... " -#./build_prod_util.sh > $logs_dir/build_prod_util.log 2>&1 -} - -#------------------------------------ -# build grib_util -#------------------------------------ -$Build_grib_util && { -echo " .... grib_util build not currently supported .... " -#echo " .... Building grib_util .... " -#./build_grib_util.sh > $logs_dir/build_grib_util.log 2>&1 -} - #------------------------------------ # Exception Handling #------------------------------------ diff --git a/sorc/build_fv3.sh b/sorc/build_fv3.sh index b39b030f61..5659b53316 100755 --- a/sorc/build_fv3.sh +++ b/sorc/build_fv3.sh @@ -21,5 +21,5 @@ if [ $target = hera ]; then target=hera.intel ; fi cd fv3gfs.fd/ FV3=$( pwd -P )/FV3 cd tests/ -./compile.sh "$FV3" "$target" "NCEP64LEV=Y HYDRO=N 32BIT=Y" 1 YES YES +./compile.sh "$FV3" "$target" "WW3=Y 32BIT=Y" 1 mv -f fv3_1.exe ../NEMS/exe/global_fv3gfs.x diff --git a/sorc/build_grib_util.sh b/sorc/build_grib_util.sh deleted file mode 100755 index 6569cc22c0..0000000000 --- a/sorc/build_grib_util.sh +++ /dev/null @@ -1,88 +0,0 @@ -#! 
/usr/bin/env bash -set -eux - -source ./machine-setup.sh > /dev/null 2>&1 -cwd=`pwd` - -USE_PREINST_LIBS=${USE_PREINST_LIBS:-"true"} -if [ $USE_PREINST_LIBS = true ]; then - export MOD_PATH=/scratch3/NCEPDEV/nwprod/lib/modulefiles - source ../modulefiles/modulefile.grib_util.$target > /dev/null 2>&1 -else - export MOD_PATH=${cwd}/lib/modulefiles - if [ $target = wcoss_cray ]; then - source ../modulefiles/modulefile.grib_util.${target}_userlib > /dev/null 2>&1 - else - source ../modulefiles/modulefile.grib_util.$target > /dev/null 2>&1 - fi -fi - -# Move to util/sorc folder -cd ../util/sorc - -# Check final exec folder exists -if [ ! -d "../exec" ]; then - mkdir ../exec -fi - -for grib_util in cnvgrib copygb2 degrib2 grbindex tocgrib2 tocgrib \ - copygb grb2index grib2grib tocgrib2super -do - cd $grib_util.fd - make -f makefile_$target clean - make -f makefile_$target - make -f makefile_$target install - make -f makefile_$target clean - cd .. -done - -# -# compile wgrib -# -cd wgrib.cd - make -f makefile_$target clean - make -f makefile_$target - make -f makefile_$target install - make -f makefile_$target clean -cd .. 
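The deleted `build_grib_util.sh` above drove each utility through the same make cycle (clean, build, install, clean) inside its own source directory. A minimal sketch of that per-directory loop, with echo stand-ins for the make targets and hypothetical directory names:

```shell
#!/bin/bash
cd "$(mktemp -d)"
mkdir -p cnvgrib.fd degrib2.fd grbindex.fd   # stand-ins for *.fd source dirs

built=0
for grib_util in cnvgrib degrib2 grbindex; do
  cd "$grib_util.fd"
  # Operationally: make -f makefile_$target clean; make -f makefile_$target;
  #                make -f makefile_$target install; make -f makefile_$target clean
  echo "clean+build+install $grib_util"
  built=$(( built + 1 ))
  cd ..
done
echo "processed $built utilities"
```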
- -# -# compile wgrib2 -# -cd $cwd -source ./machine-setup.sh > /dev/null 2>&1 - -if [ $target = wcoss_cray -a $USE_PREINST_LIBS != true ]; then - source ../modulefiles/modulefile.wgrib2.${target}_userlib > /dev/null 2>&1 -else - source ../modulefiles/modulefile.wgrib2.$target > /dev/null 2>&1 -fi - -# Move to util/sorc folder -cd ../util/sorc -cwd=`pwd` - -#---------------------------------------------------------------- -export CPPFLAGS="-ffast-math -O3 -DGFORTRAN" -cd $cwd/wgrib2.cd/gctpc/ -make -f makefile.gctpc clean -make -f makefile.gctpc -rm -f *.o -#---------------------------------------------------------------- -if [ $target = wcoss_cray ]; then - export FFLAGS=-O2 - cd $cwd/wgrib2.cd/iplib/ - make clean - make - rm -f *.o -fi -#---------------------------------------------------------------- -cd $cwd/wgrib2.cd -module list -make -f makefile_$target clean -make -f makefile_$target -make -f makefile_$target install -make -f makefile_$target clean -#---------------------------------------------------------------- - -exit diff --git a/sorc/build_prod_util.sh b/sorc/build_prod_util.sh deleted file mode 100755 index e4220f7c26..0000000000 --- a/sorc/build_prod_util.sh +++ /dev/null @@ -1,47 +0,0 @@ -#! /usr/bin/env bash -set -eux - -source ./machine-setup.sh > /dev/null 2>&1 -cwd=`pwd` - -USE_PREINST_LIBS=${USE_PREINST_LIBS:-"true"} -if [ $USE_PREINST_LIBS = true ]; then - export MOD_PATH=/scratch3/NCEPDEV/nwprod/lib/modulefiles - source ../modulefiles/modulefile.prod_util.$target > /dev/null 2>&1 -else - export MOD_PATH=${cwd}/lib/modulefiles - if [ $target = wcoss_cray ]; then - source ../modulefiles/modulefile.prod_util.${target}_userlib > /dev/null 2>&1 - else - source ../modulefiles/modulefile.prod_util.$target > /dev/null 2>&1 - fi -fi - -# Move to util/sorc folder -cd ../util/sorc - -# Check final exec folder exists -if [ ! 
-d "../exec" ]; then - mkdir ../exec -fi - -for prod_util in fsync_file -do - cd $prod_util.cd - make -f makefile clean - make -f makefile - make -f makefile install - make -f makefile clean - cd .. -done - -for prod_util in mdate ndate nhour -do - cd $prod_util.fd - make -f makefile clean - make -f makefile - make -f makefile install - make -f makefile clean - cd .. -done -exit diff --git a/sorc/checkout.sh b/sorc/checkout.sh index f7ab69f631..90e4cd3a94 100755 --- a/sorc/checkout.sh +++ b/sorc/checkout.sh @@ -9,7 +9,7 @@ if [[ ! -d fv3gfs.fd ]] ; then rm -f ${topdir}/checkout-fv3gfs.log git clone https://github.com/ufs-community/ufs-weather-model fv3gfs.fd >> ${topdir}/checkout-fv3gfs.log 2>&1 cd fv3gfs.fd - git checkout GFS.v16.0.0 + git checkout GFS.v16.0.1 git submodule update --init --recursive cd ${topdir} else @@ -21,7 +21,8 @@ if [[ ! -d gsi.fd ]] ; then rm -f ${topdir}/checkout-gsi.log git clone --recursive gerrit:ProdGSI gsi.fd >> ${topdir}/checkout-gsi.log 2>&1 cd gsi.fd - git checkout gfsda.v16.0.0 +# git checkout gfsda.v16.0.0 + git checkout feature/parallel_ncio git submodule update cd ${topdir} else @@ -77,7 +78,7 @@ if [[ ! -d verif-global.fd ]] ; then rm -f ${topdir}/checkout-verif-global.log git clone --recursive gerrit:EMC_verif-global verif-global.fd >> ${topdir}/checkout-verif-global.log 2>&1 cd verif-global.fd - git checkout verif_global_v1.5.0 + git checkout verif_global_v1.6.0 cd ${topdir} else echo 'Skip. Directory verif-global.fd already exist.' diff --git a/sorc/fv3gfs_build.cfg b/sorc/fv3gfs_build.cfg index b724650bc9..130c6dde03 100644 --- a/sorc/fv3gfs_build.cfg +++ b/sorc/fv3gfs_build.cfg @@ -18,8 +18,6 @@ Building fv3nc2nemsio (fv3nc2nemsio) .................. yes Building regrid_nemsio (regrid_nemsio) ................ yes Building gfs_util (gfs_util) .......................... yes - Building prod_util (prod_util) ........................ no - Building grib_util (grib_util) ........................ 
no # -- END -- diff --git a/sorc/link_fv3gfs.sh b/sorc/link_fv3gfs.sh index 733c856cae..c8364cb3ec 100755 --- a/sorc/link_fv3gfs.sh +++ b/sorc/link_fv3gfs.sh @@ -38,7 +38,7 @@ elif [ $machine = "hera" ]; then FIX_DIR="/scratch1/NCEPDEV/global/glopara/fix" fi cd ${pwd}/../fix ||exit 8 -for dir in fix_am fix_fv3 fix_gldas fix_orog fix_fv3_gmted2010 fix_verif ; do +for dir in fix_am fix_chem fix_fv3 fix_fv3_gmted2010 fix_gldas fix_orog fix_sfc_climo fix_verif fix_wave_gfs ; do [[ -d $dir ]] && rm -rf $dir done $LINK $FIX_DIR/* . @@ -81,10 +81,10 @@ cd ${pwd}/../util ||exit 8 done -#------------------------------ -#--add gfs_wafs link if on Dell -if [ $machine = dell -o $machine = hera ]; then -#------------------------------ +#----------------------------------- +#--add gfs_wafs link if checked out +if [ -d ${pwd}/gfs_wafs.fd ]; then +#----------------------------------- cd ${pwd}/../jobs ||exit 8 $LINK ../sorc/gfs_wafs.fd/jobs/* . cd ${pwd}/../parm ||exit 8 @@ -104,7 +104,10 @@ fi #------------------------------ cd ${pwd}/../jobs ||exit 8 $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ANALYSIS . + $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ANALCALC . + $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ANALDIAG . $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ENKF_SELECT_OBS . + $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ENKF_ANALDIAG . $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ENKF_INNOVATE_OBS . $LINK ../sorc/gsi.fd/jobs/JGLOBAL_ENKF_UPDATE . $LINK ../sorc/gsi.fd/jobs/JGDAS_ENKF_RECENTER . @@ -113,6 +116,8 @@ cd ${pwd}/../jobs ||exit 8 $LINK ../sorc/gsi.fd/jobs/JGDAS_ENKF_POST . cd ${pwd}/../scripts ||exit 8 $LINK ../sorc/gsi.fd/scripts/exglobal_analysis_fv3gfs.sh.ecf . + $LINK ../sorc/gsi.fd/scripts/exglobal_analcalc_fv3gfs.sh.ecf . + $LINK ../sorc/gsi.fd/scripts/exglobal_analdiag_fv3gfs.sh.ecf . $LINK ../sorc/gsi.fd/scripts/exglobal_innovate_obs_fv3gfs.sh.ecf . $LINK ../sorc/gsi.fd/scripts/exglobal_enkf_innovate_obs_fv3gfs.sh.ecf . $LINK ../sorc/gsi.fd/scripts/exglobal_enkf_update_fv3gfs.sh.ecf . 
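The `link_fv3gfs.sh` hunks above gate component links on directory presence (`if [ -d ${pwd}/gfs_wafs.fd ]`) and then `$LINK` each job card and script into the workflow tree. A self-contained sketch of that guard-and-link pattern with stand-in paths and a stand-in job card name:

```shell
#!/bin/bash
cd "$(mktemp -d)"
mkdir -p sorc/gfs_wafs.fd/jobs jobs
touch sorc/gfs_wafs.fd/jobs/JGFS_WAFS   # hypothetical job card

LINK="ln -fs"
pwd_top=$(pwd)

# Link component jobs only when the optional checkout exists.
if [ -d "$pwd_top/sorc/gfs_wafs.fd" ]; then
  cd "$pwd_top/jobs"
  $LINK ../sorc/gfs_wafs.fd/jobs/* .
fi

ls -l "$pwd_top/jobs"
```

Using relative link targets (`../sorc/...`) keeps the whole tree relocatable, which is why the script links from inside the destination directory.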
@@ -183,6 +188,12 @@ cd ${pwd}/../ush ||exit 8 cd $pwd/../exec [[ -s global_fv3gfs.x ]] && rm -f global_fv3gfs.x $LINK ../sorc/fv3gfs.fd/NEMS/exe/global_fv3gfs.x . +if [ -d ../sorc/fv3gfs.fd/WW3/exec ]; then # Wave execs + for waveexe in ww3_gint ww3_grib ww3_grid ww3_multi ww3_ounf ww3_ounp ww3_outf ww3_outp ww3_prep ww3_prnc; do + [[ -s $waveexe ]] && rm -f $waveexe + $LINK ../sorc/fv3gfs.fd/WW3/exec/$waveexe . + done +fi [[ -s global_fv3gfs_ccpp.x ]] && rm -f global_fv3gfs_ccpp.x $LINK ../sorc/fv3gfs_ccpp.fd/NEMS/exe/global_fv3gfs_ccpp.x . @@ -190,7 +201,7 @@ $LINK ../sorc/fv3gfs_ccpp.fd/NEMS/exe/global_fv3gfs_ccpp.x . [[ -s gfs_ncep_post ]] && rm -f gfs_ncep_post $LINK ../sorc/gfs_post.fd/exec/ncep_post gfs_ncep_post -if [ $machine = dell -o $machine = hera ]; then +if [ -d ${pwd}/gfs_wafs.fd ]; then for wafsexe in wafs_awc_wafavn wafs_blending wafs_cnvgrib2 wafs_gcip wafs_makewafs wafs_setmissing; do [[ -s $wafsexe ]] && rm -f $wafsexe $LINK ../sorc/gfs_wafs.fd/exec/$wafsexe . @@ -254,7 +265,7 @@ cd ${pwd}/../sorc || exit 8 done - if [ $machine = dell -o $machine = hera ]; then + if [ -d ${pwd}/gfs_wafs.fd ]; then $SLINK gfs_wafs.fd/sorc/wafs_awc_wafavn.fd wafs_awc_wafavn.fd $SLINK gfs_wafs.fd/sorc/wafs_blending.fd wafs_blending.fd $SLINK gfs_wafs.fd/sorc/wafs_cnvgrib2.fd wafs_cnvgrib2.fd @@ -263,7 +274,6 @@ cd ${pwd}/../sorc || exit 8 $SLINK gfs_wafs.fd/sorc/wafs_setmissing.fd wafs_setmissing.fd fi - for prog in gdas2gldas.fd gldas2gdas.fd gldas_forcing.fd gldas_model.fd gldas_post.fd gldas_rst.fd ;do $SLINK gldas.fd/sorc/$prog $prog done @@ -283,5 +293,3 @@ fi exit 0 - - diff --git a/sorc/partial_build.sh b/sorc/partial_build.sh index 51ddd150bb..f61e0639c4 100755 --- a/sorc/partial_build.sh +++ b/sorc/partial_build.sh @@ -16,9 +16,7 @@ "Build_gfs_bufrsnd" \ "Build_fv3nc2nemsio" \ "Build_regrid_nemsio" \ - "Build_gfs_util" \ - "Build_prod_util" \ - "Build_grib_util") + "Build_gfs_util") # # function parse_cfg: read config file and retrieve the values diff 
--git a/ush/gfs_bfr2gpk.sh b/ush/gfs_bfr2gpk.sh index 8930521201..545d77cd68 100755 --- a/ush/gfs_bfr2gpk.sh +++ b/ush/gfs_bfr2gpk.sh @@ -49,7 +49,7 @@ SNOUTF = ${outfilbase}.snd SFOUTF = ${outfilbase}.sfc SNPRMF = sngfs.prm SFPRMF = sfgfs.prm -TIMSTN = 170/2100 +TIMSTN = 170/2150 r ex diff --git a/ush/hpssarch_gen.sh b/ush/hpssarch_gen.sh index fb7855a32c..8c7f94299b 100755 --- a/ush/hpssarch_gen.sh +++ b/ush/hpssarch_gen.sh @@ -136,6 +136,23 @@ if [ $type = "gfs" ]; then echo "${dirname}RESTART/*0000.sfcanl_data.tile5.nc " >>gfs_restarta.txt echo "${dirname}RESTART/*0000.sfcanl_data.tile6.nc " >>gfs_restarta.txt + #.................. + if [ $DO_WAVE = "YES" ]; then + + rm -rf gfswave.txt + touch gfswave.txt + + dirpath="gfswave.${PDY}/${cyc}/" + dirname="./${dirpath}" + + head="gfswave.t${cyc}z." + + #........................... + echo "${dirname}gridded/${head}* " >>gfswave.txt + echo "${dirname}station/${head}* " >>gfswave.txt + + fi + #----------------------------------------------------- fi ##end of gfs #----------------------------------------------------- @@ -233,6 +250,28 @@ if [ $type = "gdas" ]; then #.................. echo "${dirname}RESTART " >>gdas_restartb.txt + #.................. + if [ $DO_WAVE = "YES" ]; then + + rm -rf gdaswave.txt + touch gdaswave.txt + rm -rf gdaswave_restart.txt + touch gdaswave_restart.txt + + dirpath="gdaswave.${PDY}/${cyc}/" + dirname="./${dirpath}" + + head="gdaswave.t${cyc}z." + + #........................... 
+ echo "${dirname}gridded/${head}* " >>gdaswave.txt + echo "${dirname}station/${head}* " >>gdaswave.txt + + echo "${dirname}restart/* " >>gdaswave_restart.txt + + fi + + #----------------------------------------------------- fi ##end of gdas #----------------------------------------------------- @@ -373,6 +412,5 @@ if [ $type = "enkfgdas" -o $type = "enkfgfs" ]; then fi ##end of enkfgdas or enkfgfs #----------------------------------------------------- - exit 0 diff --git a/ush/rocoto/setup_workflow.py b/ush/rocoto/setup_workflow.py index 2cb10d60c5..61b5b27434 100755 --- a/ush/rocoto/setup_workflow.py +++ b/ush/rocoto/setup_workflow.py @@ -42,10 +42,27 @@ def main(): print 'input arg: --expdir = %s' % repr(args.expdir) sys.exit(1) - gfs_steps = ['prep', 'anal', 'gldas', 'fcst', 'postsnd', 'post', 'awips', 'gempak', 'vrfy', 'metp', 'arch'] - hyb_steps = ['eobs', 'eomg', 'eupd', 'ecen', 'esfc', 'efcs', 'epos', 'earc'] + gfs_steps = ['prep', 'anal', 'analdiag', 'analcalc', 'gldas', 'fcst', 'postsnd', 'post', 'vrfy', 'arch'] + gfs_steps_gempak = ['gempak'] + gfs_steps_awips = ['awips'] + #hyb_steps = ['eobs', 'eomg', 'eupd', 'ecen', 'efcs', 'epos', 'earc'] + metp_steps = ['metp'] + wav_steps = ['waveinit', 'waveprep', 'wavepostsbs'] + #Implement additional wave jobs at later date + #wav_steps = ['waveinit', 'waveprep', 'wavepostsbs', 'wavepost', 'wavestat'] + #wav_steps_gempak = ['wavegempaksbs'] + #wav_steps_awips = ['waveawipssbs', 'waveawips'] +# From gfsv16b latest +# gfs_steps = ['prep', 'anal', 'gldas', 'fcst', 'postsnd', 'post', 'awips', 'gempak', 'vrfy', 'metp', 'arch'] + hyb_steps = ['eobs', 'ediag', 'eomg', 'eupd', 'ecen', 'esfc', 'efcs', 'epos', 'earc'] steps = gfs_steps + hyb_steps if _base.get('DOHYBVAR', 'NO') == 'YES' else gfs_steps + steps = steps + metp_steps if _base.get('DO_METP', 'NO') == 'YES' else steps + steps = steps + gfs_steps_gempak if _base.get('DO_GEMPAK', 'NO') == 'YES' else steps + steps = steps + gfs_steps_awips if _base.get('DO_AWIPS', 
'NO') == 'YES' else steps + steps = steps + wav_steps if _base.get('DO_WAVE', 'NO') == 'YES' else steps + #steps = steps + wav_steps_gempak if _base.get('DO_GEMPAK', 'NO') == 'YES' else steps + #steps = steps + wav_steps_awips if _base.get('DO_AWIPS', 'NO') == 'YES' else steps dict_configs = wfu.source_configs(configs, steps) @@ -216,23 +233,40 @@ def get_gdasgfs_resources(dict_configs, cdump='gdas'): do_bufrsnd = base.get('DO_BUFRSND', 'NO').upper() do_gempak = base.get('DO_GEMPAK', 'NO').upper() do_awips = base.get('DO_AWIPS', 'NO').upper() + do_metp = base.get('DO_METP', 'NO').upper() do_gldas = base.get('DO_GLDAS', 'NO').upper() + do_wave = base.get('DO_WAVE', 'NO').upper() + do_wave_cdump = base.get('WAVE_CDUMP', 'BOTH').upper() reservation = base.get('RESERVATION', 'NONE').upper() #tasks = ['prep', 'anal', 'fcst', 'post', 'vrfy', 'arch'] - tasks = ['prep', 'anal'] + tasks = ['prep', 'anal', 'analcalc'] + if cdump in ['gdas']: + tasks += ['analdiag'] if cdump in ['gdas'] and do_gldas in ['Y', 'YES']: tasks += ['gldas'] + if cdump in ['gdas'] and do_wave in ['Y', 'YES'] and do_wave_cdump in ['GDAS', 'BOTH']: + #tasks += ['waveinit', 'waveprep', 'wavepostsbs', 'wavepost', 'wavestat'] + tasks += ['waveinit', 'waveprep', 'wavepostsbs'] - tasks += ['fcst', 'post', 'vrfy', 'metp', 'arch'] + tasks += ['fcst', 'post', 'vrfy', 'arch'] + if cdump in ['gfs'] and do_wave in ['Y', 'YES'] and do_wave_cdump in ['GFS', 'BOTH']: + #tasks += ['waveinit', 'waveprep', 'wavepostsbs', 'wavepost', 'wavestat'] + tasks += ['waveinit', 'waveprep', 'wavepostsbs'] if cdump in ['gfs'] and do_bufrsnd in ['Y', 'YES']: tasks += ['postsnd'] if cdump in ['gfs'] and do_gempak in ['Y', 'YES']: tasks += ['gempak'] + #if cdump in ['gfs'] and do_wave in ['Y', 'YES'] and do_gempak in ['Y', 'YES']: + # tasks += ['wavegempaksbs'] if cdump in ['gfs'] and do_awips in ['Y', 'YES']: tasks += ['awips'] + if cdump in ['gfs'] and do_metp in ['Y', 'YES']: + tasks += ['metp'] + #if cdump in ['gfs'] and do_wave 
in ['Y', 'YES'] and do_awips in ['Y', 'YES']: + # tasks += ['waveawipssbs', 'waveawips'] dict_resources = OrderedDict() @@ -273,9 +307,10 @@ def get_hyb_resources(dict_configs): dict_resources = OrderedDict() # These tasks can be run in either or both cycles - tasks1 = ['eobs', 'eomg', 'eupd'] if lobsdiag_forenkf in ['.T.', '.TRUE.']: - tasks1.remove('eomg') + tasks1 = ['eobs', 'ediag', 'eupd'] + else: + tasks1 = ['eobs', 'eomg', 'eupd'] if eupd_cyc in ['BOTH']: cdumps = ['gfs', 'gdas'] @@ -355,7 +390,10 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): do_bufrsnd = base.get('DO_BUFRSND', 'NO').upper() do_gempak = base.get('DO_GEMPAK', 'NO').upper() do_awips = base.get('DO_AWIPS', 'NO').upper() + do_metp = base.get('DO_METP', 'NO').upper() do_gldas = base.get('DO_GLDAS', 'NO').upper() + do_wave = base.get('DO_WAVE', 'NO').upper() + do_wave_cdump = base.get('WAVE_CDUMP', 'BOTH').upper() dumpsuffix = base.get('DUMP_SUFFIX', '') gridsuffix = base.get('SUFFIX', '') @@ -386,6 +424,34 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%sprep' % cdump] = task + # wave tasks in gdas or gfs or both + if do_wave_cdump in ['BOTH']: + cdumps = ['gfs', 'gdas'] + elif do_wave_cdump in ['GFS']: + cdumps = ['gfs'] + elif do_wave_cdump in ['GDAS']: + cdumps = ['gdas'] + + # waveinit + if do_wave in ['Y', 'YES'] and cdump in cdumps: + deps = [] + dep_dict = {'type': 'task', 'name': '%sprep' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'cycleexist', 'condition': 'not', 'offset': '-06:00:00'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + task = wfu.create_wf_task('waveinit', cdump=cdump, envar=envars, dependency=dependencies) + dict_tasks['%swaveinit' % cdump] = task + + # waveprep + if do_wave in ['Y', 'YES'] and cdump in cdumps: + deps = [] + dep_dict = {'type': 'task', 'name': '%swaveinit' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + 
dependencies = rocoto.create_dependency(dep=deps) + task = wfu.create_wf_task('waveprep', cdump=cdump, envar=envars, dependency=dependencies) + dict_tasks['%swaveprep' % cdump] = task + # anal deps = [] dep_dict = {'type': 'task', 'name': '%sprep' % cdump} @@ -400,6 +466,31 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%sanal' % cdump] = task + # analcalc + deps = [] + data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loginc.txt' % (cdump, cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'task', 'name': '%sanal' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + task = wfu.create_wf_task('analcalc', cdump=cdump, envar=envars, dependency=dependencies) + + dict_tasks['%sanalcalc' % cdump] = task + + # analdiag + if cdump in ['gdas']: + deps = [] + data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loginc.txt' % (cdump, cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type': 'task', 'name': '%sanal' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + task = wfu.create_wf_task('analdiag', cdump=cdump, envar=envars, dependency=dependencies) + + dict_tasks['%sanaldiag' % cdump] = task + # gldas if cdump in ['gdas'] and do_gldas in ['Y', 'YES']: deps = [] @@ -415,24 +506,26 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): # fcst deps = [] - data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loginc.txt' % (cdump, cdump) - dep_dict = {'type': 'data', 'data': data} - deps.append(rocoto.add_dependency(dep_dict)) + #data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loginc.txt' % (cdump, cdump) + #dep_dict = {'type': 'data', 'data': data} +# #deps.append(rocoto.add_dependency(dep_dict)) + if do_wave in ['Y', 'YES'] and cdump in cdumps: + dep_dict = {'type': 'task', 'name': '%swaveprep' % cdump} + 
deps.append(rocoto.add_dependency(dep_dict)) if cdump in ['gdas']: if do_gldas in ['Y', 'YES']: dep_dict = {'type': 'task', 'name': '%sgldas' % cdump} deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) else: - dep_dict = {'type': 'task', 'name': '%sanal' % cdump} + dep_dict = {'type': 'task', 'name': '%sanalcalc' % cdump} deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'cycleexist', 'condition': 'not', 'offset': '-06:00:00'} - deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='or', dep=deps) + # dep_dict = {'type': 'cycleexist', 'condition': 'not', 'offset': '-06:00:00'} + # deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='or',dep=deps) elif cdump in ['gfs']: dep_dict = {'type': 'task', 'name': '%sanal' % cdump} deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='or',dep=deps) + dependencies = rocoto.create_dependency(dep_condition='and',dep=deps) task = wfu.create_wf_task('fcst', cdump=cdump, envar=envars, dependency=dependencies) dict_tasks['%sfcst' % cdump] = task @@ -457,6 +550,83 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%spost' % cdump] = task + # wavepostsbs + if do_wave in ['Y', 'YES'] and cdump in cdumps: + deps = [] + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.glo_10m.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.aoc_9km.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.ant_9km.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = 
rocoto.create_dependency(dep_condition='and', dep=deps) + task = wfu.create_wf_task('wavepostsbs', cdump=cdump, envar=envars, dependency=dependencies) + dict_tasks['%swavepostsbs' % cdump] = task + + # wavegempaksbs + #if do_wave in ['Y', 'YES'] and do_gempak in ['Y', 'YES'] and cdump in ['gfs']: + # deps = [] + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.glo_10m.10m.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.aoc_9km.9km.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.ant_9km.9km.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + # task = wfu.create_wf_task('wavegempaksbs', cdump=cdump, envar=envars, dependency=dependencies) + # dict_tasks['%swavegempaksbs' % cdump] = task + + # waveawipssbs + #if do_wave in ['Y', 'YES'] and do_awips in ['Y', 'YES'] and cdump in ['gfs']: + # deps = [] + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.glo_10m.10m.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.aoc_9km.9km.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/%swave.@Y@m@d/@H/%swave.t@Hz.ant_9km.9km.f000.grib2' % (cdump,cdump) + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + # task = wfu.create_wf_task('waveawipssbs', cdump=cdump, envar=envars, dependency=dependencies) + # dict_tasks['%swaveawipssbs' % cdump] = task + + # wavepost + 
#if do_wave in ['Y', 'YES'] and cdump in cdumps: + # deps = [] + # dep_dict = {'type':'task', 'name':'%sfcst' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dep_dict = {'type':'task', 'name':'%swavepostsbs' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + # task = wfu.create_wf_task('wavepost', cdump=cdump, envar=envars, dependency=dependencies) + # dict_tasks['%swavepost' % cdump] = task + + # waveawips + #if do_wave in ['Y', 'YES'] and do_awips in ['Y', 'YES'] and cdump in ['gfs']: + # deps = [] + # dep_dict = {'type':'task', 'name':'%swavepost' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('waveawips', cdump=cdump, envar=envars, dependency=dependencies) + # dict_tasks['%swaveawips' % cdump] = task + + # wavestat + #if do_wave in ['Y', 'YES'] and cdump in cdumps: + # deps = [] + # dep_dict = {'type':'task', 'name':'%swavepost' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('wavestat', cdump=cdump, envar=envars, dependency=dependencies) + # dict_tasks['%swavestat' % cdump] = task + # vrfy deps = [] dep_dict = {'type': 'metatask', 'name': '%spost' % cdump} @@ -467,7 +637,7 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%svrfy' % cdump] = task # metp - if cdump in ['gfs']: + if cdump in ['gfs'] and do_metp in ['Y', 'YES']: deps = [] dep_dict = {'type':'metatask', 'name':'%spost' % cdump} deps.append(rocoto.add_dependency(dep_dict)) @@ -482,8 +652,8 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): metatask='metp', varname=varname1, varval=varval1) dict_tasks['%smetp' % cdump] = task + #postsnd if cdump in ['gfs'] and do_bufrsnd in ['Y', 'YES']: - #postsnd deps = [] dep_dict = {'type': 'task', 'name': '%sfcst' % cdump} 
deps.append(rocoto.add_dependency(dep_dict)) @@ -492,8 +662,8 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%spostsnd' % cdump] = task + # awips if cdump in ['gfs'] and do_awips in ['Y', 'YES']: - # awips deps = [] data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.sfluxgrb#dep#.grib2.idx' % (cdump, cdump) dep_dict = {'type': 'data', 'data': data} @@ -513,8 +683,8 @@ def get_gdasgfs_tasks(dict_configs, cdump='gdas'): dict_tasks['%sawips' % cdump] = task + # gempak if cdump in ['gfs'] and do_gempak in ['Y', 'YES']: - # gempak deps = [] dep_dict = {'type': 'metatask', 'name': '%spost' % cdump} deps.append(rocoto.add_dependency(dep_dict)) @@ -613,12 +783,22 @@ def get_hyb_tasks(dict_configs, cycledef='enkf'): dict_tasks['%seomn' % cdump] = task + # ediag + else: + deps = [] + dep_dict = {'type': 'task', 'name': '%seobs' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + task = wfu.create_wf_task('ediag', cdump=cdump, envar=envars1, dependency=dependencies, cycledef=cycledef) + + dict_tasks['%sediag' % cdump] = task + # eupd deps = [] if lobsdiag_forenkf in ['.F.', '.FALSE.']: dep_dict = {'type': 'metatask', 'name': '%seomn' % cdump} else: - dep_dict = {'type': 'task', 'name': '%seobs' % cdump} + dep_dict = {'type': 'task', 'name': '%sediag' % cdump} deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep=deps) task = wfu.create_wf_task('eupd', cdump=cdump, envar=envars1, dependency=dependencies, cycledef=cycledef) @@ -636,7 +816,7 @@ def get_hyb_tasks(dict_configs, cycledef='enkf'): data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loganl.txt' % (cdump, cdump) dep_dict = {'type': 'data', 'data': data} deps1.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'task', 'name': '%sanal' % cdump} + dep_dict = {'type': 'task', 'name': '%sanalcalc' % cdump} deps1.append(rocoto.add_dependency(dep_dict)) dependencies1 = rocoto.create_dependency(dep_condition='or', dep=deps1) @@ 
-662,7 +842,7 @@ def get_hyb_tasks(dict_configs, cycledef='enkf'): data = '&ROTDIR;/%s.@Y@m@d/@H/%s.t@Hz.loganl.txt' % (cdump, cdump) dep_dict = {'type': 'data', 'data': data} deps1.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type': 'task', 'name': '%sanal' % cdump} + dep_dict = {'type': 'task', 'name': '%sanalcalc' % cdump} deps1.append(rocoto.add_dependency(dep_dict)) dependencies1 = rocoto.create_dependency(dep_condition='or', dep=deps1) @@ -918,7 +1098,7 @@ def create_xml(dict_configs): dict_hyb_tasks = get_hyb_tasks(dict_configs) # Removes &MEMORY_JOB_DUMP post mortem from hyb tasks - hyp_tasks = {'gdaseobs':'gdaseobs', 'gdaseomg':'gdaseomn', 'gdaseupd':'gdaseupd','gdasecen':'gdasecmn','gdasesfc':'gdasesfc','gdasefcs':'gdasefmn','gdasepos':'gdasepmn','gdasearc':'gdaseamn'} + hyp_tasks = {'gdaseobs':'gdaseobs', 'gdasediag':'gdasediag', 'gdaseomg':'gdaseomn', 'gdaseupd':'gdaseupd','gdasecen':'gdasecmn','gdasesfc':'gdasesfc','gdasefcs':'gdasefmn','gdasepos':'gdasepmn','gdasearc':'gdaseamn'} for each_task, each_resource_string in dict_hyb_resources.iteritems(): #print each_task,hyp_tasks[each_task] #print dict_hyb_tasks[hyp_tasks[each_task]] diff --git a/ush/rocoto/setup_workflow_fcstonly.py b/ush/rocoto/setup_workflow_fcstonly.py index f5b29ab252..8556369827 100755 --- a/ush/rocoto/setup_workflow_fcstonly.py +++ b/ush/rocoto/setup_workflow_fcstonly.py @@ -27,8 +27,8 @@ import rocoto import workflow_utils as wfu - -taskplan = ['getic', 'fv3ic', 'fcst', 'post', 'vrfy', 'metp', 'arch'] +#taskplan = ['getic', 'fv3ic', 'waveinit', 'waveprep', 'fcst', 'post', 'wavepostsbs', 'wavegempaksbs', 'waveawipssbs', 'wavepost', 'waveawips', 'wavestat', 'vrfy', 'metp', 'arch'] +taskplan = ['getic', 'fv3ic', 'waveinit', 'waveprep', 'fcst', 'post', 'wavepostsbs', 'vrfy', 'metp', 'arch'] def main(): parser = ArgumentParser(description='Setup XML workflow and CRONTAB for a forecast only experiment.', formatter_class=ArgumentDefaultsHelpFormatter) @@ -156,6 +156,11 @@ def 
get_resources(dict_configs, cdump='gdas'): reservation = base.get('RESERVATION', 'NONE').upper() scheduler = wfu.get_scheduler(machine) + do_wave = base.get('DO_WAVE', 'NO').upper() + do_gempak = base.get('DO_GEMPAK', 'NO').upper() + do_awips = base.get('DO_AWIPS', 'NO').upper() + do_metp = base.get('DO_METP', 'NO').upper() + for task in taskplan: cfg = dict_configs[task] @@ -225,6 +230,12 @@ def get_workflow(dict_configs, cdump='gdas'): envars.append(rocoto.create_envar(name='PDY', value='@Y@m@d')) envars.append(rocoto.create_envar(name='cyc', value='@H')) + base = dict_configs['base'] + do_wave = base.get('DO_WAVE', 'NO').upper() + do_gempak = base.get('DO_GEMPAK', 'NO').upper() + do_awips = base.get('DO_AWIPS', 'NO').upper() + do_metp = base.get('DO_METP', 'NO').upper() + tasks = [] # getics @@ -247,7 +258,7 @@ def get_workflow(dict_configs, cdump='gdas'): tasks.append(task) tasks.append('\n') - # chgres + # chgres fv3ic deps = [] data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/&CDUMP;.@Y@m@d/@H/siganl.&CDUMP;.@Y@m@d@H' dep_dict = {'type':'data', 'data':data} @@ -282,6 +293,22 @@ def get_workflow(dict_configs, cdump='gdas'): tasks.append(task) tasks.append('\n') + # waveinit + if do_wave in ['Y', 'YES']: + task = wfu.create_wf_task('waveinit', cdump=cdump, envar=envars) + tasks.append(task) + tasks.append('\n') + + # waveprep + if do_wave in ['Y', 'YES']: + deps = [] + dep_dict = {'type': 'task', 'name': '%swaveinit' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + task = wfu.create_wf_task('waveprep', cdump=cdump, envar=envars, dependency=dependencies) + tasks.append(task) + tasks.append('\n') + # fcst deps = [] data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/&CASE;/INPUT/gfs_data.tile6.nc' @@ -290,6 +317,9 @@ def get_workflow(dict_configs, cdump='gdas'): data = '&ICSDIR;/@Y@m@d@H/&CDUMP;/&CASE;/INPUT/sfc_data.tile6.nc' dep_dict = {'type':'data', 'data':data} deps.append(rocoto.add_dependency(dep_dict)) + if do_wave in ['Y', 
'YES']: + dep_dict = {'type': 'task', 'name': '%swaveprep' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) task = wfu.create_wf_task('fcst', cdump=cdump, envar=envars, dependency=dependencies) tasks.append(task) @@ -313,6 +343,89 @@ def get_workflow(dict_configs, cdump='gdas'): tasks.append(task) tasks.append('\n') + # wavepostsbs + if do_wave in ['Y', 'YES']: + deps = [] + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.glo_10m.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.aoc_9km.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + data = '&ROTDIR;/%swave.@Y@m@d/@H/rundata/%swave.out_grd.ant_9km.@Y@m@d.@H0000' % (cdump,cdump) + dep_dict = {'type': 'data', 'data': data} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep=deps) + task = wfu.create_wf_task('wavepostsbs', cdump=cdump, envar=envars, dependency=dependencies) + tasks.append(task) + tasks.append('\n') + + # wavepost + #if do_wave in ['Y', 'YES']: + # deps = [] + # dep_dict = {'type':'task', 'name':'%sfcst' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dep_dict = {'type':'task', 'name':'%swavepostsbs' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('wavepost', cdump=cdump, envar=envars, dependency=dependencies) + # tasks.append(task) + # tasks.append('\n') + + # waveawips + #if do_wave in ['Y', 'YES'] and do_awips in ['Y', 'YES']: + # deps = [] + # dep_dict = {'type':'task', 'name':'%swavepost' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('waveawips', cdump=cdump, 
envar=envars, dependency=dependencies) + # tasks.append(task) + # tasks.append('\n') + + # wavestat + #if do_wave in ['Y', 'YES']: + # deps = [] + # dep_dict = {'type':'task', 'name':'%swavepost' % cdump} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('wavestat', cdump=cdump, envar=envars, dependency=dependencies) + # tasks.append(task) + # tasks.append('\n') + + # wavegempaksbs + #if do_wave in ['Y', 'YES'] and do_gempak in ['Y', 'YES']: + # deps = [] + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.glo_10m.10m.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.aoc_9km.9km.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.ant_9km.9km.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('wavegempaksbs', cdump=cdump, envar=envars, dependency=dependencies) + # tasks.append(task) + # tasks.append('\n') + + # waveawipssbs + #if do_wave in ['Y', 'YES'] and do_awips in ['Y', 'YES']: + # deps = [] + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.glo_10m.10m.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.aoc_9km.9km.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # data = '&ROTDIR;/wave.@Y@m@d/@H/wave.t@Hz.ant_9km.9km.f000.grib2' + # dep_dict = {'type': 'data', 'data': data} + # deps.append(rocoto.add_dependency(dep_dict)) + # dependencies = rocoto.create_dependency(dep=deps) + # task = wfu.create_wf_task('waveawipssbs', cdump=cdump, envar=envars, dependency=dependencies) + # tasks.append(task) 
+ # tasks.append('\n') + # vrfy deps = [] dep_dict = {'type':'metatask', 'name':'%spost' % cdump} @@ -323,20 +436,21 @@ def get_workflow(dict_configs, cdump='gdas'): tasks.append('\n') # metp - deps = [] - dep_dict = {'type':'metatask', 'name':'%spost' % cdump} - deps.append(rocoto.add_dependency(dep_dict)) - dep_dict = {'type':'task', 'name':'%sarch' % cdump, 'offset':'-&INTERVAL;'} - deps.append(rocoto.add_dependency(dep_dict)) - dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) - metpcase = rocoto.create_envar(name='METPCASE', value='#metpcase#') - metpenvars = envars + [metpcase] - varname1 = 'metpcase' - varval1 = 'g2g1 g2o1 pcp1' - task = wfu.create_wf_task('metp', cdump=cdump, envar=metpenvars, dependency=dependencies, - metatask='metp', varname=varname1, varval=varval1) - tasks.append(task) - tasks.append('\n') + if do_metp in ['Y', 'YES']: + deps = [] + dep_dict = {'type':'metatask', 'name':'%spost' % cdump} + deps.append(rocoto.add_dependency(dep_dict)) + dep_dict = {'type':'task', 'name':'%sarch' % cdump, 'offset':'-&INTERVAL;'} + deps.append(rocoto.add_dependency(dep_dict)) + dependencies = rocoto.create_dependency(dep_condition='and', dep=deps) + metpcase = rocoto.create_envar(name='METPCASE', value='#metpcase#') + metpenvars = envars + [metpcase] + varname1 = 'metpcase' + varval1 = 'g2g1 g2o1 pcp1' + task = wfu.create_wf_task('metp', cdump=cdump, envar=metpenvars, dependency=dependencies, + metatask='metp', varname=varname1, varval=varval1) + tasks.append(task) + tasks.append('\n') # arch deps = [] diff --git a/ush/wave_ens_bull.sh b/ush/wave_ens_bull.sh new file mode 100755 index 0000000000..0aac9d4fa2 --- /dev/null +++ b/ush/wave_ens_bull.sh @@ -0,0 +1,261 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_ens_bull.sh +# Script description: Create buoy bulletin for NCEP Global Wave Ensemble +# +# Author: Jose-Henrique 
Alves Org: NCEP/EMC Date: 2014-01-16 +# Abstract: Creates bulletin for NCEP Global Wave Ensemble using grib2 data. +# Values at buoy locations are extracted using wgrib2 bi-linear +# interpolation (-new_grid) and requires IPOLATES lib. +# +# Script history log: +# 2019-05-06 J-Henrique Alves First Version. +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# +# 0. Preparations +# 0.a Basic modes of operation +# + seton='-xa' + setoff='+xa' + set $seton + + echo -e '\n ******************************************' + echo ' *** WAVE ENSEMBLE BUOY BULLETIN SCRIPT ***' + echo -e ' ******************************************\n' + echo " Starting at : `date`" +# +# 0.b External dependencies and paths +# + export wgrib2=$utilexec/wgrib2 + scripname=wave_ens_bull.sh +# +# 0.b Date and time stuff +# + export YMD=$PDY + export YMDH=${PDY}${cyc} + export tcycz=t${cyc}z +# +# 0.c Buoy location parameters (from stdin) +# + blon=$1 + blat=$2 + bnom=$3 + bfil=$4 +# +# 0.d Plumbing +# + BULLdir=${bnom}_bull + rm -rf $BULLdir + mkdir -p $BULLdir + err=$? + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '******************************************************* ' + echo " FATAL ERROR: NOT ABLE TO CREATE TEMP DIR ${BULLdir} " + echo '******************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + ../postmsg "$jlogfile" "FATAL ERROR in ${scripname}: Could not create temp directory" + exit 1 + fi + + cd ${BULLdir} +# +# 0.e Output file names +# + bfil="${COMPONENTwave}.${bnom}.bull" + tfil="${COMPONENTwave}.${bnom}.ts" +# +# 1. 
Prepare input data +# +# 1.a Interpolate from gribfile at model res to high resolution at buoy location +# (wgrib2 + IPOLATES -> bi-linear) +# + $utilexec/wgrib2 ../gribfile -new_grid_winds earth \ + -new_grid_interpolation bilinear -new_grid latlon \ + ${blon}:2:.01 ${blat}:2:.01 grbint.${bnom} \ + 1> buoy_interp.out 2>&1 +# + if ! [ -f grbint.${bnom} ] + then + set +x + echo ' ' + echo '******************************************************* ' + echo " FATAL ERROR: FAILED TO CREATE FILE grbint.${bnom} " + echo '******************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + ../postmsg "$jlogfile" "FATAL ERROR creating grbint.${bnom} in ${scripname}" + exit 2 + fi +# +# 1.b Extract parameters at buoy locations from higher res interpolated file +# + valpdy=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match mean -vt \ + | sed 's/[,=]/ /g' | awk '{print $NF}' | cut -c1-8`) + vald=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match mean -vt \ + | sed 's/[,=]/ /g' | awk '{print $NF}' | cut -c7-8`) + valt=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match mean -vt \ + | sed 's/[,=]/ /g' | awk '{print $NF}' | cut -c9-10`) + hsb=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match mean -lon \ + ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + hspb=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match spread \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + tpb=(`$utilexec/wgrib2 grbint.${bnom} -match PERPW -match mean -lon \ + ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + tspb=(`$utilexec/wgrib2 grbint.${bnom} -match PERPW -match spread \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + ub=(`$utilexec/wgrib2 grbint.${bnom} -match WIND -match mean -lon \ + ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + usb=(`$utilexec/wgrib2 grbint.${bnom} -match WIND -match spread -lon \ + ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p1b=(`$utilexec/wgrib2 
grbint.${bnom} -match HTSGW -match 'prob >0.6' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p2b=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match 'prob >1' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p3b=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match 'prob >2' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p4b=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match 'prob >5.5' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p5b=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match 'prob >7' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) + p6b=(`$utilexec/wgrib2 grbint.${bnom} -match HTSGW -match 'prob >9' \ + -lon ${blon} ${blat} | sed 's/[,=]/ /g' | awk '{print $NF}'`) +# +# Length of parameter vectors +# + tlen=`echo ${hsb[@]} | wc -w` +# +# Check for error in reading parameters from interpolated file +# + if [ ! $vald ] || [ ! $valt ] || [ ! $hsb ] || [ ! $hspb ] || [ ! $tpb ] || \ + [ ! $tspb ] || [ ! $ub ] || [ ! $usb ] || [ ! $p1b ] || [ ! $p2b ] || \ + [ ! $p3b ] || [ ! $p4b ] || [ ! $p5b ] || [ ! 
$p6b ] + then + set +x + echo ' ' + echo '******************************************************* ' + echo " FATAL ERROR: FAILED TO READ PARAMS FROM grbint.${bnom} " + echo '******************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + ../postmsg "$jlogfile" "FATAL ERROR reading parameters from grbint.${bnom} in ${scripname}" + exit 3 + fi +# +# Warning if any parameter has UNDEF value +# + UNDF=9.999e+20 + UNDFCHK=`echo ${hsb[@]} ${hspb[@]} ${tpb[@]} ${tspb[@]} ${ub[@]} ${usb[@]} \ + ${p1b[@]} ${p2b[@]} ${p3b[@]} ${p4b[@]} ${p5b[@]} ${p6b[@]}` + if [ `echo $UNDFCHK | grep $UNDF | cut -c1` ] + then + set +x + echo ' ' + echo '******************************************************* ' + echo " WARNING: PARAMETER IS UNDEFINED IN grbint.${bnom} " + echo '******************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + ../postmsg "$jlogfile" "WARNING: parameter is UNDEFINED in grbint.${bnom} in ${scripname}" + fi +# +# 2. 
Generate bulletin +# + printf "\n Location : "$bnom" ("$blat"N "$blon"W)\n" > $bfil + printf " Model : NCEP Global Wave Ensemble System (${COMPONENTwave})\n" >> $bfil + printf " Cycle : "$PDY" "$cycle" UTC\n" >> $bfil + printf "\n+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+\n" >> $bfil + printf "| day | Hs avg | Hs spr | Tp avg | Tp spr | U10avg | U10spr | P(Hs>) | P(Hs>) | P(Hs>) | P(Hs>) | P(Hs>) | P(Hs>) |\n" >> $bfil + printf "| hour | (m) | (m) | (s) | (m) | (m/s) | (m/s) | 1.00m | 2.00m | 3.00m | 5.50m | 7.00m | 9.00m |\n" >> $bfil + printf "+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+\n" >> $bfil + + for (( it=1; it<=$tlen; it++ )) + do + tdum=`expr ${valt[$it-1]} / 1` + ddum=`expr ${vald[$it-1]} / 1` + printf '| %2.2i %2.2i' $ddum $tdum >> $bfil + printf ' | %5.2f ' \ + ${hsb[$it-1]:0:4} \ + ${hspb[$it-1]:0:4} \ + ${tpb[$it-1]:0:4} \ + ${tspb[$it-1]:0:4} \ + ${ub[$it-1]:0:4} \ + ${usb[$it-1]:0:4} \ + ${p1b[$it-1]:0:4} \ + ${p2b[$it-1]:0:4} \ + ${p3b[$it-1]:0:4} \ + ${p4b[$it-1]:0:4} \ + ${p5b[$it-1]:0:4} \ + ${p6b[$it-1]:0:4} >> $bfil + printf ' |\n' >> $bfil + done + + printf "+-------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+--------+\n" >> $bfil + printf " Hs : Significant wave height\n" >> $bfil + printf " Tp : Peak period\n" >> $bfil + printf " U10 : Wind speed at a height of 10m above the surface\n" >> $bfil + printf " avg : Average of ensemble members\n" >> $bfil + printf " spr : Spread (standard deviation) of ensemble members\n" >> $bfil + printf " P(Hs >): Probability of Hs exceeding given threshold\n" >> $bfil + printf " NOAA/NWS/NCEP Marine Modeling and Analysis Branch, $PDY" >> $bfil +# +# 2.b Create time series output +# + printf " date hour Hs avg Hs spr Tp avg Tp spr U10avg U10spr \n" >> $tfil + printf " (m) (m) (s) (s) (m/s) (m/s) \n" >> 
$tfil + printf " ----------------------------------------------------- \n" >> $tfil + for (( it=1; it<=$tlen; it++ )) + do + tdum=`expr ${valt[$it-1]} / 1` + printf ' %8.8i %2.2i' ${valpdy[$it-1]} $tdum >> $tfil + printf ' %5.2f ' \ + ${hsb[$it-1]:0:4} \ + ${hspb[$it-1]:0:4} \ + ${tpb[$it-1]:0:4} \ + ${tspb[$it-1]:0:4} \ + ${ub[$it-1]:0:4} \ + ${usb[$it-1]:0:4} >> $tfil + printf '\n' >> $tfil + done +# +# 2.c Check for errors in creating bulletin file +# + if [ -f ${bfil} ] && [ -f ${tfil} ] + then + echo -e "\n ${COMPONENTwave} bulletin and ts-file created for location ${bnom}.\n" + else + set +x + echo ' ' + echo '******************************************************* ' + echo "*** FATAL ERROR: BULL/TS FILES AT ${bnom} NOT FOUND ***" + echo '******************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + ../postmsg "$jlogfile" "FATAL ERROR : BULL/TS FILES NOT FOUND" + exit 4 + fi +# +# 3. Copy and Cleanup +# + mv -f ${bfil} ../. + mv -f ${tfil} ../. + rm -rf ${bnom}_bull + +# End of buoy bulletin script diff --git a/ush/wave_ens_stats.sh b/ush/wave_ens_stats.sh new file mode 100755 index 0000000000..c167c608a4 --- /dev/null +++ b/ush/wave_ens_stats.sh @@ -0,0 +1,254 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_ens_stats.sh +# Script description: Create statistics (means etc) from wave ensemble data +# +# Author: Jose-Henrique Alves Org: NCEP/EMC Date: 2014-01-16 +# Abstract: Creates statistics (mean, spread, probability of exceedance) for +# the NCEP Global Wave Ensemble from grib2 member data, using wgrib2 +# and the gwes_stats executable. +# +# Script history log: +# 2014-01-16 J-Henrique Alves First Version. +# 2019-11-02 J-Henrique Alves Ported to global-workflow. 
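The per-point quantities that the `gwes_stats` executable produces from the member fields can be illustrated with a small standalone sketch (hypothetical, not part of the patch; `members` and `thresholds` are made-up sample values, with thresholds taken from the HTSGW scale used below):

```shell
#!/bin/bash
# Hypothetical sketch of the mean, spread and probability-of-exceedance
# statistics computed across ensemble members, for one grid point, using awk.

members="1.2 1.5 2.1 0.9 1.8"   # Hs (m) from five hypothetical ensemble members
thresholds="0.60 1.00 2.00"     # first three HTSGW thresholds from the ascale list

stats=$(echo $members | awk -v thr="$thresholds" '{
  n = NF
  for (i = 1; i <= n; i++) { sum += $i }
  mean = sum / n                               # ensemble mean
  for (i = 1; i <= n; i++) { ss += ($i - mean) ^ 2 }
  spread = sqrt(ss / n)                        # spread = std deviation of members
  nt = split(thr, t, " ")
  printf "%.2f %.4f", mean, spread
  for (j = 1; j <= nt; j++) {                  # P(Hs > threshold) = fraction of
    cnt = 0                                    # members exceeding each threshold
    for (i = 1; i <= n; i++) if ($i > t[j]) cnt++
    printf " %.2f", cnt / n
  }
}')
echo "$stats"   # mean spread P(>0.6) P(>1) P(>2)
```

The real workflow does this per grid point over whole GRIB2 fields (dumped to binary with `wgrib2 -bin`) inside the FORTRAN code; the arithmetic is the same.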
+# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - --+ + + ++ +# 0. Preparations +# 0.a Basic modes of operation +# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - --+ + + ++ +# + seton='-xa' + setoff='+xa' + set $seton + + para=$1 + prepar=`echo $para | rev | cut -c2- | rev` #Part prefix (assumes 1 digit index) + paridx=`echo $para | rev | cut -c-1` #Part index (assumes 1 digit index) + +# Number of grib records + ngrib=$2 + +# Number of ensemble members + nmembn=`echo ${membn} | wc -w` + +# Forecast range + fhour=$3 + + mkdir -p tmp_${para} + cd tmp_${para} + +# 0.b Set general parameter settings + + scale=' ' + case $prepar in + HTSG) ascale=(0.60 1.00 2.00 3.00 4.00 5.50 7.00 9.00) ; + scale=${ascale[@]} + npart=0 ; + nip='hs' ; + nnip=${nip} ; + parcode='10 0 3' ;; + PERP) ascale=(5.0 7.0 9.0 11.0 13.0 15.0 17.0 19.0) ; + scale=${ascale[@]} + npart=0 ; + nip='tp' ; + nnip=${nip} ; + parcode='10 0 11' ;; + DIRP) ascale='0' ; + scale=${ascale[@]} + npart=0 ; + nip='pdir' ; + nnip=${nip} ; + parcode='10 0 10' ;; + WIN) ascale=(3.60 5.65 8.74 11.31 14.39 17.48 21.07 24.67) ; + scale=${ascale[@]} + npart=0 ; + nip='wnd' ; + nnip=${nip} ; + parcode='0 2 1' ;; + WDI) ascale='0' ; + scale=${ascale[@]} + npart=0 ; + nip='wnddir' ; + nnip=${nip} ; + parcode='0 2 0' ;; + WVHG) ascale=(0.60 1.00 2.00 3.00 4.00 5.50 7.00 9.00) ; + scale=${ascale[@]} + npart=0 ; + nip='wshs' ; + nnip=${nip} ; + parcode='10 0 5' ;; + WVPE) ascale=(5.0 7.0 9.0 11.0 13.0 15.0 17.0 19.0) ; + scale=${ascale[@]} + npart=0 ; + nip='wstp' ; + nnip=${nip} ; + parcode='10 0 6' ;; + WVDI) ascale='0' ; + scale=${ascale[@]} + npart=0 ; + nip='wsdir' ; + nnip=${nip} ; + parcode='10 0 4' ;; + SWELL) ascale=(0.60 1.00 2.00 3.00 4.00 5.50 
7.00 9.00) ; + scale=${ascale[@]} + npart=1 ; + nip='hswell' ; + nnip="${nip}"$paridx ; + parcode='10 0 8' ;; + SWPER) ascale=(5.0 7.0 9.0 11.0 13.0 15.0 17.0 19.0) ; + scale=${ascale[@]} + npart=1 ; + nip='tswell' ; + nnip="${nip}"$paridx ; + parcode='10 0 9' ;; + *) ascale=${tsscale[@]} ; scale=${ascale[$plev]} ; parcode=' ' ;; + esac + +# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - --+ + + ++ +# 1. Compute mean, spread and probability of exceedence +# - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - --+ + + ++ +# +# set $seton + + rm -f mean.t${cyc}z.grib2 spread.t${cyc}z.grib2 probab.t${cyc}z.grib2 + + nmemb=${nmembn} + nmembm1=`expr ${nmemb} - 1` +# +# 1.a Create list of combined ensemble member numbers (starting from 00 = NCEP control run) +# + memb=`seq -w 0 ${nmembm1}` +# + valtime=`$NDATE ${fhour} ${YMDH}` + + mkdir -p ${valtime} + cd ${valtime} + + if [ $fhour -eq 0 ] ; then + ihr='anl' + hhh='000' + elif [ $fhour -lt 10 ] ; then + ihr=${fhour}' hour' + hhh='00'$fhour + elif [ $fhour -lt 100 ] ; then + ihr=${fhour}' hour' + hhh='0'$fhour + elif [ $fhour -ge 100 ] ; then + ihr=${fhour}' hour' + hhh=$fhour + fi +# + rm -f gwes_stats.inp data_* +# +# 1.b Loop through members + nme=0 +# while [ ${nme} -lt ${nmemb} ] + for im in $membn + do + + infile=../../${para}_${im}.t${cyc}z.grib2 + if [ "${im}" = "00" ] + then + +# 1.b.1 Generate input file for gwes_stats + echo $YMDH $hhh $nnip $parcode > gwes_stats.inp + #echo $YMDH $ngrib $dtgh $nnip $parcode > gwes_stats.inp + echo ${nmemb} >> gwes_stats.inp + echo $memb >> gwes_stats.inp + echo ${scale[@]} | wc -w >> gwes_stats.inp + echo ${scale[@]} >> gwes_stats.inp + +# 1.b.2 Get grid dimension for input grib file and pass to fortran code + nlola=`$WGRIB2 ${infile} -grid -d 1 | sed 's/(/ /g' | sed 's/)/ /g' | sed '2!d' | awk '{print $3,$5}'` + rdlon=`$WGRIB2 ${infile} -grid -d 1 | sed 's/(/ /g' | sed 's/)/ /g' | sed '4!d' | awk '{print $2,$4,$6}'` + rdlat=`$WGRIB2 
${infile} -grid -d 1 | sed 's/(/ /g' | sed 's/)/ /g' | sed '3!d' | awk '{print $2,$4,$6}'` + echo ${nlola} >> gwes_stats.inp + echo ${rdlon} >> gwes_stats.inp + echo ${rdlat} >> gwes_stats.inp + fi + +# 1.b.3 Create binary file for input to gwes_stats FORTRAN executable + $WGRIB2 $infile -vt -match ${valtime} -bin data_${im} + ok1=$? + +# 1.b.4 Check for errors + if [ $ok1 -ne 0 ] ; then + echo " *** ERROR : para=$para, im=$im, ok1=$ok1" + exit 1 + fi + echo data_$im >> gwes_stats.inp + + nme=`expr ${nme} + 1` + + done + +# +# 1.c Execute gwes_stats and create grib2 files +# + rm -f mean_out spread_out probab_out test_out +# + $EXECwave/gwes_stats < gwes_stats.inp >>$pgmout 2>&1 +# +# 1.d Check for errors and move output files to tagged grib2 parameter-hour files + if [ ! -f mean_out ] + then + msg="ABNORMAL EXIT: ERR mean_out not generated for ${nnip} $hhh." + postmsg "$jlogfile" "$msg" + set +x + echo "--- mean_out not generated for ${nnip} $hhh --- " + [[ "$LOUD" = YES ]] && set -x + echo "mean_out not generated for ${nnip} $hhh" >> $wave_log + err=1;export err;err_chk + else + mv -f mean_out ${nnip}_mean.$hhh.grib2 + fi + if [ ! -f spread_out ] + then + msg="ABNORMAL EXIT: ERR spread_out not generated for ${nnip} $hhh." + postmsg "$jlogfile" "$msg" + set +x + echo "--- spread_out not generated for ${nnip} $hhh --- " + [[ "$LOUD" = YES ]] && set -x + echo "spread_out not generated for ${nnip} $hhh" >> $wave_log + err=1;export err;err_chk + else + mv -f spread_out ${nnip}_spread.$hhh.grib2 + fi + + nscale=`echo ${ascale[@]} | wc -w` + if [ ${nscale} -gt 1 ] + then + if [ ! -f probab_out ] + then + msg="ABNORMAL EXIT: ERR probab_out not generated for ${nnip} $hhh." 
+ postmsg "$jlogfile" "$msg" + set +x + echo "--- probab_out not generated for ${nnip} $hhh --- " + [[ "$LOUD" = YES ]] && set -x + echo "probab_out not generated for ${nnip} $hhh" >> $wave_log + err=1;export err;err_chk + else + mv -f probab_out ${nnip}_probab.$hhh.grib2 + fi + fi + +# +# 2. Cleanup +# Remove binary data files + rm -f data_?? + +# +# End of wave_gwes_stats.sh diff --git a/ush/wave_grib2.sh b/ush/wave_grib2.sh new file mode 100755 index 0000000000..b91f723200 --- /dev/null +++ b/ush/wave_grib2.sh @@ -0,0 +1,225 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grib2.sh +# Script description: Create grib2 files for the wave component +# +# Author: Hendrik Tolman Org: NCEP/EMC Date: 2007-07-11 +# Abstract: Creates grib2 files from WW3 binary output +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $GRIBDATA +# postmsg "$jlogfile" "Making GRIB2 Files." # commented to reduce unnecessary output to jlogfile + + grdID=$1 + gribDIR=${grdID}_grib + rm -rfd ${gribDIR} + mkdir ${gribDIR} + err=$? 
+ if [ $err != 0 ] + then + set +x + echo ' ' + echo '******************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '******************************************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2 (Could not create temp directory)" + exit 1 + fi + + cd ${gribDIR} + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + dtgrib=$2 + ngrib=$3 + GRIDNR=$4 + MODNR=$5 + gribflags=$6 + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! Make GRIB files |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + + if [ -z "$YMDH" ] || [ -z "$cycle" ] || [ -z "$EXECwave" ] || [ -z "$EXECcode" ] || \ + [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ + [ -z "$dtgrib" ] || [ -z "$ngrib" ] || [ -z "$gribflags" ] || \ + [ -z "$GRIDNR" ] || [ -z "$MODNR" ] || [ -z "$SENDDBN" ] + then + set +x + echo ' ' + echo '***************************************************' + echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' + echo '***************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN postprocessor NOT SET" + exit 1 + fi + +# 0.c Starting time for output + + ymdh=$YMDH + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + + set +x + echo " Starting time : $tstart" + echo " Time step : $dtgrib" + echo " Number of times : $ngrib" + echo " GRIB field flags : $gribflags" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 0.e Links to working directory + + ln -s ../mod_def.$grdID mod_def.ww3 + ln -s ../out_grd.$grdID out_grd.ww3 + +# --------------------------------------------------------------------------- # +# 1. 
Generate GRIB file with all data +# 1.a Generate input file for ww3_grib2 +# Template copied in mother script ... + + set +x + echo " Generate input file for ww3_grib2" + [[ "$LOUD" = YES ]] && set -x + + sed -e "s/TIME/$tstart/g" \ + -e "s/DT/$dtgrib/g" \ + -e "s/NT/$ngrib/g" \ + -e "s/GRIDNR/$GRIDNR/g" \ + -e "s/MODNR/$MODNR/g" \ + -e "s/FLAGS/$gribflags/g" \ + ../ww3_grib2.inp.tmpl > ww3_grib.inp + +# 1.b Run GRIB packing program + + set +x + echo " Run ww3_grib2" + echo " Executing $EXECcode/ww3_grib" + [[ "$LOUD" = YES ]] && set -x + + ln -sf ../$WAV_MOD_TAG.$grdID.${cycle}.grib2 gribfile + $EXECcode/ww3_grib + err=$? + + if [ $err != 0 ] + then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' + echo '********************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2" + exit 3 + fi + +# 1.c Clean up + + rm -f ww3_grib.inp + rm -f mod_def.ww3 + rm -f out_grd.ww3 + +# 1.e Save in /com + + if [ "$SENDCOM" = 'YES' ] + then + set +x + echo " Saving GRIB file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2" + [[ "$LOUD" = YES ]] && set -x + cp -f ${DATA}/$WAV_MOD_TAG.$grdID.$cycle.grib2 $COMOUT/gridded/ + $WGRIB2 -s $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 > $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx + + if [ ! -f $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 ] + then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' + echo '********************************************* ' + echo ' ' + echo " Error in moving grib file $WAV_MOD_TAG.$grdID.$cycle.grib2 to com" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2" + exit 4 + fi + if [ ! 
-f $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx ] + then + set +x + echo ' ' + echo '*************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 INDEX FILE *** ' + echo '*************************************************** ' + echo ' ' + echo " Error in moving grib file $WAV_MOD_TAG.$grdID.$cycle.grib2idx to com" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN creating ww3_grib2 index" + exit 4 + fi + + if [ "$SENDDBN" = 'YES' ] + then + set +x + echo " Alerting GRIB file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2" + echo " Alerting GRIB index file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx" + [[ "$LOUD" = YES ]] && set -x + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2 $job $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2_WIDX $job $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx + fi + fi + + +# --------------------------------------------------------------------------- # +# 3. Clean up the directory + + set +x + echo " Removing work directory after success." + [[ "$LOUD" = YES ]] && set -x + + cd .. + mv -f ${gribDIR} done.${gribDIR} + + set +x + echo ' ' + echo "End of ww3_grib2.sh at" + date + [[ "$LOUD" = YES ]] && set -x + +# End of ww3_grib2.sh -------------------------------------------------- # diff --git a/ush/wave_grib2_cat.sh b/ush/wave_grib2_cat.sh new file mode 100755 index 0000000000..ddd3e03ccd --- /dev/null +++ b/ush/wave_grib2_cat.sh @@ -0,0 +1,188 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grib2_cat.sh +# Script description: Concatenates files from wave model component +# +# Author: Jose-Henrique Alves Org: NCEP/EMC Date: 2014-01-16 +# Abstract: Creates bulletin for NCEP Global Wave Ensemble using grib2 data. 
+# Values at buoy locations are extracted using wgrib2 bi-linear +# interpolation (-new_grid) and requires IPOLATES lib. +# +# Script history log: +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $DATA +# postmsg "$jlogfile" "Catting GRIB2 Files." # commented to reduce unnecessary output to jlogfile + + grdID=$1 + rm -rf grib_$grdID + mkdir grib_$grdID + err=$? + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '******************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN multiwavegrib2_cat (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '******************************************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN multiwavegrib2_cat (Could not create temp directory)" + exit 1 + fi + + cd grib_$grdID + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + dtgrib=$2 + ngrib=$3 + GRIDNR=$4 + MODNR=$5 + gribflags=$6 + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! 
Make GRIB files |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + + if [ -z "$YMDH" ] || [ -z "$cycle" ] || [ -z "$EXECwave" ] || [ -z "$EXECcode" ] || \ + [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ + [ -z "$SENDDBN" ] + then + set +x + echo ' ' + echo '***************************************************' + echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' + echo '***************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN postprocessor NOT SET" + exit 1 + fi + +# 0.c Starting time for output + + ymdh=$YMDH + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + + set +x + echo " Starting time : $tstart" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 0.d sync important files + +# 0.e Links to working directory + +# --------------------------------------------------------------------------- # +# 1. Generate GRIB file with all data + +# 1.b Run GRIB packing program + + + set +x + echo " Catting grib2 files ${COMOUT}/gridded/$WAV_MOD_TAG.$grdID.$cycle.f???.grib2" + [[ "$LOUD" = YES ]] && set -x + + ln -sf ../$WAV_MOD_TAG.$grdID.$cycle.grib2 gribfile + cat ${COMOUT}/gridded/$WAV_MOD_TAG.$grdID.$cycle.f???.grib2 >> gribfile + err=$? + + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '************************************************* ' + echo '*** FATAL ERROR : ERROR IN multiwavegrib2_cat *** ' + echo '************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN multiwavegrib2_cat" + exit 3 + fi + +# 1.e Save in /com + + if [ "$SENDCOM" = 'YES' ] + then + set +x + echo " Saving GRIB file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2" + [[ "$LOUD" = YES ]] && set -x + cp gribfile $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 + + if [ ! 
-f $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 ] + then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN multiwavegrib2 *** ' + echo '********************************************* ' + echo ' ' + echo " Error in moving grib file $WAV_MOD_TAG.$grdID.$cycle.grib2 to com" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN multiwavegrib2" + exit 4 + fi + + echo " Creating wgrib index of $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2" + $WGRIB2 -s $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 > $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx + + if [ "$SENDDBN" = 'YES' ] + then + set +x + echo " Alerting GRIB file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2" + echo " Alerting GRIB index file as $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx" + [[ "$LOUD" = YES ]] && set -x + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2 $job $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2 + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2_WIDX $job $COMOUT/gridded/$WAV_MOD_TAG.$grdID.$cycle.grib2.idx + fi + fi + + +# --------------------------------------------------------------------------- # +# 3. Clean up the directory + + set +x + echo " Removing work directory after success." + [[ "$LOUD" = YES ]] && set -x + + cd .. 
+ mv -f grib_$grdID done.grib_$grdID + + set +x + echo ' ' + echo "End of multiwavegrib2_cat.sh at" + date + [[ "$LOUD" = YES ]] && set -x + +# End of multiwavegrib2.sh -------------------------------------------------- # diff --git a/ush/wave_grib2_sbs.sh b/ush/wave_grib2_sbs.sh new file mode 100755 index 0000000000..3e8bef351c --- /dev/null +++ b/ush/wave_grib2_sbs.sh @@ -0,0 +1,222 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grib2_sbs.sh +# Script description: Create grib2 files for the wave component +# +# Author: Hendrik Tolman Org: NCEP/EMC Date: 2007-07-11 +# Abstract: Creates grib2 files from WW3 binary output +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $GRIBDATA +# postmsg "$jlogfile" "Making GRIB2 Files." # commented to reduce unnecessary output to jlogfile + + grdID=$1 + gribDIR=${grdID}_grib + rm -rfd ${gribDIR} + mkdir ${gribDIR} + err=$? 
+ if [ $err != 0 ] + then + set +x + echo ' ' + echo '******************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '******************************************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2 (Could not create temp directory)" + exit 1 + fi + + cd ${gribDIR} + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + GRIDNR=$2 + MODNR=$3 + ymdh=$4 + fhr=$5 + grdnam=$6 + grdres=$7 + gribflags=$8 + ngrib=1 # only one time slice + dtgrib=3600 # only one time slice +# SBS one time slice per file + FH3=$(printf %03i $fhr) + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! Make GRIB files |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + + if [ -z "$CDATE" ] || [ -z "$cycle" ] || [ -z "$EXECwave" ] || [ -z "$EXECcode" ] || \ + [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ + [ -z "$gribflags" ] || \ + [ -z "$GRIDNR" ] || [ -z "$MODNR" ] || [ -z "$SENDDBN" ] + then + set +x + echo ' ' + echo '***************************************************' + echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' + echo '***************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN postprocessor NOT SET" + exit 1 + fi + +# 0.c Starting time for output + + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + + set +x + echo " Starting time : $tstart" + echo " Time step : Single SBS" + echo " Number of times : Single SBS" + echo " GRIB field flags : $gribflags" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 0.e Links to working directory + + ln -s ${DATA}/mod_def.$grdID mod_def.ww3 + ln -s 
${DATA}/output_${ymdh}0000/out_grd.$grdID out_grd.ww3 + +# --------------------------------------------------------------------------- # +# 1. Generate GRIB file with all data +# 1.a Generate input file for ww3_grib2 +# Template copied in mother script ... + + set +x + echo " Generate input file for ww3_grib2" + [[ "$LOUD" = YES ]] && set -x + + sed -e "s/TIME/$tstart/g" \ + -e "s/DT/$dtgrib/g" \ + -e "s/NT/$ngrib/g" \ + -e "s/GRIDNR/$GRIDNR/g" \ + -e "s/MODNR/$MODNR/g" \ + -e "s/FLAGS/$gribflags/g" \ + ${DATA}/ww3_grib2.${grdID}.inp.tmpl > ww3_grib.inp + +# 1.b Run GRIB packing program + + set +x + echo " Run ww3_grib2" + echo " Executing $EXECcode/ww3_grib" + [[ "$LOUD" = YES ]] && set -x + ENSTAG="" + if [ ${waveMEMB} ]; then ENSTAG=".${membTAG}${waveMEMB}" ; fi + outfile=${WAV_MOD_TAG}.${cycle}${ENSTAG}.${grdnam}.${grdres}.f${FH3}.grib2 + $EXECcode/ww3_grib + $WGRIB2 gribfile -set_date $CDATE -set_ftime "$fhr hour fcst" -grib ${COMOUT}/gridded/${outfile} + err=$? + + if [ $err != 0 ] + then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' + echo '********************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2" + exit 3 + fi + +# Create index + $WGRIB2 -s $COMOUT/gridded/${outfile} > $COMOUT/gridded/${outfile}.idx + +# 1.e Save in /com + + if [ ! -s $COMOUT/gridded/${outfile} ] + then + set +x + echo ' ' + echo '********************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 *** ' + echo '********************************************* ' + echo ' ' + echo " Error in moving grib file ${outfile} to com" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grib2" + exit 4 + fi + if [ ! 
-s $COMOUT/gridded/${outfile}.idx ] + then + set +x + echo ' ' + echo '*************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_grib2 INDEX FILE *** ' + echo '*************************************************** ' + echo ' ' + echo " Error in moving grib file ${outfile}.idx to com" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN creating ww3_grib2 index" + exit 4 + fi + + if [ "$SENDDBN" = 'YES' ] + then + set +x + echo " Alerting GRIB file as $COMOUT/gridded/${outfile}" + echo " Alerting GRIB index file as $COMOUT/gridded/${outfile}.idx" + [[ "$LOUD" = YES ]] && set -x + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2 $job $COMOUT/gridded/${outfile} + $DBNROOT/bin/dbn_alert MODEL WAVE_GRIB_GB2_WIDX $job $COMOUT/gridded/${outfile}.idx + fi + + +# --------------------------------------------------------------------------- # +# 3. Clean up the directory + + rm -f gribfile + + set +x + echo " Removing work directory after success." + [[ "$LOUD" = YES ]] && set -x + + cd ../ + mv -f ${gribDIR} done.${gribDIR} + + set +x + echo ' ' + echo "End of ww3_grib2.sh at" + date + [[ "$LOUD" = YES ]] && set -x + +# End of ww3_grib2.sh -------------------------------------------------- # diff --git a/ush/wave_grid_interp.sh b/ush/wave_grid_interp.sh new file mode 100755 index 0000000000..bb91840212 --- /dev/null +++ b/ush/wave_grid_interp.sh @@ -0,0 +1,209 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grid_interp.sh +# Script description: Interpolate from native grids to target grid +# +# Author: Arun Chawla Org: NCEP/EMC Date: 2009-07-22 +# Abstract: Creates grib2 files from WW3 binary output +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. 
+# +# $Id$ +############################################################################### +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $DATA + + grdID=$1 + ymdh=$2 + dt=$3 + nst=$4 + postmsg "$jlogfile" "Making GRID Interpolation Files for $grdID." + rm -rf grint_${grdID}_${ymdh} + mkdir grint_${grdID}_${ymdh} + err=$? + + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '************************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grid_interp (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '************************************************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grid_interp (Could not create temp directory)" + exit 1 + fi + + cd grint_${grdID}_${ymdh} + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! 
Make GRID files |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + + if [ -z "$YMDH" ] || [ -z "$cycle" ] || [ -z "$EXECcode" ] || \ + [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ + [ -z "$SENDDBN" ] || [ -z "$waveGRD" ] + then + set +x + echo ' ' + echo '***************************************************' + echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' + echo '***************************************************' + echo ' ' + echo "$YMDH $cycle $EXECcode $COMOUT $WAV_MOD_TAG $SENDCOM $SENDDBN $waveGRD" + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN postprocessor NOT SET" + exit 1 + fi + +# 0.c Links to files + + rm -f ../out_grd.$grdID + + if [ ! -f ../${grdID}_interp.inp.tmpl ]; then + cp $FIXwave/${grdID}_interp.inp.tmpl ../. + fi + ln -sf ../${grdID}_interp.inp.tmpl . + + for ID in $waveGRD + do + ln -sf ../out_grd.$ID . + done + + for ID in $waveGRD $grdID + do + ln -sf ../mod_def.$ID . + done + +# --------------------------------------------------------------------------- # +# 1. Generate GRID file with all data +# 1.a Generate Input file + + time="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + + sed -e "s/TIME/$time/g" \ + -e "s/DT/$dt/g" \ + -e "s/NSTEPS/$nst/g" ${grdID}_interp.inp.tmpl > ww3_gint.inp + +# Check if there is an interpolation weights file available + + wht_OK='no' + if [ ! 
-f ${DATA}/WHTGRIDINT.bin.${grdID} ]; then + if [ -f $FIXwave/WHTGRIDINT.bin.${grdID} ] + then + set +x + echo ' ' + echo " Copying $FIXwave/WHTGRIDINT.bin.${grdID} " + [[ "$LOUD" = YES ]] && set -x + cp $FIXwave/WHTGRIDINT.bin.${grdID} ${DATA} + wht_OK='yes' + else + set +x + echo ' ' + echo " Not found: $FIXwave/WHTGRIDINT.bin.${grdID} " + fi + fi +# Check and link weights file + if [ -f ${DATA}/WHTGRIDINT.bin.${grdID} ] + then + ln -s ${DATA}/WHTGRIDINT.bin.${grdID} ./WHTGRIDINT.bin + fi + +# 1.b Run interpolation code + + set +x + echo " Run ww3_gint" + echo " Executing $EXECcode/ww3_gint" + [[ "$LOUD" = YES ]] && set -x + + $EXECcode/ww3_gint + err=$? + +# Write interpolation file to main TEMP dir area if not there yet + if [ "$wht_OK" = 'no' ] + then + cp -f ./WHTGRIDINT.bin ${DATA}/WHTGRIDINT.bin.${grdID} + cp -f ./WHTGRIDINT.bin ${FIXwave}/WHTGRIDINT.bin.${grdID} + fi + + + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '*************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_gint interpolation * ' + echo '*************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_gint interpolation" + exit 3 + fi + +# 1.b Clean up + + rm -f grid_interp.inp + rm -f mod_def.* + mv out_grd.$grdID ../out_grd.$grdID + +# 1.c Save in /com + + if [ "$SENDCOM" = 'YES' ] + then + set +x + echo " Saving GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.$PDY$cyc" + [[ "$LOUD" = YES ]] && set -x + cp ../out_grd.$grdID $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.$PDY$cyc + +# if [ "$SENDDBN" = 'YES' ] +# then +# set +x +# echo " Alerting GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.$PDY$cyc +# [[ "$LOUD" = YES ]] && set -x + +# +# PUT DBNET ALERT HERE .... +# + +# fi + fi + +# --------------------------------------------------------------------------- # +# 2. 
Clean up the directory + + set +x + echo " Removing work directory after success." + [[ "$LOUD" = YES ]] && set -x + + cd .. + mv -f grint_${grdID}_${ymdh} done.grint_${grdID}_${ymdh} + + set +x + echo ' ' + echo "End of ww3_interp.sh at" + date + +# End of ww3_grid_interp.sh -------------------------------------------- # diff --git a/ush/wave_grid_interp_sbs.sh b/ush/wave_grid_interp_sbs.sh new file mode 100755 index 0000000000..0f78d3f557 --- /dev/null +++ b/ush/wave_grid_interp_sbs.sh @@ -0,0 +1,217 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grid_interp_sbs.sh +# Script description: Interpolate from native grids to target grid +# +# Author: J-Henrique Alves Org: NCEP/EMC Date: 2019-11-02 +# Abstract: Creates grib2 files from WW3 binary output +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +# Requirements: +# - wgrib2 with IPOLATES library +# +################################################################################ +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $GRDIDATA + + grdID=$1 + ymdh=$2 + dt=$3 + nst=$4 + postmsg "$jlogfile" "Making GRID Interpolation Files for $grdID." + rm -rf grint_${grdID}_${ymdh} + mkdir grint_${grdID}_${ymdh} + err=$? 
+ + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '************************************************************************************* ' + echo '*** FATAL ERROR : ERROR IN ww3_grid_interp (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '************************************************************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grid_interp (Could not create temp directory)" + exit 1 + fi + + cd grint_${grdID}_${ymdh} + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! Make GRID files |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + + if [ -z "$CDATE" ] || [ -z "$cycle" ] || [ -z "$EXECcode" ] || \ + [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || [ -z "$SENDCOM" ] || \ + [ -z "$SENDDBN" ] || [ -z "$waveGRD" ] + then + set +x + echo ' ' + echo '***************************************************' + echo '*** EXPORTED VARIABLES IN postprocessor NOT SET ***' + echo '***************************************************' + echo ' ' + echo "$CDATE $cycle $EXECcode $COMOUT $WAV_MOD_TAG $SENDCOM $SENDDBN $waveGRD" + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN postprocessor NOT SET" + exit 1 + fi + +# 0.c Links to files + + rm -f ${DATA}/output_${ymdh}0000/out_grd.$grdID + + if [ ! -f ${DATA}/${grdID}_interp.inp.tmpl ]; then + cp $FIXwave/${grdID}_interp.inp.tmpl ${DATA} + fi + ln -sf ${DATA}/${grdID}_interp.inp.tmpl . + + for ID in $waveGRD + do + ln -sf ${DATA}/output_${ymdh}0000/out_grd.$ID . + done + + for ID in $waveGRD $grdID + do + ln -sf ${DATA}/mod_def.$ID . + done + +# --------------------------------------------------------------------------- # +# 1. 
Generate GRID file with all data +# 1.a Generate Input file + + time="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + + sed -e "s/TIME/$time/g" \ + -e "s/DT/$dt/g" \ + -e "s/NSTEPS/$nst/g" ${grdID}_interp.inp.tmpl > ww3_gint.inp + +# Check if there is an interpolation weights file available + + wht_OK='no' + if [ ! -f ${DATA}/WHTGRIDINT.bin.${grdID} ]; then + if [ -f $FIXwave/WHTGRIDINT.bin.${grdID} ] + then + set +x + echo ' ' + echo " Copying $FIXwave/WHTGRIDINT.bin.${grdID} " + [[ "$LOUD" = YES ]] && set -x + cp $FIXwave/WHTGRIDINT.bin.${grdID} ${DATA} + wht_OK='yes' + else + set +x + echo ' ' + echo " Not found: $FIXwave/WHTGRIDINT.bin.${grdID} " + fi + fi +# Check and link weights file + if [ -f ${DATA}/WHTGRIDINT.bin.${grdID} ] + then + ln -s ${DATA}/WHTGRIDINT.bin.${grdID} ./WHTGRIDINT.bin + fi + +# 1.b Run interpolation code + + set +x + echo " Run ww3_gint" + echo " Executing $EXECcode/ww3_gint" + [[ "$LOUD" = YES ]] && set -x + + $EXECcode/ww3_gint + err=$? + +# Write interpolation file to main TEMP dir area if not there yet + if [ "$wht_OK" = 'no' ] + then + cp -f ./WHTGRIDINT.bin ${DATA}/WHTGRIDINT.bin.${grdID} + cp -f ./WHTGRIDINT.bin ${FIXwave}/WHTGRIDINT.bin.${grdID} + fi + + + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '*************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_gint interpolation * ' + echo '*************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_gint interpolation" + exit 3 + fi + +# 1.b Clean up + + rm -f grid_interp.inp + rm -f mod_def.* + mv out_grd.$grdID ${DATA}/output_${ymdh}0000/out_grd.$grdID + +# 1.c Save in /com + + if [ "$SENDCOM" = 'YES' ] + then + set +x + echo " Saving GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE}" + [[ "$LOUD" = YES ]] && set -x + cp ${DATA}/output_${ymdh}0000/out_grd.$grdID $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE} + +# if [ 
"$SENDDBN" = 'YES' ] +# then +# set +x +# echo " Alerting GRID file as $COMOUT/rundata/$WAV_MOD_TAG.out_grd.$grdID.${CDATE}" +# [[ "$LOUD" = YES ]] && set -x + +# +# PUT DBNET ALERT HERE .... +# + +# fi + fi + +# --------------------------------------------------------------------------- # +# 2. Clean up the directory + + set +x + echo " Removing work directory after success." + [[ "$LOUD" = YES ]] && set -x + + cd ../ + mv -f grint_${grdID}_${ymdh} done.grint_${grdID}_${ymdh} + + set +x + echo ' ' + echo 'End of ww3_grid_interp.sh at' + date + +# End of ww3_grid_interp.sh -------------------------------------------- # diff --git a/ush/wave_grid_moddef.sh b/ush/wave_grid_moddef.sh new file mode 100755 index 0000000000..42976286aa --- /dev/null +++ b/ush/wave_grid_moddef.sh @@ -0,0 +1,136 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_grid_moddef.sh +# Script description: Creates model definition files for the wave component +# +# Author: J-Henrique Alves Org: NCEP/EMC Date: 2011-04-08 +# Abstract: Creates model definition files for the wave model WW3 +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# # +############################################################################### +# +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + postmsg "$jlogfile" "Generating mod_def file" + + mkdir -p moddef_${1} + cd moddef_${1} + + grdID=$1 + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! 
Generate moddef file |' + echo '+--------------------------------+' + echo " Grid : $1" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 0.b Check if grid set + + if [ "$#" -lt '1' ] + then + set +x + echo ' ' + echo '**************************************************' + echo '*** Grid not identified in ww3_mod_def.sh ***' + echo '**************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "GRID IN ww3_mod_def.sh NOT SET" + exit 1 + else + grdID=$1 + fi + +# 0.c Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + if [ -z "$grdID" ] || [ -z "$EXECcode" ] || [ -z "$wave_sys_ver" ] + then + set +x + echo ' ' + echo '*********************************************************' + echo '*** EXPORTED VARIABLES IN ww3_mod_def.sh NOT SET ***' + echo '*********************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN ww3_mod_def.sh NOT SET" + exit 2 + fi + +# --------------------------------------------------------------------------- # +# 2. Create mod_def file + + set +x + echo ' ' + echo ' Creating mod_def file ...' + echo " Executing $EXECcode/ww3_grid" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + + rm -f ww3_grid.inp + ln -sf ../ww3_grid.inp.$grdID ww3_grid.inp + + $EXECcode/ww3_grid + err=$? 
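Aside: the scripts in this patch repeatedly build WW3 input files by substituting placeholders such as TIME and DT in a `*.inp.tmpl` file with sed (as `ww3_gint.inp` is rendered earlier in this patch). A minimal, self-contained sketch of that idiom; `render_inp` and the file names are illustrative, not part of the workflow:

```shell
# Hedged sketch of the sed template-rendering pattern used throughout these
# wave scripts: replace placeholder tokens in a template to produce the
# runtime input file for a WW3 executable.
render_inp() {
  tmpl=$1; out=$2; time=$3; dt=$4
  sed -e "s/TIME/$time/g" \
      -e "s/DT/$dt/g" "$tmpl" > "$out"
}
```

The same pattern extends to any number of `-e "s/TOKEN/value/g"` pairs, as the NSTEPS, POINT, and REFT substitutions in these scripts show.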
+ + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_grid *** ' + echo '******************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_grid" + exit 3 + fi + + if [ -f mod_def.ww3 ] + then + cp mod_def.ww3 $COMOUT/rundata/${COMPONENTwave}.mod_def.${grdID} + mv mod_def.ww3 ../mod_def.$grdID + else + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : MOD DEF FILE NOT FOUND *** ' + echo '******************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : Mod def File creation FAILED" + exit 4 + fi + +# --------------------------------------------------------------------------- # +# 3. Clean up + + cd .. + #rm -rf moddef_$grdID + + set +x + echo ' ' + echo 'End of ww3_mod_def.sh at' + date + +# End of ww3_mod_def.sh ------------------------------------------------- # diff --git a/ush/wave_outp_spec.sh b/ush/wave_outp_spec.sh new file mode 100755 index 0000000000..9cbcf6e9a6 --- /dev/null +++ b/ush/wave_outp_spec.sh @@ -0,0 +1,260 @@ +#!/bin/bash +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_outp_spec.sh +# Script description: Generates ASCII data files with the wave spectral data +# +# Author: Hendrik Tolman Org: NCEP/EMC Date: 2007-03-17 +# Abstract: Creates ASCII spectral data files from WW3 binary point output +# +# Script history log: +# 2019-11-02 J-Henrique Alves Ported to global-workflow. +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +################################################################################ +# --------------------------------------------------------------------------- # +# 0. 
Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + bloc=$1 + ymdh=$2 + specdir=$3 + + YMDHE=`$NDATE $FHMAX_WAV $CDATE` + + cd $SPECDATA + + rm -rf ${specdir}_${bloc} + mkdir ${specdir}_${bloc} + err=$? + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '****************************************************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_outp_spec (COULD NOT CREATE TEMP DIRECTORY) *** ' + echo '****************************************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_outp_spec (Could not create temp directory)" + exit 1 + fi + + cd ${specdir}_${bloc} + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! 
Make spectral file |' + echo '+--------------------------------+' + echo " Model ID : $WAV_MOD_TAG" + [[ "$LOUD" = YES ]] && set -x + +# 0.b Check if buoy location set + + if [ "$#" -lt '1' ] + then + set +x + echo ' ' + echo '***********************************************' + echo '*** LOCATION ID IN ww3_outp_spec.sh NOT SET ***' + echo '***********************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "LOCATION ID IN ww3_outp_spec.sh NOT SET" + exit 1 + else + buoy=$bloc + grep $buoy ${DATA}/buoy_log.ww3 > tmp_list.loc + while read line + do + buoy_name=`echo $line | awk '{print $2}'` + if [ $buoy = $buoy_name ] + then + point=`echo $line | awk '{ print $1 }'` + set +x + echo " Location ID/# : $buoy (${point})" + echo " Spectral output start time : $ymdh " + echo ' ' + [[ "$LOUD" = YES ]] && set -x + break + fi + done < tmp_list.loc + if [ -z "$point" ] + then + set +x + echo '******************************************************' + echo '*** LOCATION ID IN ww3_outp_spec.sh NOT RECOGNIZED ***' + echo '******************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "LOCATION ID IN ww3_outp_spec.sh NOT RECOGNIZED" + exit 2 + fi + fi + + +# 0.c Define directories and the search path. +# The tested variables should be exported by the postprocessor script. 
+ + if [ -z "$CDATE" ] || [ -z "$dtspec" ] || [ -z "$EXECcode" ] || \ + [ -z "$WAV_MOD_TAG" ] || [ -z "${STA_DIR}" ] + then + set +x + echo ' ' + echo '******************************************************' + echo '*** EXPORTED VARIABLES IN ww3_outp_spec.sh NOT SET ***' + echo '******************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN ww3_outp_spec.sh NOT SET" + exit 3 + fi + +# 0.d Starting time for output + + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + YMD="`echo $ymdh | cut -c1-8`" + HMS="`echo $ymdh | cut -c9-10`0000" + set +x + echo " Output starts at $tstart." + echo ' ' + [[ "$LOUD" = YES ]] && set -x + +# 0.e sync important files + +# $FSYNC ${DATA}/mod_def.${waveuoutpGRD} +# $FSYNC ${DATA}/out_pnt.${waveuoutpGRD} +# $FSYNC ${DATA}/ww3_outp_spec.inp.tmpl + +# 0.f Links to mother directory + + ln -s ${DATA}/mod_def.${waveuoutpGRD} ./mod_def.ww3 + ln -s ${DATA}/output_${ymdh}0000/out_pnt.${waveuoutpGRD} ./out_pnt.ww3 + +# --------------------------------------------------------------------------- # +# 2. Generate spectral data file +# 2.a Input file for postprocessor + + set +x + echo " Generate input file for ww3_outp." 
+ [[ "$LOUD" = YES ]] && set -x + + if [ "$specdir" = "bull" ] + then + tstart="`echo $ymdh | cut -c1-8` `echo $ymdh | cut -c9-10`0000" + truntime="`echo $CDATE | cut -c1-8` `echo $CDATE | cut -c9-10`0000" + sed -e "s/TIME/$tstart/g" \ + -e "s/DT/$dtspec/g" \ + -e "s/POINT/$point/g" \ + -e "s/REFT/$truntime/g" \ + ${DATA}/ww3_outp_bull.inp.tmpl > ww3_outp.inp + outfile=${buoy}.bull + coutfile=${buoy}.cbull + else + sed -e "s/TIME/$tstart/g" \ + -e "s/DT/$dtspec/g" \ + -e "s/POINT/$point/g" \ + -e "s/ITYPE/1/g" \ + -e "s/FORMAT/F/g" \ + ${DATA}/ww3_outp_spec.inp.tmpl > ww3_outp.inp + outfile=ww3.`echo $tstart | cut -c3-8``echo $tstart | cut -c10-11`.spc + fi + +# 2.b Run the postprocessor + + set +x + echo " Executing $EXECcode/ww3_outp" + [[ "$LOUD" = YES ]] && set -x + + $EXECcode/ww3_outp + err=$? + + if [ "$err" != '0' ] + then + set +x + echo ' ' + echo '******************************************** ' + echo '*** FATAL ERROR : ERROR IN ww3_outp *** ' + echo '******************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : ERROR IN ww3_outp" + exit 4 + fi + +# --------------------------------------------------------------------------- # +# 3. 
Clean up +# 3.a Move data to directory for station ascii files + + if [ -f $outfile ] + then + if [ "${ymdh}" = "${CDATE}" ] + then + if [ "$specdir" = "bull" ] + then + cat $outfile | sed -e '9,$d' >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.bull + cat $coutfile | sed -e '8,$d' >> ${STA_DIR}/c${specdir}/$WAV_MOD_TAG.$buoy.cbull + else + #cat $outfile | sed -e '15,$d' >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.spec + cat $outfile >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.spec + fi + elif [ "${ymdh}" = "${YMDHE}" ] + then + if [ "$specdir" = "bull" ] + then + cat $outfile | sed -e '1,7d' >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.bull + cat $coutfile | sed -e '1,6d' >> ${STA_DIR}/c${specdir}/$WAV_MOD_TAG.$buoy.cbull + else + cat $outfile | sed -n "/^${YMD} ${HMS}$/,\$p" >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.spec + fi + else + if [ "$specdir" = "bull" ] + then + cat $outfile | sed -e '1,7d' | sed -e '2,$d' >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.bull + cat $coutfile | sed -e '1,6d' | sed -e '2,$d' >> ${STA_DIR}/c${specdir}/$WAV_MOD_TAG.$buoy.cbull + else + cat $outfile | sed -n "/^${YMD} ${HMS}$/,\$p" >> ${STA_DIR}/${specdir}/$WAV_MOD_TAG.$buoy.spec + fi + fi + else + set +x + echo ' ' + echo '***************************************************************** ' + echo "*** FATAL ERROR : OUTPUT DATA FILE FOR BUOY $buoy NOT FOUND *** " + echo '***************************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : OUTPUT DATA FILE FOR BUOY $buoy NOT FOUND" + exit 5 + fi + +# 3.b Clean up the rest + +# rm -f ww3_outp.inp +# rm -f mod_def.ww3 out_pnt.ww3 + + cd .. 
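Aside: nearly every step in these scripts follows the same shape: run an executable, capture `$?` into `err`, and on failure print a fatal-error banner and exit with a step-specific code. A condensed sketch of that idiom; `check_err` is a hypothetical helper, not an actual workflow function:

```shell
# Illustrative helper mirroring the repeated error-check idiom in these
# wave scripts: test the saved status, print a fatal-error banner, and
# exit with the step-specific code.
check_err() {
  err=$1; code=$2; msg=$3
  if [ "$err" != '0' ]
  then
    echo "*** FATAL ERROR : $msg ***"
    exit "$code"
  fi
}
```

Distinct exit codes per step (1 for setup, 3 for the executable, 4 for the post step, and so on) let the calling J-job report which stage failed.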
+ mv -f ${specdir}_${buoy} done.${specdir}_${buoy} + + set +x + echo ' ' + echo 'End of ww3_outp_spec.sh at' + date + +# End of ww3_outp_spec.sh ---------------------------------------------------- # diff --git a/ush/wave_prnc_cur.sh b/ush/wave_prnc_cur.sh new file mode 100755 index 0000000000..1a6dd3608f --- /dev/null +++ b/ush/wave_prnc_cur.sh @@ -0,0 +1,75 @@ +#!/bin/sh +# +################################################################################ +# +# UNIX Script Documentation Block +# Script name: wave_prnc_cur.sh +# Script description: Acquires current data and generates binary input for WW3 +# +# Author: J.-Henrique Alves Org: NCEP/EMC Date: 2019-11-06 +# Abstract: Creates current binary data for forcing WW3 +# +# Script history log: +# +# $Id$ +# +# Attributes: +# Language: Bourne-again (BASH) shell +# Machine: WCOSS-DELL-P3 +# +################################################################################ +# +set -x + +ymdh_rtofs=$1 +curfile=$2 + +# Timing has to be made relative to the single 00z RTOFS cycle for that PDY + +mkdir -p rtofs_${ymdh_rtofs} +cd rtofs_${ymdh_rtofs} + +ncks -x -v sst,sss,layer_density $curfile cur_uv_${PDY}_${fext}${fhr}.nc +ncks -O -a -h -x -v Layer cur_uv_${PDY}_${fext}${fhr}.nc cur_temp1.nc +ncwa -h -O -a Layer cur_temp1.nc cur_temp2.nc +ncrename -h -O -v MT,time cur_temp2.nc +ncrename -h -O -d MT,time cur_temp2.nc +ncks -v u_velocity,v_velocity cur_temp2.nc cur_temp3.nc +mv -f cur_temp3.nc cur_uv_${PDY}_${fext}${fhr}_flat.nc + +# Convert to regular lat lon file + +cp ${FIXwave}/weights_rtofs_to_r4320x2160.nc ./weights.nc + +# Interpolate to regular 5 min grid +$CDO remap,r4320x2160,weights.nc cur_uv_${PDY}_${fext}${fhr}_flat.nc cur_5min_01.nc + +# Perform 9-point smoothing twice to make RTOFS data less noisy when +# interpolating from 1/12 deg RTOFS grid to 1/6 deg wave grid +if [ "$WAV_CUR_CDO_SMOOTH" = "YES" ]; then + $CDO -f nc -smooth9 cur_5min_01.nc cur_5min_02.nc + $CDO -f nc -smooth9 cur_5min_02.nc 
cur_glo_uv_${PDY}_${fext}${fhr}_5min.nc +else + mv cur_5min_01.nc cur_glo_uv_${PDY}_${fext}${fhr}_5min.nc +fi + +# Cleanup +rm -f cur_temp[123].nc cur_5min_??.nc cur_glo_uv_${PDY}_${fext}${fhr}.nc weights.nc + +if [ ${fhr_wave} -gt ${WAVHINDH} ] +then + sed -e "s/HDRFL/F/g" ${FIXwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp +else + sed -e "s/HDRFL/T/g" ${FIXwave}/ww3_prnc.cur.${WAVECUR_FID}.inp.tmpl > ww3_prnc.inp +fi + +rm -f cur.nc +ln -s cur_glo_uv_${PDY}_${fext}${fhr}_5min.nc cur.nc +ln -s ${DATA}/mod_def.rtofs_5m ./mod_def.ww3 + +$EXECcode/ww3_prnc + +mv -f current.ww3 ${DATA}/${WAVECUR_FID}.${ymdh_rtofs} + +cd ${DATA} + diff --git a/ush/wave_prnc_ice.sh b/ush/wave_prnc_ice.sh new file mode 100755 index 0000000000..328b8c6728 --- /dev/null +++ b/ush/wave_prnc_ice.sh @@ -0,0 +1,198 @@ +#!/bin/sh +############################################################################### +# # +# This script preprocesses ice fields for the ocean wave models. # +# It is run as a child script by the corresponding preprocessing script. # +# # +# Remarks : # +# - This script runs in the work directory designated in the mother script in # +# which it generates its own sub-directory 'ice'. # +# - Because this script is not essential for the running of the wave model # +# (as long as it runs every now and then) the error exit codes are set to # +# 0. The main program script will then not find the file ice.ww3 and send # +# a message to the wave.log file. # +# - See section 0.b for variables that need to be set. # +# # +# Update record : # +# # +# - Origination: Hendrik Tolman 01-Mar-2007 # +# # +# Update log # +# Nov2019 JHAlves - Merging wave scripts to global workflow # +# # +############################################################################### +# +# --------------------------------------------------------------------------- # +# 0. 
Preparations +# 0.a Basic modes of operation + + cd $DATA + seton='-xa' + setoff='+xa' + set $seton + + rm -rf ice + mkdir ice + cd ice + ln -s ${DATA}/postmsg . + +# 0.b Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + set $setoff + echo ' ' + echo '+--------------------------------+' + echo '! Make ice fields |' + echo '+--------------------------------+' + echo " Model TAG : $WAV_MOD_TAG" + echo " Model ID : $COMPONENTwave" + echo " Ice grid ID : $WAVEICE_FID" + echo " Ice file : $WAVICEFILE" + echo ' ' + set $seton + postmsg "$jlogfile" "Making ice fields." + + if [ -z "$YMDH" ] || [ -z "$cycle" ] || \ + [ -z "$COMOUT" ] || [ -z "$FIXwave" ] || [ -z "$EXECcode" ] || \ + [ -z "$WAV_MOD_TAG" ] || [ -z "$WAVEICE_FID" ] || [ -z "$SENDCOM" ] || \ + [ -z "$COMIN_WAV_ICE" ] || [ -z "$COMPONENTwave" ] + then + set $setoff + echo ' ' + echo '**************************************************' + echo '*** EXPORTED VARIABLES IN preprocessor NOT SET ***' + echo '**************************************************' + echo ' ' + set $seton + postmsg "$jlogfile" "NON-FATAL ERROR - EXPORTED VARIABLES IN preprocessor NOT SET" + exit 0 + fi + +# 0.c Links to working directory + + ln -s ${DATA}/mod_def.$WAVEICE_FID mod_def.ww3 + +# --------------------------------------------------------------------------- # +# 1. Get the necessary files +# 1.a Copy the ice data file + + file=${COMIN_WAV_ICE}/${WAVICEFILE} + + if [ -f $file ] + then + cp $file ice.grib + fi + + if [ -f ice.grib ] + then + set $setoff + echo " ice.grib copied ($file)." + set $seton + else + set $setoff + echo ' ' + echo '************************************** ' + echo "*** FATAL ERROR: NO ICE FILE $file *** " + echo '************************************** ' + echo ' ' + set $seton + postmsg "$jlogfile" "FATAL ERROR - NO ICE FILE (GFS GRIB)" + exit 0 + fi + +# --------------------------------------------------------------------------- # +# 2. 
Process the GRIB packed ice file +# 2.a Unpack data + + set $setoff + echo ' Extracting data from ice.grib ...' + set $seton + + $WGRIB2 ice.grib -netcdf icean_5m.nc > wgrib.out 2>&1 + + + err=$? + + if [ "$err" != '0' ] + then + cat wgrib.out + set $setoff + echo ' ' + echo '**************************************** ' + echo '*** ERROR IN UNPACKING GRIB ICE FILE *** ' + echo '**************************************** ' + echo ' ' + set $seton + postmsg "$jlogfile" "ERROR IN UNPACKING GRIB ICE FILE." + exit 0 + fi + + rm -f wgrib.out + rm -f ice.grib + rm -f ice.index + + +# 2.d Run through preprocessor wave_prep + + set $setoff + echo ' Run through preprocessor ...' + echo ' ' + set $seton + + cp -f ${DATA}/ww3_prnc.ice.$WAVEICE_FID.inp.tmpl ww3_prnc.inp + + $EXECcode/ww3_prnc > wave_prnc.out + err=$? + + if [ "$err" != '0' ] + then + cat wave_prnc.out + set $setoff + echo ' ' + echo '************************* ' + echo '*** ERROR IN waveprep *** ' + echo '************************* ' + echo ' ' + set $seton + postmsg "$jlogfile" "NON-FATAL ERROR IN waveprep." + exit 0 + fi + + rm -f wave_prnc.out ww3_prnc.inp ice.raw mod_def.ww3 + +# --------------------------------------------------------------------------- # +# 3. Save the ice file +# +# Ice file name will have ensemble member number if WW3ATMIENS=T +# and only WAV_MOD_ID if WW3ATMIENS=F +# + if [ "${WW3ATMIENS}" = "T" ] + then + icefile=${WAV_MOD_TAG}.${WAVEICE_FID}.$cycle.ice + elif [ "${WW3ATMIENS}" = "F" ] + then + icefile=${COMPONENTwave}.${WAVEICE_FID}.$cycle.ice + fi + + set $setoff + echo " Saving ice.ww3 as $COMOUT/rundata/${icefile}" + set $seton + cp ice.ww3 $COMOUT/rundata/${icefile} + rm -f ice.ww3 + +# --------------------------------------------------------------------------- # +# 4. Clean up the directory + + set $setoff + echo " Removing work directory after success." + set $seton + + cd .. 
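Aside: section 3 of the ice script picks the output file name from `WW3ATMIENS`, so ensemble members (tagged via `WAV_MOD_TAG`) and the deterministic run (named by `COMPONENTwave`) land in distinct files. The rule can be sketched as a small function; `ice_filename` and the sample values in the comments are illustrative, not workflow code:

```shell
# Hypothetical sketch of the ice-file naming rule: with WW3ATMIENS=T the
# name carries the member tag (WAV_MOD_TAG), otherwise the component
# name (COMPONENTwave) is used.
ice_filename() {
  atmiens=$1; modtag=$2; component=$3; fid=$4; cyc=$5
  if [ "$atmiens" = "T" ]
  then
    echo "${modtag}.${fid}.${cyc}.ice"
  else
    echo "${component}.${fid}.${cyc}.ice"
  fi
}
```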
+ rm -rf ice + + set $setoff + echo ' ' + echo 'End of waveice.sh at' + date + +# End of waveice.sh --------------------------------------------------------- # diff --git a/ush/wave_tar.sh b/ush/wave_tar.sh new file mode 100755 index 0000000000..edeb1994d9 --- /dev/null +++ b/ush/wave_tar.sh @@ -0,0 +1,231 @@ +#!/bin/bash +############################################################################### +# # +# This script tars the spectral or bulletin files into a single file and # +# puts it into /com. This is a separate script to enable it to be run in # +# parallel using poe. It also tars the spectral and bulletin files of the # +# old grids that are generated for backward compatibility # +# # +# Remarks : # +# - Shell script variables controlling time, directories etc. are set in the # +# mother script. # +# - This script runs in the work directory designated in the mother script. # +# Under this directory it generates a work directory TAR_$type_$ID which is # +# removed if this script exits normally. # +# - See section 0.c for variables that need to be set. # +# # +# Origination: Hendrik Tolman March 13, 2007 # +# Update log # +# Nov2019 JHAlves - Merging wave scripts to global workflow # +# # +############################################################################### +# +# --------------------------------------------------------------------------- # +# 0. Preparations +# 0.a Basic modes of operation + + # set execution trace prompt. ${0##*/} adds the script's basename + PS4=" \${SECONDS} ${0##*/} L\${LINENO} + " + set -x + + # Use LOUD variable to turn on/off trace. Defaults to YES (on). + export LOUD=${LOUD:-YES}; [[ $LOUD = yes ]] && export LOUD=YES + [[ "$LOUD" != YES ]] && set +x + + cd $DATA + postmsg "$jlogfile" "Making TAR FILE" + + + set +x + echo ' ' + echo '+--------------------------------+' + echo '! 
Make tar file |' + echo '+--------------------------------+' + echo " ID : $1" + echo " Type : $2" + echo " Number of files : $3" + [[ "$LOUD" = YES ]] && set -x + + +# 0.b Check if type set + + if [ "$#" -lt '3' ] + then + set +x + echo ' ' + echo '********************************************' + echo '*** VARIABLES IN ww3_tar.sh NOT SET ***' + echo '********************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "TYPE IN ww3_tar.sh NOT SET" + exit 1 + else + ID=$1 + type=$2 + nb=$3 + fi + + filext=$type + if [ "$type" = "ibp" ]; then filext='spec'; fi + + rm -rf TAR_${filext}_$ID + mkdir TAR_${filext}_$ID +# this directory is used only for error capturing + +# 0.c Define directories and the search path. +# The tested variables should be exported by the postprocessor script. + + if [ -z "$cycle" ] || [ -z "$COMOUT" ] || [ -z "$WAV_MOD_TAG" ] || \ + [ -z "$SENDCOM" ] || [ -z "$SENDDBN" ] || [ -z "${STA_DIR}" ] + then + set +x + echo ' ' + echo '*****************************************************' + echo '*** EXPORTED VARIABLES IN ww3_tar.sh NOT SET ***' + echo '*****************************************************' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "EXPORTED VARIABLES IN ww3_tar.sh NOT SET" + exit 2 + fi + + cd ${STA_DIR}/${type} + +# --------------------------------------------------------------------------- # +# 2. Generate tar file (spectral files are compressed) + + set +x + echo ' ' + echo ' Making tar file ...' + + count=0 + countMAX=5 + tardone='no' + + while [ "$count" -lt "$countMAX" ] && [ "$tardone" = 'no' ] + do + + [[ "$LOUD" = YES ]] && set -v + # JY nf=`ls $ID.*.$type | wc -l | awk '{ print $1 }'` + nf=`ls | awk '/'$ID.*.$filext'/ {a++} END {print a}'` + if [ "$nf" = "$nb" ] + then + tar -cf $ID.$cycle.${type}_tar ./$ID.*.$filext + exit=$? 
+ set +v; [[ "$LOUD" = YES ]] && set -x + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '***************************************** ' + echo '*** FATAL ERROR : TAR CREATION FAILED *** ' + echo '***************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : TAR CREATION FAILED" + exit 3 + fi + + if [ -f "$ID.$cycle.${type}_tar" ] + then + tardone='yes' + fi + else + set +x + echo ' All files not found for tar. Sleeping 10 seconds and trying again ..' + [[ "$LOUD" = YES ]] && set -x + sleep 10 + count=`expr $count + 1` + fi + + done + + if [ "$tardone" = 'no' ] + then + set +x + echo ' ' + echo '***************************************** ' + echo '*** FATAL ERROR : TAR CREATION FAILED *** ' + echo '***************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : TAR CREATION FAILED" + exit 3 + fi + + if [ "$filext" = 'spec' ] + then + if [ -s $ID.$cycle.${type}_tar ] + then + file_name=$ID.$cycle.${type}_tar.gz + /usr/bin/gzip -c $ID.$cycle.${type}_tar > ${file_name} + exit=$? + + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '***************************************************** ' + echo '*** FATAL ERROR : SPECTRAL TAR COMPRESSION FAILED *** ' + echo '***************************************************** ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : SPECTRAL TAR COMPRESSION FAILED" + exit 4 + fi + fi + else + file_name=$ID.$cycle.${type}_tar + fi + +# --------------------------------------------------------------------------- # +# 3. Move data to /com + + set +x + echo ' ' + echo " Moving tar file ${file_name} to $COMOUT ..." + [[ "$LOUD" = YES ]] && set -x + + cp ${file_name} $COMOUT/station/. + + exit=$? 
+ + if [ "$exit" != '0' ] + then + set +x + echo ' ' + echo '************************************* ' + echo '*** FATAL ERROR : TAR COPY FAILED *** ' + echo '************************************* ' + echo ' ' + [[ "$LOUD" = YES ]] && set -x + postmsg "$jlogfile" "FATAL ERROR : TAR COPY FAILED" + exit 4 + fi + + if [ "$SENDDBN" = 'YES' ] + then + set +x + echo ' ' + echo " Alerting TAR file as $COMOUT/station/${file_name}" + echo ' ' + [[ "$LOUD" = YES ]] && set -x + $DBNROOT/bin/dbn_alert MODEL OMBWAVE $job $COMOUT/station/${file_name} + fi + +# --------------------------------------------------------------------------- # +# 4. Final clean up + + cd $DATA + + set +x; [[ "$LOUD" = YES ]] && set -v + rm -rf ${STA_DIR}/${type} + set +v + + echo ' ' + echo 'End of ww3_tar.sh at' + date + +# End of ww3_tar.sh ----------------------------------------------------- #
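Aside: the tar step in wave_tar.sh polls until the expected number of station files exists before archiving, retrying a bounded number of times. A self-contained sketch of that wait-and-tar pattern; `wait_and_tar` is a hypothetical helper and the 1 s sleep shortens the real script's 10 s interval:

```shell
# Hedged sketch of wave_tar.sh's retry loop: count the files matching
# $ID.*.$filext, tar them once all $nb are present, and give up after
# countMAX attempts (the script exits 3 in that case).
wait_and_tar() {
  ID=$1; filext=$2; nb=$3; tarfile=$4
  count=0
  countMAX=5
  while [ "$count" -lt "$countMAX" ]
  do
    # Count matching files; 0 when none exist yet
    nf=$(ls ${ID}.*.${filext} 2>/dev/null | wc -l)
    if [ "$nf" -ge "$nb" ]
    then
      tar -cf "$tarfile" ${ID}.*.${filext}
      return $?
    fi
    sleep 1              # the real script sleeps 10 seconds between checks
    count=$((count + 1))
  done
  return 3               # files never all appeared
}
```

Bounding the retries keeps a missing station file from hanging the poe task indefinitely; the caller still gets a distinct failure status to report through postmsg.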