ufs-community / regional_workflow
THIS REPOSITORY IS NOW DEPRECATED; SEE UFS SRW APP FOR CURRENT CODE
Home Page: https://github.com/ufs-community/ufs-srweather-app
License: Other
For each available CCPP physics suite, clean up the forecast model's template namelist files to make them easier to compare (e.g. with xxdiff): make the indentation consistent, etc.
Import the regional_grid code into the build system; the corresponding source code will be included in a branch of UFS_UTILS. Make sure it supports theia, wcoss_dell_p3, and wcoss_cray.
Hera will be the replacement for Theia; Theia is going away at the end of September. This port should not require a large amount of work, as the system architecture of Hera is apparently quite similar to that of Theia. In addition, Jim Abeles has already run several tests with the regional FV3 on Hera.
HAFS workflow has a script which effectively generalizes the parallel run commands, which enhances portability and eases scripting.
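As a minimal sketch of what such a generalized run-command script could look like (the function name and the platform-to-launcher mapping below are illustrative assumptions, not the actual HAFS code):

```shell
#!/bin/sh
# Hypothetical sketch of a generalized parallel-run wrapper in the spirit of
# the HAFS script mentioned above; names and mappings are assumptions.
select_mpi_launcher() {
  # Map a platform name to its MPI launch command prefix.
  case "$1" in
    hera|theia)     echo "srun" ;;
    wcoss_cray)     echo "aprun -n \$NPES" ;;
    wcoss_dell_p3)  echo "mpirun -np \$NPES" ;;
    *)              echo "mpiexec" ;;
  esac
}

# Run scripts can then launch the model uniformly, e.g.:
#   APRUN=$(select_mpi_launcher "$MACHINE")
#   $APRUN ./fv3.exe
APRUN=$(select_mpi_launcher hera)
echo "$APRUN"
```

Centralizing this mapping means porting to a new machine touches one function rather than every run script.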
A fully cycled, FV3SAR ensemble-based hybrid GSI analysis and EnKF system has been developed and evaluated. This issue is to merge the modified and additional scripts into the current regional_workflow, following the latter's format/standards where possible.
Identify what is one-to-one and can be merged, what is unique to one or the other and should remain separate, and how the rest could be generalized to work for either.
If you would like to build the html and pdf versions of the documentation located in doc/user_guide, you (or your sys admin) will need to install Sphinx on your desktop/laptop from:
http://www.sphinx-doc.org/en/master/usage/installation.html
You don't need to run sphinx-quickstart, since those files have already been created and are in the doc/user_guide directory. Once Sphinx is installed, you can cd to doc/user_guide and type 'make html'.
Begin the process of introducing workflow documentation to the Wiki page. May also wish to consider adding some GitHub notes on obtaining/forking/pull requests/code management.
Add some generality for handling multiple domains (e.g., CONUS, Hawaii, Alaska, ...)
[ I have a non-GitHub branch that adds this functionality - just kicking the tires on GitHub before eventually committing my branch here. ]
For post-processing purposes, the HAFS-DA workflow generates the UPP flat file from an XML control file, so that community users can change which variables UPP dumps to GRIB2 by editing the XML rather than editing the flat file directly.
export GESROOT=${mainroot}/${comdir}/${USER}/com
should be
export GESROOT=${mainroot}/${comdir}/${USER}/nwges
Sam Trahan is the first user we've had who's using a bash login shell, and the experiment generation script(s) are apparently failing for him. Try this and make necessary fixes.
Gerard and I talked about the potential to point the run shell script to the EMC_post module file, thus avoiding this problem in the future altogether, and allowing us to have a single module file for the build and execution.
Add capability to run SAR workflow components outside of ROCOTO (as stand-alone shell scripts).
Take a look at DG being put together for the UFS v0.1 release and update it as needed for the SAR workflow repository
Create a branch in regional_workflow to update paths to scripts. Once fully tested and everyone is aware, move to community_develop branch.
NRST is currently set in exfv3cam_sar_fcst.sh. That is pretty deep in the system for a user. Could it be set in the J-job script for NCO but overridden above that in a config script of some sort by users who want to change it?
Choice of nodes in exfv3cam_sar_fcst.sh needs to be moved out of the run script and into an initial configure script (e.g., config.sh in the community workflow). We need to ensure that the checks in setup.sh (in the community workflow) can ensure that the choice of nodes will be compatible with PE_MEMBER01, layout_x, layout_y, etc.
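A minimal sketch of the default-plus-override pattern discussed above: the J-job supplies NCO defaults, and a user config sourced beforehand can overwrite them. The variable names follow the issue text; the default values and WRTCMP_TASKS (a write-component task count) are illustrative assumptions.

```shell
#!/bin/sh
# Keep each NCO default only if the user has not already set the variable.
NRST=${NRST:-12}                # restart interval (illustrative default)
LAYOUT_X=${LAYOUT_X:-20}
LAYOUT_Y=${LAYOUT_Y:-20}
WRTCMP_TASKS=${WRTCMP_TASKS:-0} # assumed name for write-component tasks

# Consistency check of the kind setup.sh would need: the member PE count
# must match the MPI layout plus any write-component tasks.
PE_MEMBER01=$(( LAYOUT_X * LAYOUT_Y + WRTCMP_TASKS ))
echo "NRST=$NRST PE_MEMBER01=$PE_MEMBER01"
```

With this pattern, a user config that exports LAYOUT_X/LAYOUT_Y before the J-job runs changes the node count without touching the run script itself.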
This involves minor changes to the NOAA-EMC/regional_workflow repo. New modulefiles for Cheyenne will be needed for the VLAB UFS_UTILS repo. The NCEP gfsio library needs to be added to the NCAR/NCEPlibs github repo, and these libraries will have to be built on Cheyenne to include the additional libraries used by the utilities.
Fix the way the launch script (ush/launch_FV3LAM_wflow.sh) deals with multiple cycles so that it doesn't set the workflow status to "SUCCESS" after only one cycle has completed successfully.
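A sketch of the multi-cycle completion check, assuming rocotostat-style output with the cycle in column 1 and the task state in column 4 (the sample text below stands in for the real rocotostat call, and the column positions are assumptions to verify):

```shell
#!/bin/sh
# Sample stand-in for `rocotostat -w <xml> -d <db>` output.
stat_out='201907010000  make_grid  101  SUCCEEDED  0  1  5.0
201907010000  run_fcst   102  SUCCEEDED  0  1  9.0
201907020000  make_grid  103  RUNNING    -  1  -'

# A cycle counts as done only when none of its tasks is in a
# non-SUCCEEDED state; report SUCCESS only when every cycle is done.
wflow_status=$(printf '%s\n' "$stat_out" | awk '
  { all[$1] = 1; if ($4 != "SUCCEEDED") bad[$1] = 1 }
  END {
    done = 0; total = 0
    for (c in all) { total++; if (!(c in bad)) done++ }
    print (done == total ? "SUCCESS" : "IN PROGRESS")
  }')
echo "$wflow_status"
```

Here the second cycle is still running, so the status stays "IN PROGRESS" instead of flipping to "SUCCESS" after the first completed cycle.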
Recent work by the HAFS team has produced the capability to make the write_grid_component work for any output grid/domain (covering the whole native computational domain or lying entirely within the native computational grid).
Additional changes may be needed to the UPP to handle missing values.
Test whether the output can be written directly on a Lambert conformal grid using the write component (by setting output_grid), rather than doing the regridding at the bottom of the post script.
If that is possible, the regridding step can be removed and the script simplified; this is what the community_develop script does.
If it is not possible, we could add an environment variable and an if-test to exit beforehand (for community users).
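As a sketch, the write-component section of model_configure for a Lambert conformal output grid might look like the following. The parameter names follow the FV3 write-component convention, but the numeric values are purely illustrative and the exact keys should be checked against the model_configure actually in use:

```
quilting:      .true.
output_grid:   'lambert_conformal'
cen_lon:       -97.5
cen_lat:       38.5
stdlat1:       38.5
stdlat2:       38.5
nx:            1799
ny:            1059
lon1:          -122.7
lat1:          21.1
dx:            3000.0
dy:            3000.0
```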
Add a test for running the experiment/workflow generator in NCO mode (i.e. with RUN_ENVIR set to "nco") on one of the EMC grids.
Automate the way the ozone parameterization is chosen (and thus which of the two ozone production/loss files gets copied from or linked to in the system's fixed-files directory).
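One hypothetical way to automate this is to key the fix-file choice off the CCPP suite name; the mapping and file names below are assumptions to be checked against the actual fix directory and suite definition files:

```shell
#!/bin/sh
# Hypothetical mapping from CCPP suite to ozone production/loss fix file.
CCPP_PHYS_SUITE=${CCPP_PHYS_SUITE:-"FV3_GSD_SAR"}
case "$CCPP_PHYS_SUITE" in
  FV3_GFS_v15*) OZONE_FIX_FILE="ozprdlos_2015_new_sbuvO3_tclm15_nuchem.f77" ;;
  *)            OZONE_FIX_FILE="global_o3prdlos.f77" ;;
esac
# The selected file would then be copied or linked from the fixed-files dir.
echo "$OZONE_FIX_FILE"
```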
A Python utility that uses the f90nml package should allow more flexibility in how the Fortran namelists are managed, eliminating the need for templating within the namelists and for duplication of common parameters.
The utility will replace calls to set_file_param in generate_FV3SAR_wflow.sh, and will also be callable from any ex-scripts.
Replacing RUC land surface model in GSD_SAR suite
@BenjaminBlake-NOAA I was looking into running the workflow as it exists on the develop branch. I was told that you or @MatthewPyle-NOAA may be working on an XML that will run the full workflow on Theia. Is this correct? If not, I will build an XML myself for Theia and commit it to the develop branch.
Remove hard-coded global fixed-file names in the forecast model template namelist files (ush/templates/input.nml.${CCPP_PHYS_SUITE}) and instead set (most of) these file names in the experiment configuration file.
If the make_grid, make_orog, or make_sfc_climo tasks are turned off, the workflow launch script (ush/launch_FV3SAR_wflow.sh) should still be able to determine whether or not the workflow completed successfully (and not just indefinitely leave its status as "IN PROGRESS").
Need to make sure input.nml includes the stochastic settings but keep them set to .false. in the default file.
Update diag_table to include the stochastic block in the ush/templates dir (diag_table.FV3_GSD_SAR):
## Stochastic physics
"gfs_phys", "sppt_wts",  "sppt_wts",  "fv3_history", "all", .false., "none", 2
"gfs_phys", "skebu_wts", "skebu_wts", "fv3_history", "all", .false., "none", 2
"gfs_phys", "skebv_wts", "skebv_wts", "fv3_history", "all", .false., "none", 2
"dynamics", "diss_est",  "diss_est",  "fv3_history", "all", .false., "none", 2
"gfs_phys", "shum_wts",  "shum_wts",  "fv3_history", "all", .false., "none", 2
For the make_grid_orog job, a check will need to be added to see if the needed grid/orog fix files have already been generated. If so, the job will exit; if not, it will run. The make_sfc_climo job has a task dependency on the make_grid_orog job completing.
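A sketch of that pre-generation check: skip grid/orog generation when the fix files already exist. FIXdir and the file names are illustrative placeholders, not the actual fix-file naming convention.

```shell
#!/bin/sh
# Run make_grid_orog only if at least one required fix file is missing.
FIXdir=${FIXdir:-/path/to/fix_sar}
GEN_GRID_OROG="FALSE"
for f in "grid.tile7.nc" "oro_data.tile7.nc"; do
  if [ ! -f "$FIXdir/$f" ]; then
    GEN_GRID_OROG="TRUE"   # a file is missing; the job must run
    break
  fi
done
echo "GEN_GRID_OROG=$GEN_GRID_OROG"
```

When GEN_GRID_OROG is FALSE the job can exit immediately, and the make_sfc_climo dependency is still satisfied because the job completes successfully.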