Reorganization of physics repository #99
Conversation
Thanks for the initial work on the re-organization, which I understand is an outcome of discussions held during the CCPP Visioning Workshop (NCAR/ccpp-framework#476). Looks great overall. A few comments below:
- I think the directory for the C3 convective scheme should be called C3 (not CCC) for consistency with the scheme name.
- I see there are no subdirectories for the various schemes in GWD. Is that because it is too hard to separate them? Should there be subdirs?
- There is a directory under "Interstitials" called GFS. At minimum, it should be UFS instead of GFS. However, it is perhaps best to call it UFS_SCM_NEPTUNE, since all three hosts use these files.
- "Land" contains lake models.
- Consider putting the LSMs in "Land" and creating three other directories: "Lake", "Ocean", and "Sea_Ice". If you don't want to subdivide so much, then create a "Sfc_Models" directory, or something like that, and place land, lake, ocean, and ice in there.
- It seems the GFDL surface layer parameterization (gfdl_sfc_layer, module_sf_exchcoef) ended up in the MP subdir. It should go to SFC_layer.
- The ocean and ice schemes ended up in SFC_layer, which is not right.
- SHOC, MYNN are in the PBL subdir. They do more than PBL, as they also tackle shallow convection. I am okay putting them in the PBL dir, just wanted to point out that the classifications can be blurry. E.g., CLUBB would cross various classification boundaries.
@ligiabernardet Thanks for your review and comments. I've incorporated them into this PR.
@dustinswales Nice work on this large PR. The subroutine progsigma_calc.F90 is called from both SAS and C3, from samfdeepcnv.F90, samfshalcnv.F90, cu_c3_deep.F90 and cu_c3_shal.F90. I see you have it in both CONV/ and CONV/C3... wouldn't it be better to just have process-specific directories such as CONV, PBL, etc., and not scheme directories? I'm also working on the PR for cleaning up the convection pre/post, and in that case I had hoped all the convection schemes could use the same pre/post, and it would be more natural to have all the routines in the same directory.
@lisa-bengtsson progsigma_calc.f90 is not in CONV/C3 or CONV/SAS, but above them in CONV, since it is shared across convection schemes.
@dustinswales Ok, thanks for clarifying. My vote is still to not use scheme-specific directories, or at least to have as a goal to unify the pre- and post-interstitials and bring those up.
@lisa-bengtsson I didn't want to open up a can of worms by introducing interstitial cleanup along with the reorganization of the repository, but why not? We have 7 convection schemes, all of which use suite-level pre/post interstitials (e.g., GFS_DCNV/SCNV_generic_pre/post). In addition, four of these also have scheme-level pre/posts (e.g., cu_gf_driver_pre/post).
As for the _post for GF and C3... We don't have a defined place for these types of computations; it seems these pieces find their way into GFS_MP_generic_post. I assume this is simply because it's at the end of all the SDFs used for UFS applications? If so, this is a hardcoded assumption about scheme ordering built into the SDFs that we should think about unwinding. Maybe we should create a new scheme, something like GFS_physics_post, to gather calculations that need to be called at the end of the physics loop? Maybe also host-specific diagnostics (e.g., dtend and qtend)?
@dustinswales Yes, that is a good summary of the PR I'm working on. In the past I added some convection post in GFS_MP (for SAS), and I added some post in cu_c3_post (for C3). I would like to make that consistent, and as you say two other convection schemes have the same post, so it should be included in the cleanup. A new routine GFS_physics_post is the best way forward, but that is separate from this PR. I wonder, what is your main motivation for splitting sub-directories further into schemes, rather than staying at the level of CONV/ PBL/ MP/ etc.?
@lisa-bengtsson We can suss out the convection interstitials later (as promised, when the time comes I will create the GFS_physics_post scheme and push it to your convection cleanup PR). There was a consensus around this organization at the workshop, but it's not sacred by any means. Personally I think having process-shared code at a level above the parameterizations makes sense, but it is not critical. The most important part of this PR, IMO, is to cleanly separate host-specific and scheme-specific code.
We could squash the subdirectories. Personally I kinda like them, but don't really care either way. In general, the pre/posts are used to map from host-to-scheme and scheme-to-host, so all of these _pre/posts truly belong to the host. A scheme isn't born with pre/post routines; we write them to couple the schemes to the host for our needs.
Ok, if the majority are in favour of subdirectories, I will drop the plan to squash them. Yes, I agree with you on the role of pre/post, although the line on what can go in a scheme versus an interstitial is a little fuzzy. It is a good idea to get this draft PR out for comments. Thank you.
@ericaligo-NOAA It's up to the scheme developers where their code lives and how they manage their development.
I believe what makes ufs-weather-model complicated is the many submodules. Adding more submodules is going to make it even more complicated. I prefer to have all physics routines (convection, MP, radiation, PBL) in the physics directory.
It's common practice to use submodules to manage multi-component software systems. |
@dustinswales Have you tested this with branches of fv3atm and ccpp-scm yet? Would you like help? |
@grantfirl I ran the RTs on Hera and a bunch of tests failed: /scratch1/BMC/gmtb/Dustin.Swales/UFS/reorg_phys/ufs-weather-model/tests/logs/RegressionTests_hera.log.
@dustinswales Dustin, I have noticed that snow depth variable has a wrong definition: |
@tanyasmirnova The standard_name is probably wrong as well, right? If so, we would need to correct that in both the physics and the host.
@dustinswales Yes, the standard name is wrong too. This makes it messy, as it is used in many physics subroutines.
@grantfirl All UFS RTs are passing on Hera Intel, so this is ready to go. (I needed to make some changes in CMEPS.) |
Great news, @dustinswales. Thanks for debugging. We'll try to add this to the merge queue tomorrow. |
@grantfirl testing is complete on UFS-WM PR #2035. Can you please merge this ccpp-physics sub-PR?
…eta files (#581) Because of the change in directory structure for CCPP physics (ufs-community/ccpp-physics#99), there are now `.meta` files at different levels in the directory tree. The `ccpp_track_variables.py` script needs the location of these `.meta` files as an input argument to the script, but the call to `glob.glob` in the script does not use the `recursive=True` argument, so even if the user passes in the argument `-m './physics/physics/**/'` (which should include all subdirectories), the call to `glob.glob` only searches one level. Our simple test case only has `.meta` files at a single directory level, so we never caught this issue. Simply adding the `recursive=True` argument fixes this issue.
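The `glob.glob` behavior behind that fix can be illustrated with a short Python sketch. The directory and file names below are invented to mimic the reorganized multi-level `.meta` layout; only the `recursive=True` semantics come from the standard library:

```python
import glob
import os
import tempfile

# Hypothetical tree mimicking the reorganized layout, with .meta
# files at two different directory depths:
#   <root>/MP/mp_scheme.meta
#   <root>/MP/Thompson/mp_thompson.meta
root = tempfile.mkdtemp()
os.makedirs(os.path.join(root, "MP", "Thompson"))
for rel in ("MP/mp_scheme.meta", "MP/Thompson/mp_thompson.meta"):
    open(os.path.join(root, rel), "w").close()

pattern = os.path.join(root, "**", "*.meta")

# Without recursive=True, '**' behaves like a plain '*' and matches
# exactly one directory level, so the deeper file is missed.
shallow = glob.glob(pattern)               # finds 1 file

# With recursive=True, '**' matches any number of nested directories.
deep = glob.glob(pattern, recursive=True)  # finds 2 files
```

This is why a test tree with `.meta` files at only one level never exposed the bug: `shallow` and `deep` are identical until a second directory level appears.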
This PR contains a reorganization of the ccpp-physics repository. No changes to the results are introduced here.
Summary of changes: