Add ability to read increment files on native cubed sphere grid #342
Conversation
I think this is good. One thing I do recommend is checking for the consistency of the increments for both delp and the tracers; this should be OK as long as JEDI doesn't do any transformations on these variables.
Are there any other obstacles to getting this PR reviewed and merged?
I think this is good. Thanks for submitting this.
tools/cubed_sphere_inc_mod.F90 (outdated)

call open_ncfile( trim(filename), ncid )

! Read increments
call get_var5_r4( ncid, 'u_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%ua_inc )
call get_var5_r4( ncid, 'v_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%va_inc )
call get_var5_r4( ncid, 'T_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%temp_inc )
call get_var5_r4( ncid, 'delp_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%delp_inc )
if ( .not. Atm%flagstruct%hydrostatic ) then
  call get_var5_r4( ncid, 'delz_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%delz_inc )
end if

! Read tracer increments
do itracer = 1,ntracers
  call get_tracer_names(MODEL_ATMOS, itracer, tracer_name)
  call check_var_exists(ncid, trim(tracer_name)//'_inc', ncerr)
  if ( ncerr == 0 ) then
    call get_var5_r4( ncid, trim(tracer_name)//'_inc', isc,iec, jsc,jec, 1,npz, itile, 1, increment_data%tracer_inc(:,:,:,itracer) )
  end if
end do

! Close increment file
call close_ncfile(ncid)
You may get a complaint from operations that this is too disruptive to the filesystem, especially for large numbers of MPI ranks. You have every core opening the file, performing reads, and then closing the file. While I know you are using these embedded I/O routines from the dycore, they were originally put in as hacks a long time ago and generally won't play nice now that we've scaled up to thousands of cores on distributed filesystems.
You'd be better off using the fms2_io routines, which are already domain-aware (suited for the native grid) and will read in the correct increment for a given MPI rank. The fms2_io routines have recently been modified by UFS contributors to use MPI-IO for reads of NetCDF4 files (FMS PR#1477).
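For reference, a domain-aware read with fms2_io might look roughly like the sketch below. This is only an illustration of the suggested approach, not code from this PR: the variable and file names are assumptions, and the surrounding domain setup is omitted.

```fortran
! Sketch of a domain-decomposed read via fms2_io (illustrative names).
! Assumes "domain" is the model's already-initialized cubed-sphere domain
! and the increment file's variables are registered against that domain.
use fms2_io_mod,     only: FmsNetcdfDomainFile_t, open_file, close_file, read_data
use mpp_domains_mod, only: domain2d

type(FmsNetcdfDomainFile_t) :: fileobj
type(domain2d)              :: domain
real, allocatable           :: ua_inc(:,:,:)

if ( open_file(fileobj, trim(filename), "read", domain) ) then
   ! read_data fetches only this rank's portion of the decomposed field,
   ! so every core is no longer reading the whole file
   call read_data(fileobj, "u_inc", ua_inc)
   call close_file(fileobj)
end if
```

The key difference from the sim_nc_mod-style calls above is that the open is collective over the domain, so the library can use MPI-IO underneath rather than issuing one independent POSIX read stream per rank.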
@bensonr See one of my original
I would really like to use fms2_io, but in addition to the above problem, I could not find an example to work from where someone does a 5D read from a cubed-sphere history file, let alone one using the domain-awareness features. Every example I found was for reading FMS restart files. Moreover, as of right now, Gaussian-grid increment files, both IAU and non-IAU, are read using the sim_nc_mod.F90 module routines anyway.
@bensonr Are you aware of an example of such a 5D read with FMS2 using the domain-awareness capability?
@DavidNew-NOAA I am curious: what is the use case for a 5D I/O statement? Usually, 4D data (x, y, z, tracer_index) is sufficient, and the time dimension is read one slice at a time. Thanks
@lharris4 Increment files in cubed-sphere history format have dimensions (x, y, z, tile, time). The time dimension is always length 1, but it exists. Tiles are read one at a time.
@laurenchilutti @lharris4 @bensonr @XiaqiongZhou-NOAA I made changes to this PR and, as requested, it now uses FMS calls to read the increment files. The increment files are now split into one file for each tile. When the namelist parameters Can you please review these changes? Thanks
@DavidNew-NOAA - thanks for being so accommodating with the suggestions to use the existing IO layers within FMS.
According to the comments, this may address issue #188 (close when merged).
@laurenchilutti Is there anything I need to do to move this PR along now that everything is approved?
Once approvals are done, testing needs to be done on the UFS PR associated with this change. And then someone usually tells me to merge when the UFS PR is ready to merge. @jkbk2004 is there anything @DavidNew-NOAA needs to do to get this merged?
If I'm not mistaken, the following error in testing is due to 7c3102f: https://github.com/NOAA-EMC/fv3atm/actions/runs/10304307448/job/28522464673?pr=837#step:6:2384 Does anyone have some insight on this?
The commit you reference was just merged in today and is accompanied by an FV3atm PR that was also merged today: NOAA-EMC/fv3atm#798. I would make sure that your FV3atm PR has merged in the latest from the develop branch.
@DavidNew-NOAA We need to update the namelist file option (increment_file_on_native_grid) in the weather model regression test cases. Can you create a pull request to https://github.com/ufs-community/ufs-weather-model ?
I missed that. @DavidNew-NOAA has a draft PR, ufs-community/ufs-weather-model#2304. I think we can make the PR ready with the namelist parameter update and pre-test.
@jkbk2004 I added the namelist parameter in NOAA-EMC/fv3atm#837 for FV3. Where do I need to modify UFS? I was able to test successfully with the namelist parameter in FV3.
@bensonr testing is complete on UFS-WM PR #2389. Can you please merge this PR for us?
Copying @laurenchilutti as well. We're ready to merge this PR.
…ng in rt.sh (2388) + Add ability to read increment files on native cubed sphere grid (2304) (#2389)

* UFSWM - Unify CDEPS gfs, cfsr, and gefs datm datamodes. Add ability to read increment files on native cubed sphere grid. Update rt scripts to fix checking the job exit status from Slurm and PBS schedulers.
* FV3 - Add namelist parameter for cubed sphere increment read and update atmos_cubed_sphere hash for GFDL Atmospheric Cubed Sphere PR #342 (NOAA-GFDL/GFDL_atmos_cubed_sphere#342)
* atmos_cubed_sphere - Add ability to read increments from files on native cubed sphere grid
* CDEPS - Unify CDEPS gfs, cfsr, and gefs datm datamodes. Rename CDEPS-related FILENAME_BASE to FILEBASE_DATM in tests. Sunset gfs_hafs datamode.
Description
This PR and its companion FV3 PR, #837, add the ability to read increments on the native cubed sphere grid, either in the IAU framework or at restart. This option can be turned on by an input namelist option, "increment_file_on_native_grid", which defaults to .false.
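For illustration, enabling the option might look like the fragment below. This is a sketch only: the namelist group shown (&fv_core_nml) is an assumption; only the parameter name and its default come from this PR.

```fortran
&fv_core_nml
  ! ... other dycore settings ...
  increment_file_on_native_grid = .true.   ! default: .false.
/
```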
The motivation for this PR is that we would like to read cubed sphere increments in the Global Workflow used by NOAA-EMC for the new JEDI DA system. However, we cannot yet produce IAU increments with JEDI, so for now we need to apply increments at restart. This PR is based on another PR, #274, which has been largely rewritten to apply increments at restart in addition to IAU. (cc @mark-a-potts)
Overview of code changes:
A new module, "cubed_sphere_inc_mod.F90", has been added to the "tools" directory, and it has a subroutine, "read_cubed_sphere_inc", that will read the increment into an "increment_data_type" structure. This structure was formerly the "iau_internal_data_type" data structure in "fv_iau_mod.F90", but it has been moved into the new module to avoid circular dependencies and renamed to reflect the fact that it is not just for IAU.
"read_cubed_sphere_inc" is called in either "fv_iau_mod.F90" or "fv_treat_da_inc.F90", depending on whether or not IAU is used. If not using IAU, a new subroutine, "read_da_inc_cubed_sphere", in "fv_treat_da_inc.F90" is called from "fv_restart.F90" to read and apply the increment. This involves some halo updates and a physics call so that the A-grid wind increment can be applied to the D-grid wind.
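The restart-path call flow described above can be sketched as follows. This is a hedged outline, not code from the PR: the argument lists and the name of the controlling flag variable are invented for illustration; only the subroutine, module, and namelist-option names come from this description.

```fortran
! Sketch of the restart path (fv_restart.F90); arguments are illustrative.
if ( flagstruct%increment_file_on_native_grid ) then
   ! read_da_inc_cubed_sphere lives in fv_treat_da_inc.F90
   call read_da_inc_cubed_sphere(Atm)
   ! Internally it:
   !   1. calls read_cubed_sphere_inc (cubed_sphere_inc_mod.F90) to fill
   !      an increment_data_type structure,
   !   2. performs halo updates on the increment fields,
   !   3. uses a physics call to map the A-grid wind increment onto the
   !      D-grid winds before applying all increments to the state.
end if
```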
Note: "cubed_sphere_inc_mod.F90" could possibly be merged into "fv_treat_da_inc.F90", so I'd be interested to hear feedback about that.
Fixes # (issue)
N/A
How Has This Been Tested?
I ran a few sets of experiments, each involving three C96, 6-hour forecasts with identical initial conditions. These experiments were run on RDHPCS Hera.
First, I used JEDI to produce two increments, identical except that one was on the native grid and one was interpolated to the Gaussian grid, and then I ran two forecasts, one for each increment (not with IAU). In the third experiment, I ran the forecast without an increment. I then subtracted the free (no-increment) forecast model fields from the model fields of the forecasts using the increments. These residuals allowed me to isolate the impact of the increments on the forecast. Finally, I plotted the residuals and their differences and verified visually that the impact of the increments was identical in terms of large-scale patterns. Naturally there are differences between the two residuals because of interpolation of the increment, but these appeared to produce mostly noise. However, the fact that there are (expected) differences is why I had to verify my results visually.
The following is a plot of "o3mr" at model level 20 (stratosphere), using OMI satellite observations for the increment:
Here is another plot of "ua" at model level 20, using satwind observations for the increment:
Checklist:
Please check all items, whether they apply or not