
Aquaplanet external gravity wave #187

Open

jeremy-lilly wants to merge 21 commits into main from aquaplanet-external-gravity-wave

Conversation

jeremy-lilly

This PR creates a new ocean task for testing the temporal convergence rate of time-stepping schemes in MPAS-O. The case is a simple external gravity wave on an aquaplanet. The normal velocity is initialized to zero and the layer thickness is initialized as a Gaussian bump.
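
For readers unfamiliar with the setup, here is a minimal sketch of such an initial condition; the function name, bump center, amplitude, and width are illustrative assumptions, not the PR's actual code:

import numpy as np
import xarray as xr


def init_external_gravity_wave(ds_mesh, h0=1000.0, amp=1.0, width=np.pi / 16):
    # illustrative only: perturb a uniform layer thickness with a Gaussian
    # bump centered at a chosen point on the sphere
    lat = ds_mesh.latCell
    lon = ds_mesh.lonCell
    lat0, lon0 = 0.0, np.pi  # hypothetical bump center
    # great-circle distance from each cell center to the bump center
    dist = np.arccos(np.sin(lat) * np.sin(lat0) +
                     np.cos(lat) * np.cos(lat0) * np.cos(lon - lon0))
    layer_thickness = h0 + amp * np.exp(-(dist / width) ** 2)
    # normal velocity starts at zero on every edge
    normal_velocity = xr.zeros_like(ds_mesh.lonEdge)
    return layer_thickness, normal_velocity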

The user selects a time-stepping scheme and a series of time steps with which to run the test. The first listed time step is used to produce a reference solution that the other runs are compared against.
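
Purely as an illustration, the user-facing config might look like the following; the section and option names here are hypothetical, not necessarily those the PR uses, though the time-step values match the runs shown later in this thread:

[convergence]
# the time-stepping scheme to test
time_integrator = RK4
# time steps in seconds; the first produces the reference solution
dts = 10, 120, 240, 480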

The PR also implements a version of the task for LTS schemes, in particular for the new FB-LTS scheme from this PR: E3SM-Project/E3SM#6224. Convergence plots produced by this task for FB-LTS are given below.

[Figures: convergence plots for layerThickness and normalVelocity]

Note that the external gravity wave case was chosen so as to have as few terms in the tendencies as possible -- because of this, both the LTS and FB-LTS schemes show their "true" order of convergence, rather than the first-order convergence caused by the operator splitting that both the LTS and FB-LTS codes implement.
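
The order reported by plots like those above is the slope of a log-log fit of error against time step. A minimal sketch of that standard computation (not necessarily the PR's analysis code):

import numpy as np


def observed_order(dts, errors):
    # error ~ C * dt**p for a scheme of order p, so the slope of
    # log(error) against log(dt) estimates p
    slope, _ = np.polyfit(np.log(dts), np.log(errors), 1)
    return slope


# halving dt quarters the error, i.e. second order
print(observed_order([480.0, 240.0, 120.0], [16.0, 4.0, 1.0]))  # ~2.0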

Checklist

  • User's Guide has been updated
  • Developer's Guide has been updated
  • API documentation in the Developer's Guide (api.md) has any new or modified class, method and/or functions listed
  • Documentation has been built locally and changes look as expected
  • Testing comment in the PR documents testing used to verify the changes
  • New tests have been added to a test suite

@mark-petersen added the labels ocean (Related to ocean tests or analysis) and E3SM PR required (The polaris changes won't work with the current E3SM-Project submodule and require an update) on Mar 11, 2024
@mark-petersen commented Mar 11, 2024

@jeremy-lilly thanks for your work on this, it looks beautiful! Could you add a documentation page to the user's guide, similar to this?
https://github.com/E3SM-Project/polaris/blob/main/docs/users_guide/ocean/tasks/inertial_gravity_wave.md
Also, it is best if the history is just simple commits on some recent branchpoint from main. It looks like you pulled from main here. Could you fix that?

@mark-petersen

Trying it out:

./configure_polaris_envs.py  --conda /usr/projects/climate/mpeterse/miconda3
source load_dev_polaris_0.3.0-alpha.1_chicoma-cpu_gnu_mpich.sh
polaris list
...
  45: ocean/spherical/icos/ext_grav_wav_local_time_step
  46: ocean/spherical/icos/ext_grav_wav_global_time_step
  47: ocean/spherical/qu/ext_grav_wav_local_time_step
  48: ocean/spherical/qu/ext_grav_wav_global_time_step

polaris setup -p ~/repos/E3SM/lts-fb-jer-pr/components/mpas-ocean -w $r/230311_polaris_ext_grav_wave -n 45 46 47 48

salloc -N 1 -t 2:0:0 --qos=debug --reservation=debug --account=t24_coastal_ocean
source load_dev_polaris_0.3.0-alpha.1_chicoma-cpu_gnu_mpich.sh
cd $r/230311_polaris_ext_grav_wave
polaris serial

This worked similarly on chicoma, perlmutter, and chrysalis. I am using a compiled executable from E3SM-Project/E3SM#6224.

Output is as follows:

(dev_polaris_0.3.0-alpha.1) pm:230311_ltsfb_test_polaris$ polaris serial
ocean/spherical/icos/ext_grav_wav_local_time_step
  * step: icos_base_mesh_120km
          execution:        SUCCESS
          runtime:          0:00:40
  * step: icos_init_120km_local_time_step
          execution:        SUCCESS
          runtime:          0:00:01
  * step: icos_init_lts_120km_local_time_step
          execution:        SUCCESS
          runtime:          0:00:15
  * step: icos_forward_120km_local_time_step_10s
          execution:        SUCCESS
          runtime:          0:01:17
  * step: icos_forward_120km_local_time_step_120s
          execution:        SUCCESS
          runtime:          0:00:13
  * step: icos_forward_120km_local_time_step_240s
          execution:        SUCCESS
          runtime:          0:00:25
  * step: icos_forward_120km_local_time_step_480s
          execution:        SUCCESS
          runtime:          0:00:44
  * step: analysis
          execution:        SUCCESS
          runtime:          0:00:07
  task execution:   SUCCESS
  task runtime:     0:03:42
ocean/spherical/icos/ext_grav_wav_global_time_step
  * step: icos_base_mesh_120km
          already completed
  * step: icos_init_120kmglobal_time_step
          execution:        SUCCESS
          runtime:          0:00:00
  * step: icos_forward_120km_global_time_step_10s
          execution:        SUCCESS
          runtime:          0:01:17
  * step: icos_forward_120km_global_time_step_120s
          execution:        SUCCESS
          runtime:          0:00:17
  * step: icos_forward_120km_global_time_step_240s
          execution:        SUCCESS
          runtime:          0:00:12
  * step: icos_forward_120km_global_time_step_480s
          execution:        SUCCESS
          runtime:          0:00:15
  * step: analysis
          execution:        SUCCESS
          runtime:          0:00:07
  task execution:   SUCCESS
  task runtime:     0:02:08
ocean/spherical/qu/ext_grav_wav_local_time_step
  * step: qu_base_mesh_120km
          execution:        SUCCESS
          runtime:          0:00:17
  * step: qu_init_120km_local_time_step
          execution:        SUCCESS
          runtime:          0:00:00
  * step: qu_init_lts_120km_local_time_step
          execution:        SUCCESS
          runtime:          0:00:15
  * step: qu_forward_120km_local_time_step_10s
          execution:        SUCCESS
          runtime:          0:01:18
  * step: qu_forward_120km_local_time_step_120s
          execution:        SUCCESS
          runtime:          0:00:12
  * step: qu_forward_120km_local_time_step_240s
          execution:        SUCCESS
          runtime:          0:00:17
  * step: qu_forward_120km_local_time_step_480s
          execution:        SUCCESS
          runtime:          0:00:08
  * step: analysis
          execution:        SUCCESS
          runtime:          0:00:06
  task execution:   SUCCESS
  task runtime:     0:02:34
ocean/spherical/qu/ext_grav_wav_global_time_step
  * step: qu_base_mesh_120km
          already completed
  * step: qu_init_120kmglobal_time_step
          execution:        SUCCESS
          runtime:          0:00:00
  * step: qu_forward_120km_global_time_step_10s
          execution:        SUCCESS
          runtime:          0:01:14
  * step: qu_forward_120km_global_time_step_120s
          execution:        SUCCESS
          runtime:          0:00:11
  * step: qu_forward_120km_global_time_step_240s
          execution:        SUCCESS
          runtime:          0:00:08
  * step: qu_forward_120km_global_time_step_480s
          execution:        SUCCESS
          runtime:          0:00:07
  * step: analysis
          execution:        SUCCESS
          runtime:          0:00:04
  task execution:   SUCCESS
  task runtime:     0:01:45
Task Runtimes:
0:03:42 PASS ocean/spherical/icos/ext_grav_wav_local_time_step
0:02:08 PASS ocean/spherical/icos/ext_grav_wav_global_time_step
0:02:34 PASS ocean/spherical/qu/ext_grav_wav_local_time_step
0:01:45 PASS ocean/spherical/qu/ext_grav_wav_global_time_step
Total runtime: 0:10:17
PASS: All passed successfully!

with plots identical to those above. Beautiful!

Comment on lines +336 to +337
setting up the task (see {ref}`conda-env`). To run a task that has already
been run, it is necessary to first delete the `polaris_step_complete.log` file.
@xylar (Collaborator) commented Mar 11, 2024

Perfect, thanks for adding this!

@xylar (Collaborator) left a comment

@jeremy-lilly, this looks great! The main thing I noticed is that documentation is missing. Please ping me to review again once that gets added.

In the meantime, a few other small comments.

@@ -4,7 +4,7 @@
 convergence_eval_time = 24.0
 
 # Convergence threshold below which a test fails
-convergence_thresh = 1.0
+convergence_thresh = 0.1

Hopefully, no tests are using this default but I would be curious why it was changed here.

Comment on lines +152 to +155
dt_btr_scaling = section.getfloat('dt_btr_scaling')
dt_btr = self.dt / dt_btr_scaling
btr_dt_str = get_time_interval_string(seconds=dt_btr *
                                      self.resolution)

I don't think we can make this change without also updating any test cases that already use the split time stepping to use this approach. Maybe none are so far in which case maybe this will be fine.

@cbegeman (Collaborator) left a comment

Fantastic work here! I made a few comments and suggestions. Don't feel like you have to do all the things I suggested if you don't have the time. Also, I made most of the comments before Xylar posted his, so pardon any conflicting messaging there!

I ran the non-LTS case successfully with E3SM master.

Many of the suggestions have to do with making the convergence framework changes a bit more intuitive since we originally wrote the code for space and time convergence together. It will be great to have the option to evaluate time convergence separately with your changes.

else:
    for resolution in self.resolutions:
        for dt in dts:
            mesh_name = resolution_to_subdir(resolution)

Suggested change
-    mesh_name = resolution_to_subdir(resolution)

Seems this isn't used.

mesh_name = resolution_to_subdir(resolution)
forward = dependencies['forward'][dt]
self.add_input_file(
    filename=f'{int(dt)}_output.nc',

Suggested change
-    filename=f'{int(dt)}_output.nc',
+    filename=f'{int(dt)}s_output.nc',


Better yet, in case a run ever uses a time step with fractional seconds:

Suggested change
-    filename=f'{int(dt)}_output.nc',
+    filename=f'dt{get_time_interval_string(seconds=dt)}_output.nc',

            error.append(error_res)
else:
    for i in range(1, len(dts)):
        mesh_name = resolution_to_subdir(resolutions[0])

Is this implying that when dts is specified, we are evaluating convergence in time only (and all steps must use the mesh at resolutions[0])? If so, I think we should put in some safeguards so that this function cannot be used incorrectly.
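
One possible safeguard, sketched under the assumption that dts and resolutions are both in scope at this point (not the PR's actual code):

if dts is not None and len(resolutions) > 1:
    # time-only convergence is only well defined on a single mesh
    raise ValueError('Time convergence with explicit dts expects a single '
                     f'resolution, got {len(resolutions)}')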

@@ -305,7 +382,7 @@ def compute_error(self, mesh_name, variable_name, zidx=None,
 
         return error
 
-    def exact_solution(self, mesh_name, field_name, time, zidx=None):
+    def exact_solution(self, mesh_name, field_name, time, zidx=None, dt=None):

It seems like it is somewhat of a niche case that the exact solution is just the solution at a fine time step. I would recommend overriding exact_solution just for your test case. Let me know if you need help finding an example in the code.
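
A sketch of such an override, using the signature from the diff above; the class name, import path, and reference-file name are illustrative assumptions:

import xarray as xr

from polaris.ocean.convergence import ConvergenceAnalysis


class ExtGravWaveAnalysis(ConvergenceAnalysis):
    def exact_solution(self, mesh_name, field_name, time, zidx=None):
        # use the run with the finest (first-listed) time step as the
        # reference solution instead of evaluating an analytical one
        ds = xr.open_dataset('reference_output.nc')
        field = ds[field_name]
        if zidx is not None:
            field = field.isel(nVertLevels=zidx)
        return field.isel(Time=-1)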

@@ -73,13 +86,17 @@ def __init__(self, component, name, subdir, resolution, mesh, init,
         self.add_input_file(
             filename='init.nc',
             work_dir_target=f'{init.path}/initial_state.nc')
         if graph_path is None:
             graph_path = mesh.path
         self.add_input_file(
             filename='graph.info',

Suggested change
-    filename='graph.info',


I just noticed that this is no longer used

config_dt: {{ dt }}
config_time_integrator: {{ time_integrator }}
debug:
config_disable_thick_all_tend: false

We generally don't include the default options here unless we think it will lead to confusion or having a namelist option set a certain way is essential to the test. You should be able to just specify which terms are turned off (*disable* = true)
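
For example, the debug stanza might then reduce to just the terms being turned off; the flag names below are assumptions that should be checked against the MPAS-O Registry:

omega:
  debug:
    # disable the tendency terms the external gravity wave does not need
    config_disable_vel_hmix: true
    config_disable_tr_all_tend: true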

@@ -0,0 +1,81 @@
omega:

It can be nice to have one file forward.yaml that has the config options common to both global and local time stepping and then either/both of forward_global_time_step.yaml and forward_local_time_step.yaml that contain the options that are specific to each.


init_vertical_coord(config, ds)

temperature_array = temperature * xr.ones_like(ds_mesh.latCell)

Suggested change
-    temperature_array = temperature * xr.ones_like(ds_mesh.latCell)
+    temperature_array = temperature * xr.ones_like(latCell)

center_pt = Point(lat_center, lon_center)
for icell in range(0, n_cells):
    cell_pt = Point(lat_cell[icell], lon_cell[icell])
    if distance(cell_pt, center_pt) < np.pi / 4:

Consider making the radius a function parameter
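
A sketch of that suggestion; the function name is illustrative, and Point and distance are the helpers already used in the snippet above:

import numpy as np


def cells_within_radius(lat_cell, lon_cell, lat_center, lon_center, n_cells,
                        radius=np.pi / 4):
    # the hard-coded np.pi / 4 becomes the default of a radius keyword
    center_pt = Point(lat_center, lon_center)
    mask = np.zeros(n_cells, dtype=bool)
    for icell in range(n_cells):
        cell_pt = Point(lat_cell[icell], lon_cell[icell])
        mask[icell] = distance(cell_pt, center_pt) < radius
    return mask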

use_progress_bar=use_progress_bar)


def label_mesh(mesh, graph_info, num_interface, # noqa: C901

Consider separating the function that labels interface cells from the one that sets the main regions (1, 2). Since we already have an LTS test in compass that may eventually make its way to polaris, it would be nice to create some code here that could be moved to a shared directory down the line and be reused by other LTS tests.
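
A sketch of that split; the signatures are illustrative, not the PR's code:

def label_lts_regions(mesh, fine_region):
    # assign the main LTS region labels: 1 for coarse cells, 2 for fine cells
    ...


def label_interface_cells(mesh, num_interface):
    # mark cells within num_interface layers of the coarse-fine boundary
    ...

Keeping the two concerns in separate functions would let both live in a shared module that other LTS tasks import.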

@jeremy-lilly (Author)

@xylar, @cbegeman, @mark-petersen -- thank you all for your comments and suggestions! Just wanted to let you know that I saw these and that they are on my to-do list, but I might not get to them right away. I will ping everyone when ready for a second look : ) Thanks!

@jeremy-lilly force-pushed the aquaplanet-external-gravity-wave branch from 7ce4884 to 03c72f0 on March 15, 2024