Hi all,
How exactly are the updated parent grid(s) in AGRIF configurations defined and used in NEMO 5, and is it the same in NEMO 4.2.2?
For different purposes (e.g. runoff distribution, restart masking) I have used masks from the mesh_mask files of the parent grid(s) in AGRIF configurations. First of all, these masks differ depending on whether they were created with the DOMAINcfg tool or produced by NEMO during integration (ln_meshmask = .true.). Which are the “correct” masks?
In older NEMO versions (e.g. 3.6), the updated masks on the parent grid(s) were treated such that (at least along coasts) a parent grid cell is wet wherever a wet grid cell exists on the respective nest grid. This no longer seems to be the case. Is there some documentation on how exactly the parent grid(s) are treated?
Thank you!
Franziska
Hi Franziska,
There is no documentation, but you can have a look at the following file: agrif_dom_update.F90. You’ll see a parameter in the module header, rminfrac, that defines the minimum fraction of the coarse grid cell area that must be wet in order to unmask a parent grid cell. It’s 98% by default. You can play with that, and even set it to 0 so that you retrieve the behaviour you were used to.
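For what it’s worth, the unmasking criterion can be sketched as follows (a minimal illustration only, not the actual NEMO code from agrif_dom_update.F90; the function name and the equal-area 3:1 example are assumptions):

```python
def unmask_parent(child_wet_area, parent_area, rminfrac=0.98):
    """Return True if the parent cell should be unmasked (wet), i.e. if
    the fraction of its area covered by wet child cells reaches rminfrac."""
    return child_wet_area / parent_area >= rminfrac

# 3:1 refinement: 9 child cells of equal area cover one parent cell.
parent_area = 9.0
# 8 of the 9 child cells are wet -> fraction 8/9 ~ 0.89 < 0.98
print(unmask_parent(8.0, parent_area))                 # stays masked
# With rminfrac = 0, any wet child cell unmasks the parent (old behaviour)
print(unmask_parent(1.0, parent_area, rminfrac=0.0))   # unmasked
```

With the default threshold, only parent cells that are almost entirely covered by wet child cells are opened; setting rminfrac to 0 recovers the old “any wet child opens the parent” behaviour.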
Thank you Jérôme!
I will have a look into it and its effect (but will need some time).
I’m still a bit worried about the different masks produced through DOMAINcfg and NEMO itself. Can you say something about that?
Hi Franziska,
In the past I also found that, when using DOMAINcfg, the land-sea mask in mesh_mask.nc differs from the one in domain_cfg.nc, and I think this is consistent with what you are reporting, since the mesh_mask.nc created during a NEMO run is generated from the fields of the input domain_cfg.nc file. I should have addressed this problem here (see small bug 1): mbathy might be not correct when using AGRIF with ln_vert_remap and ln_read_cfg activated (#431) · Issues · NEMO Workspace / Nemo · GitLab - which version of DOMAINcfg are you using?
Hi Diego,
thanks for the hint!
I was using DOMAINcfg from tagged versions 4.2.2 and 5.0.
Again, as stated elsewhere, I’m a bit unsure of anything produced with the DOMAINcfg tool and ln_read_cfg=T. My advice is to start over from coordinates and topography.
What I do is use ln_read_cfg=F and provide coordinates and bathymetry for the outermost parent grid plus an external bathymetry (GEBCO) for the nest(s), to be interpolated by DOMAINcfg (compiled with key_agrif).
Is this what you’d advise, Jérôme?
Hi Jérôme,
I think the small bug 1 I am referring to in the ticket affected the DOMAINcfg tool every time AGRIF was used, independently of the ln_read_cfg=T flag. This was because, when using AGRIF, dom_wri was called before Agrif_Step_Child(agrif_boundary_connections), Agrif_Step_Child(agrif_recompute_scalefactors) and Agrif_Step_Child_adj(agrif_update_all), preventing the mesh_mask.nc of the parent model from containing the modifications needed to make the bathymetry of the parent and the zooms consistent (unlike the domain_cfg.nc). You can see some tests I did for this ticket here: Resolve "mbathy might be not correct when using AGRIF with ln_vert_remap and ln_read_cfg activated" (!529) · Merge requests · NEMO Workspace / Nemo · GitLab.
Hi Franziska,
I just checked and the fixes of the ticket I mentioned above are not included in v4.2.2 while they are in v5.0.
This should mean that:
- DOMAINcfg@v4.2.2 with AGRIF generates domain_cfg.nc and mesh_mask.nc files for the parent model with fields (e.g., bathymetry, e3*, masks) that are not consistent with each other (domain_cfg.nc being the correct one) - for the reason, see my comment above.
- DOMAINcfg@v5.0 with AGRIF generates domain_cfg.nc and mesh_mask.nc files for the parent model with fields (e.g., bathymetry, e3*, masks) that are consistent with each other.
In theory, this should imply that if you use DOMAINcfg@v5.0 you should not see the mismatch between the mesh_mask.nc generated with the DOMAINcfg tool and the one from a NEMO run - is this consistent with your experiments?
Hi Diego,
thank you for checking the fixes.
Unfortunately this is not consistent with my experiments.
I re-checked the tmasks:
- On the innermost nest grid (which is not updated during a production run), the masks produced by DOMAINcfg and by a NEMO run are identical.
- On all other grids that are a parent of a nest (and should be subject to updates), the masks differ depending on their origin.
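For reference, the core of such a comparison can be reproduced along these lines (a numpy-only sketch with toy data; in practice the masks would be read from the two mesh_mask.nc files with any netCDF reader, e.g. netCDF4 or xarray):

```python
import numpy as np

def mask_diff(mask_nemo, mask_domcfg):
    """Difference of two 0/1 land-sea masks: +1 where a cell is wet only
    in the first mask, -1 where wet only in the second, 0 where they agree."""
    return np.asarray(mask_nemo, dtype=int) - np.asarray(mask_domcfg, dtype=int)

# Toy single-level example standing in for tmask(z, y, x) from two files:
nemo   = np.array([[1, 1, 0],
                   [1, 0, 0]])
domcfg = np.array([[1, 0, 0],
                   [1, 0, 1]])
diff = mask_diff(nemo, domcfg)
print(np.count_nonzero(diff), "differing cells")   # here: 2 differing cells
```

Plotting `diff` at a given depth level then gives exactly the kind of figure discussed below: +1 where NEMO opened a cell that DOMAINcfg left masked, and -1 for the opposite.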
[I will get back to this issue in mid June - any in between clarification is very much appreciated.]
Thanks Franziska! Would it be possible to see a plot of the differences that you are finding?
Hi Diego,
back to this topic…
Here’s a figure showing parts of the global parent grid from a configuration with a child grid covering the South Atlantic and parts of the Indian Ocean. Depicted is the difference between the tmask built by NEMO minus the tmask built using DOMAINcfg, at two different depths.
Hi Jérôme,
I am still wondering about the change from “opening” parent cells everywhere to “almost nowhere” between the different code versions. What is the rationale behind this choice? Would changing anything here impact volume conservation, divergence conservation between the grids (i.e. proper vertical velocities on the parent grid), or anything else?
Thanks a lot for more insights into this.
Hi Franziska,
The rationale is quite simple. Let’s take, for example, a 3:1 refinement, and imagine that, of the 9 child cells covering one parent cell, only 1 is wet, with a depth of 9 m. Conserving volume between the cells (without changing areas) leads to a coarse grid depth of 1 m. You can get into trouble during the integration of the barotropic mode, which, as I recall, is done everywhere over the coarse grid.
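The numerical example above can be written out explicitly (a back-of-the-envelope sketch, assuming equal child cell areas):

```python
# 3:1 refinement: one parent cell is covered by 3 x 3 = 9 child cells.
n_child = 9
child_depths = [9.0] + [0.0] * 8   # only one child cell is wet, 9 m deep

# Volume conservation with unchanged areas: the coarse depth is the
# area-weighted mean of the child depths (equal areas here).
coarse_depth = sum(child_depths) / n_child
print(coarse_depth)   # 1.0 m: a very shallow parent cell, problematic for
                      # the barotropic mode integrated over the parent grid
```

So unmasking a parent cell with only a small wet child fraction forces a very shallow coarse cell, which is why the default rminfrac requires the cell to be almost entirely wet.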
On the other hand, not systematically unmasking a coarse grid cell that has at least 1 wet child grid point prevents divergence conservation when doing the child-to-coarse feedback. So there’s a kind of imbalance at each startup of the 2d mode over the parent grid, due to spurious divergences near the coastline. That’s the reason why I implemented the “double barotropic open boundary” option PARENT_EXT_BDY, which simply assumes persistence over the coarse grid overlapping region during barotropic mode integration. But it seems that the parameters associated with this option have to be explored further (so it has been disabled in NEMO 5.x).
Hi Franziska,
Thanks a lot for the plots - could you please check if there are any differences in the grids (lat and lon)?
Hi Diego,
I checked glam[tuv] and gphi[tuv]. glam is identical in both; gphi differs at certain cells along the “northernmost” row (only for t and u). There are no differences in the region of the nest.
Thanks Franziska! One more thing - what if you compare the domain_cfg.nc from the DOMAINcfg tool with the one from a NEMO run? Do they have variables that differ?
Hi Diego,
I’m using coordinates and bathymetry files as input for DOMAINcfg to build the domain_cfg.nc, which I then use for NEMO. Comparing variables in coordinates.nc and domain_cfg.nc shows differences in the nest region - the update to the grid apparently works. However, this is not reflected in the mesh_mask.nc produced by DOMAINcfg: comparing it to a mesh_mask.nc built by DOMAINcfg without introducing a nest shows identical masks (which should not be the case).
From what I see, the mesh_mask.nc file appears to be written before the update is applied.
Hi Franziska,
Sorry, but I am a bit confused now. At the beginning of this discussion you were reporting that “masks differ depending on whether created using the DOMAINcfg tool or produced by NEMO during integration” (see your original post).
Now you are reporting that a mesh_mask.nc built by DOMAINcfg without introducing a nest shows masks identical to those of a mesh_mask.nc built with the same tool but with nests. I agree this shouldn’t happen, but it is different from your original post - could you please clarify?
Regarding the problem you report in this last post: can I ask again which version of DOMAINcfg you are using? Since you mention that “the mesh_mask.nc file appears to be written before the update is applied”, my guess is that you are using v4.2.2; I should have fixed this in the ticket mentioned above, and the fix should be available in v5.0.