Running MOM6 at a coarser resolution #252
Hi Willem, The current resolutions maintained by GFDL and NCAR are equal to or finer than 1 degree. I have a nominal 3-degree global configuration that has not been scientifically validated. I am happy to share it with you. I believe it will take some trial and error (and perhaps some development) to make the solutions look reasonable. Some of the options used in the examples provided by GFDL will have to be changed.
Hi Gustavo,
Hi Willem,
I know that there was a coupled configuration, CM2Mc, using MOM5 as the ocean component with a nominal resolution of 3 degrees. It looks like Eric Galbraith and his group are still using it, since they just recently published a paper and data using it: https://earthsystemdynamics.org/models/cm2mc-simulation-library/ Of course, MOM5 and MOM6 are very different models, but perhaps, @gustavo-marques, you could use some of the choices made for that model (like where to start the tripolar grid and the equatorial enhancement) as a baseline?
So, the grid we are currently using with MOM5 is this nominally 3 degree grid, with the increase in resolution around the equator, found in CM2Mc. I plan to use this same grid, but the difficulty will of course be setting the model up to use it.
I thought I would post an update on this model configuration here. I currently have a functioning configuration of MOM6 (ocean only) with a nominally 3 degree horizontal grid and 28 vertical levels. This includes re-gridded forcing files for things like buoyancy restoring, wind stress etc. While stable (I've run it for a few decades), the ocean state looks bad. As someone with no experience tuning an ocean model, it would be great to get some tips on which parameterisations may need changing at this horizontal/vertical resolution. In case of interest, model performance is 1 model year in approx. 11 minutes on 16 PEs with a DT of 7200.
Could you post the MOM_parameter_doc.all from your configuration so that we can see your current settings for the various parameterizations? Also, could you detail the most concerning biases that you're seeing?
Also, when you say ocean only, do you mean that you're running without an ice model?
Yes, this is currently running in ocean stand-alone mode without SIS or SIS2. I've attached a few figures showing surface zonal velocity, global overturning, and a zonally averaged temperature profile from the end of a 20-year run (the figures are derived from annual diagnostic output, 'prog.nc', not the regridded version in z* coords). Surface currents look heterogeneous and far too strong, while vertical motion is also hugely wrong. The temperature profile also suggests that vertical mixing is excessive.
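For anyone wanting to take a similar quick look at prog.nc, a minimal Python sketch follows; the variable names (u, temp) and the (time, z, y, x) dimension order are assumptions based on typical MOM6 prognostic output, so adjust to match the actual diag_table.

```python
# Quick-look plots of the surface state in prog.nc. The variable names
# (u, temp) and dimension order are assumptions based on typical MOM6
# prognostic output; adjust to match the actual diag_table.
import netCDF4 as nc
import matplotlib.pyplot as plt

ds = nc.Dataset("prog.nc")
u_surf = ds.variables["u"][-1, 0, :, :]   # last record, top layer zonal velocity
sst = ds.variables["temp"][-1, 0, :, :]   # last record, top layer temperature

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(12, 4))
im1 = ax1.pcolormesh(u_surf, cmap="RdBu_r", vmin=-0.5, vmax=0.5)
ax1.set_title("Surface zonal velocity (m/s)")
fig.colorbar(im1, ax=ax1)
im2 = ax2.pcolormesh(sst)
ax2.set_title("Surface temperature (degC)")
fig.colorbar(im2, ax=ax2)
fig.savefig("surface_state.png", dpi=150)
```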
There's nothing immediately jumping out at me from your MOM_parameter_doc.all. I have two suggestions that are more sanity checks than anything else.
I see a few issues with your MOM_parameter_doc.all. The values you are using are meant to work for a 1-degree configuration. I think you should first try to simplify your choice of parameters, so that you can understand what the model is doing. Try the following:
To help us understand what the model is doing, add daily averages of the following fields to your list of diagnostics (file diag_table):
Other parameters that could be changed:
DT_THERM = 14400.0 ! i.e., 2 x DT
I agree that the parameterization settings were geared more towards a 1-degree configuration, but I suppose I would have expected them to be 'good enough' in some sense. With regard to the diagnostics: because you have MEKE on, you should still be getting some variability in KHh. For KHTH and KHTR, your mapping factors would not lead to MEKE mapping onto KHTH and KHTR, so unless the resolution scaling function kicks in, those should be constant. I would imagine that the biggest sources of your model bias come from
@ashao, you might be right that the model should have been 'good enough' with the settings from the 1-degree configuration. However, all the different parameters acting on Khh (e.g., MEKE, latitude dependency, scaling function, etc.) make the tuning process more difficult, and that's why I suggested a simpler choice of parameters for now. By the way, I would also recommend turning off MEKE (MEKE = False). With these options, Khh should be 2500.0 m^2/s everywhere. I also just noticed that you have VARIABLE_WINDS = False. I think you want that to be TRUE. After you re-run the model for ~1 year, it would be helpful if you could post your netCDF file, with all the variables I mentioned above, as well as all your forcing files, somewhere we can download them (perhaps an ftp server). Prescribing consistent forcing in the absence of a sea ice model is challenging, which is why @ashao suggested coupling to SIS2 and using CORE2 forcing.
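A quick, hypothetical way to verify which of these settings are actually active is to scan MOM_parameter_doc.all; the parameter names below are just the ones mentioned in this thread.

```python
# Scan MOM_parameter_doc.all for the parameters discussed in this thread.
# Lines in that file look like "NAME = value  ! comment"; this simple
# pattern match ignores the trailing comments.
import re

wanted = {"MEKE", "VARIABLE_WINDS", "KHTH", "KHTR",
          "RESOLN_SCALED_KH", "RESOLN_SCALED_KHTH", "RESOLN_SCALED_KHTR"}
with open("MOM_parameter_doc.all") as f:
    for line in f:
        m = re.match(r"\s*(\w+)\s*=\s*([^!]+)", line)
        if m and m.group(1) in wanted:
            print(f"{m.group(1)} = {m.group(2).strip()}")
```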
@gustavo-marques: Oh yes, I definitely agree that your suggestions were good and could simplify the tuning process. Also, good catch on VARIABLE_WINDS; that definitely seems like it could be a big part of the problem.
Along the same lines as @gustavo-marques' suggestions, all of the RESOLN_SCALED_* parameters should also explicitly be set to False.
Ok, so with this in mind, I think I will set up a configuration coupled to SIS2 and implement the changes you have both suggested, namely turning variable winds back on (well spotted, I missed this) and disabling MEKE. As far as I can see, the resolution-dependent parameterisation options are already all false.
I've set up and done several test runs with a configuration based upon the /ice_ocean_SIS2/SIS2/ test case. I've configured it as closely to the previous ocean-only run as I could without it crashing (getting it to run without too many velocity truncation errors was tricky), though I suspect that there will be some options that I've missed and/or set incorrectly - apologies in advance. The output from my most recent test can be found here: http://www.pik-potsdam.de/~huiskamp/mom6_output/MOM_SIS2_test.tar.gz and contains the additional forcing files I've added (all the rest are from the standard SIS2 test case; I didn't include these as the archive is already very large). I have also not included any ice diagnostic output. The main problem I'm running into in this configuration is unusual overturning cells either side of the equator, driven by enormous drops in SST. Perhaps there is something wrong with my forcing settings.
Hi @gustavo-marques, @ashao - if you have not had a chance to look at the data I've uploaded, that link will be unavailable for the next two weeks while our computing cluster is down for maintenance - just in case you try to access it and find the link dead.
Thanks for letting us know. Looking at this is on my to-do list.
Cheers. I should add, if any of you (or the wider MOM community) will be at EGU, I will be there hoping to talk about this coarse-res setup and its potential for sea level rise experiments.
Update on this configuration. I have not been able to diagnose what is causing the strange circulation artefacts (see attached figs) at the equator in my MOM6-SIS2 configuration. Essentially, there's a cold strip in the region where the latitudinal resolution of the model reaches 0.6 degrees, which is accompanied by a ~400 Sv overturning that extends to about 1000 m depth. This is accompanied by enormous heat fluxes into and out of the ocean (weirdly, the 'forcing' diagnostic shows a positive longwave flux into the ocean...). This artefact does not occur when using the ocean-only solo driver with the same parameterisations for viscosity/diffusion (for example). Does anyone have any thoughts? Is this a mosaic/grid issue? A resolution issue? I'd appreciate any thoughts; I'm completely stumped.
I see a few issues with your setup, and I have tried to group them into the six categories listed below. I suspect that 1) to 3) may help with the issue at the Equator.
1) GM and along-isopycnal diffusivities
2) Coupling time-step
3) Horizontal viscosity
4) Surface pressure
5) SSS restoring
6) Other suggestions
I noticed you are using CHANNEL_CONFIG = "list". Are you confident that your MOM_channel_list is configured properly? Lastly, please start saving the surface boundary layer depth (i.e., add "ePBL_h_ML" in the surface section of your diag_table).
I've implemented these changes and run a new test (http://www.pik-potsdam.de/~huiskamp/mom6_output/MOM6_coarse_updated.tar.gz) - unfortunately, the problem remains. I've gone back and had a look at the viscosity values that are implemented in the equivalent MOM5 simulations I've run, and the values are much higher (~160,000 m2/sec), increasing towards this higher value with decreasing latitude, which seems to be the opposite of what is implemented by default in MOM6 (if RESOLN_SCALED_KH is enabled). I'm currently running another test with these higher values to see what the impact is. Edit: This completed. The result is the same (http://www.pik-potsdam.de/~huiskamp/mom6_output/00010101.prog.nc); the increased viscosity simply suppresses the emergence of the cold anomaly. As for the channels, I'm fairly confident these are correct, though they may need fine tuning (in terms of channel width).
An idealized 2-degree global configuration that I had developed had large abyssal overturning cells just off the equator. Some of this came from the grid not having a T/U-point right at latitude = 0. Can you confirm whether your grid does?
My GEOLAT (tracer grid) points exist at ±0.29 degrees either side of the equator, and I then of course have meridional velocity points at 0 latitude. I'll re-generate the grid, test, and report back.
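For anyone checking their own grid, a sketch along these lines can confirm where the supergrid rows fall; the even/odd-row interpretation assumes the standard FMS supergrid layout (tracer points on odd-j rows, v-points and corners on even-j rows).

```python
# Check whether any supergrid row in ocean_hgrid.nc sits exactly on the
# equator. In the FMS supergrid convention, v-points/corners lie on
# even-j rows and tracer/u points on odd-j rows, so a row at y=0 tells
# you which staggered point (if any) is on the equator.
import netCDF4 as nc
import numpy as np

ds = nc.Dataset("ocean_hgrid.nc")
y = ds.variables["y"][:]   # supergrid latitudes, shape (nyp, nxp)
rows = np.where(np.isclose(y[:, 0], 0.0, atol=1e-6))[0]
for j in rows:
    kind = "v/corner points" if j % 2 == 0 else "tracer/u points"
    print(f"supergrid row j={j} is at lat=0 ({kind})")
if rows.size == 0:
    print("no supergrid row exactly on the equator")
```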
Hi guys,
Wow! Wonderful. What was the key modification? Kh change or y=0 change? In general, I hope you document this experience in your "working notes" for this model! This is really a lesson worth teaching others...
I think the key was re-generating the ocean_hgrid to remove meridional velocity points from the equator. I should note, I encountered some strange problems with the make_solo_mosaic tool in this process, which resulted in later errors in the exchange grid and therefore the forcing fields. I got around this simply by using the test case ocean mosaic - still not sure what was going wrong here, though. As I mentioned, the model will need re-tuning; I think a lot of the parameter changes were simply suppressing the issue with the grid (excess viscosity has now all but eliminated deep convection at the poles, so this needs to be scaled back). Once I re-tune and validate everything, I'm happy to provide it as another MOM6 example case along with a readme (I've kept detailed notes on the process).
Could you be more specific about the "strange problems with make_solo_mosaic"? Could you provide the commands for make_hgrid and make_solo_mosaic? I can take a look to see what the issue is.
Greetings,
Zhi
Hi Zhi, yes of course. My ocean hgrid was generated with the following:
make_hgrid --grid_type tripolar_grid --nxbnd 2 --nybnd 7 --xbnd -285,75 --ybnd -81,-49.5,-18,0,18,60,90 --dlon 3.0,3.0 --dlat 3.0,1.5,3.0,0.6,3.0,1.421052,3.194332 --lat_join 66 --grid_name MOM6_grid --center t_cell
The ocean_mosaic was generated with (I renamed the grid file to its correct name before generating the mosaic):
make_solo_mosaic --num_tiles 1 --dir ./ --mosaic_name ocean_mosaic --tile_file ocean_hgrid.nc --periodx 360
The problem (I think) was in the coupler mosaics, specifically the atmosXocean mosaic. The only thing I changed was replacing the ocean_mosaic I created above with the mosaic from the SIS2 test case, after which everything functioned properly. This is strange, as the only differences I could find were the contact indices listed in the ocean_mosaic (it also struck me as strange that the model should work with incorrect contact indices). It is possible I made an error in the creation of the coupler mosaics, but if so, I have not been able to identify it.
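For anyone hitting a similar mismatch, dumping the contact records of the two mosaics side by side makes the difference visible. Below is a sketch; the variable names follow the standard FMS mosaic format, and the SIS2 test-case filename is hypothetical.

```python
# Dump the contact records from two ocean_mosaic files so their contact
# indices can be compared directly. Variable names (contacts,
# contact_index) follow the standard FMS mosaic format; the second
# filename is a hypothetical copy of the SIS2 test-case mosaic.
import netCDF4 as nc
import numpy as np

def show_contacts(path):
    ds = nc.Dataset(path)
    print(path)
    for name in ("contacts", "contact_index"):
        if name in ds.variables:
            strings = nc.chartostring(ds.variables[name][:])
            print(" ", name, ":", list(np.atleast_1d(strings)))

show_contacts("ocean_mosaic.nc")       # built by make_solo_mosaic above
show_contacts("sis2_ocean_mosaic.nc")  # copied from the SIS2 test case
```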
Could you also send me the ocean mosaic file from the SIS2 test case?
Thanks,
Zhi
I tried the above commands. make_solo_mosaic failed because --grid_name MOM6_grid was passed to make_hgrid but --tile_file ocean_hgrid.nc to make_solo_mosaic. After I changed it to --tile_file MOM6_grid.nc, make_solo_mosaic ran fine. I am not sure if this is your issue.
Zhi
Hi Zhi,
No, as I mentioned above, I renamed it before running the mosaic tool - it works fine for me, and even make_coupler_mosaic functioned. The issue was that when the model runs, large parts of the model domain are apparently undefined for the coupler, and (for example) wind stress would have 0 values there. Could you tell me your email address? I will send you the SIS2 test case mosaic.
Hi Willem,
It would be better if you could also send me the atmos_hgrid, atmos_mosaic, and ocean_topog.nc, so I can run make_coupler_mosaic to see if I can reproduce the problem. You may need to gzip the files or put them on an ftp server if they are too large. My email address is [email protected]
Zhi
ocean_mosaic.nc should not make a difference in the make_coupler_mosaic call. I tried with the ocean_mosaic.nc file, and the resulting exchange grid file reproduces the one generated using the ocean_mosaic file created by make_solo_mosaic. You may try it to see if it is the same situation for you.
Zhi
Hi everyone. I thought I'd just follow up by mentioning that I've recently been working on tuning the model scientifically and now have a 'decent' looking ocean state (figures attached for AMOC, GMOC, and PIMOC), though AABW is a little too strong and doesn't really extend into the NH. I've attached my parameter file (MOM_parameter_doc.all.txt: https://github.com/NOAA-GFDL/MOM6-examples/files/4369657/MOM_parameter_doc.all.txt) for those interested - if you have any thoughts on how to improve the overturning structure, please let me know.
Figures: psi_atl https://user-images.githubusercontent.com/36479142/77321566-a8f9bf00-6d12-11ea-9823-46cecf137080.png, psi_glob https://user-images.githubusercontent.com/36479142/77321577-ab5c1900-6d12-11ea-94d6-e14e5241aa50.png, psi_pi https://user-images.githubusercontent.com/36479142/77321580-ac8d4600-6d12-11ea-8448-e34adfdf7609.png
Hi, Willem:
At EMC we are running quarter-degree MOM6 coupled with FV3, and we are interested in seeing those meridional overturning figures. It seems to me you are using Ferret for plotting, which I am not familiar with. Could you share the plotting script with me? Also, which variables should I include in the output?
Thanks,
Jiande
Hi Willem,
This solution is looking particularly good, especially considering that you are working at a much lower spatial resolution (3 degrees) than we have used before with MOM6 or its C-grid predecessors. I do not have much advice, in which I have a great deal of confidence, about how you might further improve this solution, specifically because we do not have much experience with resolutions coarser than about 1 degree.
Please do keep us updated with any further progress.
- Bob Hallberg
@whuiskamp @jiandewang There's a script in tools/analysis/meridional_overturning.py that plots AMOC. Could you compare the figures? @whuiskamp's plot has a good deep AMOC (maybe, as he suggests, because AABW is not reaching far enough), but we've had the experience that details in the diagnostic scripts can matter. In particular, the direction of summation of transports and the precise positioning of the streamfunction values in the vertical can make things look better/worse. @whuiskamp We just discussed this on our dev call. Your run is looking very nice - I wasn't sure it would work at all well at this resolution (because of the C-grid). @MJHarrison-GFDL noted you have tides turned on! In case you didn't know...
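For reference, below is a minimal sketch of the calculation that meridional_overturning.py performs; the output filename and the top-down direction of vertical summation here are assumptions, and, as noted above, that direction is exactly the kind of detail that changes how the figure looks.

```python
# Minimal sketch of the overturning calculation behind
# tools/analysis/meridional_overturning.py: zonally sum the diagnosed
# meridional mass transport vmo (kg/s), integrate cumulatively in the
# vertical, and convert to Sverdrups.
import netCDF4 as nc
import numpy as np
import matplotlib.pyplot as plt

ds = nc.Dataset("ocean_annual.nc")            # hypothetical output file with vmo
vmo = ds.variables["vmo"][-1]                 # (z, y, x), last time record
psi = np.cumsum(vmo.sum(axis=-1), axis=0)     # zonal sum, then top-down vertical sum
psi_sv = psi / (1035.0 * 1.0e6)               # kg/s -> Sv, using rho0 = 1035 kg/m3

plt.contourf(psi_sv, levels=21, cmap="RdBu_r")
plt.colorbar(label="Sv")
plt.title("Global meridional overturning (Sv)")
plt.savefig("moc_global.png", dpi=150)
```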
@Hallberg-NOAA @adcroft - Yes, I was surprised how good the configuration looks with minimal tweaking, though it's important to remember that approaching the equator, the resolution becomes finer (down to 0.6 degrees). I will note, however, that there is an interesting problem that arises if I attempt to use a DT larger than 1 hour - namely, the entire Southern Ocean becomes one large deep convection pool (see fig for mixed layer depth: https://user-images.githubusercontent.com/36479142/77412656-71e0e780-6dbe-11ea-95e4-59975b11d65f.png). If you have any thoughts on why this is occurring I'd be very interested, as it's not ideal for us to be working with such a short time-step. I was aware tides were on, haha. I have an input file from a MOM5 configuration for tidal amplitude and wanted to see if turning on tides had a noticeable effect on the ocean state. Haven't looked into this at all - all I know is that it didn't break anything! One more thing - the python script for calculating AMOC doesn't seem to be working for me (some numpy error), but I'll post the resulting figure if I get it working.
Hi Willem,
MOM6 actually has at least 4 different timesteps: barotropic, baroclinic
dynamics, thermodynamics & tracer transport, and coupling.
When you talk about the model going unstable when the timestep is larger
than an hour, I suspect that you are talking about the baroclinic dynamics
timestep. There are two things that limit the baroclinic dynamics timestep
- a CFL limit based on doppler shifted internal gravity waves, and the
inertial period. Because MOM6 is using a C-grid, it uses an explicit
quasi-2nd-order Runge-Kutta treatment of the Coriolis terms and a scheme
that is unstable when |f|*dt_baroclinic > ~1. (There is a parameter, BE,
that controls the degree to which the 2nd order Runge Kutta mixes with a
simulated backward Euler step, and hence the exact value of stability for
pure oscillation, but the threshold value is usually close to 1.) At high
northern and southern latitudes, |f|*dt_baroclinic < 1 sets a baroclinic
timestep limit of about 1 hour, 50 minutes (1 day over 4*pi). This
instability could be manifest in large velocities leading to small shear
Richardson numbers and lots of mixing, which might be what you are seeing.
This was not an issue with older B-grid models that have co-located u and
v points and can use an implicit Coriolis scheme. However, the fact that
the tracers in MOM6 are stepped with the thermodynamic timestep, which can
be much longer than the baroclinic dynamics timestep, can greatly offset
this limitation.
I hope that this helps.
- Bob Hallberg
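A quick numerical check of the inertial limit described above (plain rotation-rate arithmetic, nothing model-specific):

```python
# Worked check of the inertial stability limit |f|*dt_baroclinic < ~1
# described above. At the poles f = 2*Omega, so the limiting timestep is
# dt_max ~ 1/(2*Omega) ~ 86400 s / (4*pi) ~ 6.9e3 s, i.e. the roughly
# 1 hour 50 minute limit quoted.
import numpy as np

omega = 7.2921e-5   # Earth's rotation rate (rad/s)
for lat in (30.0, 60.0, 90.0):
    f = 2.0 * omega * np.sin(np.radians(lat))
    print(f"lat {lat:4.0f}N: |f| = {f:.3e} 1/s, "
          f"dt_max ~ 1/|f| = {1.0/f:6.0f} s ({1.0/f/3600.0:.2f} h)")
```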
Hi Bob, Thanks very much for the explanation - incredibly helpful.
Willem,
Please keep these nuggets documented, particularly since your configuration is far coarser than anything we have at GFDL, so your insights and experiences will be helpful to others with similar aims.
Stephen
Hi all, A PhD student in our group and I are currently trying to optimise this configuration, and we have noticed that it's considerably slower than MOM5 (running at roughly 25-30% of MOM5's speed). I'm aware that the numerics of the two models are considerably different, but would you expect the discrepancy to be this large? (I understand if there's no easy answer here - performance doesn't scale linearly with resolution.) Thanks,
Willem, The performance question is something we've just begun to examine. You are not alone - we have been aware of a general slowdown over time during the development of the model, and we're not sure why. The code is more non-linear than a fixed-coordinate model, so we always knew the model had more work to do than others, but the numbers used to add up in favor of this algorithm. We have no easy fix right now, but it would be worth checking the module-level clocks that something unexpected isn't happening. For instance, although you need diagnostics, sometimes a poorly written diagnostic costs a lot of time, so turning off all diagnostics is an obvious test. We have several ideas about what might be limiting the performance, but it is too early to suggest any one of them as a leading candidate right now. Concerning scaling, we find it falls off most noticeably when the tile size is in the teens, say 15x15, although this varies from platform to platform. This also has changed, since we used to maintain scaling down to smaller tile sizes. Again, this is something we are investigating. Sorry we don't have a quick fix (pun intended) for you (yet). -Alistair
Hi Willem, Following up on Alistair's e-mail, we have a couple more specific questions that might help to identify where you might be seeing the slowdown in your configuration.
Hi Alistair, Bob, Thanks for the replies. The runtime statistics at the end of the run suggest to me that diagnostics (at least alone) cannot be responsible for the slowdown. I've attached the runtime stats here, in case of interest - the 'diagnostics framework' is responsible for about 7% of the runtime (I'm exporting diagnostics in the default model layer, z, and rho space). To answer your questions, Bob:
I'm happy to run tests with my configuration for you in search of speed optimisations, so please let me know if I can assist. Cheers,
Hi all, Thanks.
If my memory serves me correctly (sketchy assumption), it should just be the 'std' field made by https://github.com/NOAA-GFDL/MOM6-examples/blob/dev/gfdl/ice_ocean_SIS2/OM4_025/preprocessing/create_topo.py
Also, with regard to the speedup part, I wonder if using wide halos in the barotropic solve might buy you a little gain? It's a tradeoff between message passing and doing more calculations, so it's definitely not guaranteed to be an improvement.
(Note: this doesn't show up in
Hi everyone,
I was wondering if anyone has experience running MOM6 / a C-grid model at resolutions coarser than 1 degree. My group is looking to use the wetting and drying scheme in MOM6 for deglacial simulations, but these necessitate a coarse resolution (we're currently using MOM5 with a ~3 degree resolution) to maintain speed. Does anyone have experience here, or could you advise whether this is feasible?
Thanks.
Willem