MPAS Version 8.3.0

This release of MPAS introduces new capabilities and improvements in the MPAS-Atmosphere model and its supporting software infrastructure. Notable changes are listed below.

**Initialization:**

* Addition of support for the 30" BNU soil category dataset, selected by setting the new namelist option `config_soilcat_data` to `'BNU'` in the `&data_sources` namelist group. Use of this dataset requires a separate static dataset download. (PR MPAS-Dev#1322)
* Addition of support for the 15" MODIS land use dataset, selected by setting the existing namelist option `config_landuse_data` to `'MODIFIED_IGBP_MODIS_NOAH_15s'` in the `&data_sources` namelist group. Use of this dataset requires a separate static dataset download. (PR MPAS-Dev#1322)
* Introduction of a new namelist option, `config_lu_supersample_factor`, to control the super-sampling of land use data, which may now be on either a 30" or a 15" grid, depending on the choice of dataset. The existing namelist option `config_30s_supersample_factor` now controls the super-sampling for 30" terrain, soil category, and MODIS FPAR monthly vegetation fraction data only. (PR MPAS-Dev#1322)
* A change in the horizontal interpolation from a four-point bilinear interpolation to a sixteen-point overlapping parabolic interpolation for both initial conditions and lateral boundary conditions. (PR MPAS-Dev#1303)
* Ability to use ICON soil moisture and soil temperature fields. (PR MPAS-Dev#1298)
* Addition of an option to skip processing of Noah-MP-only static fields in the init_atmosphere core. Setting the new `config_noahmp_static` namelist option to `false` in the `&data_sources` namelist group prevents the Noah-MP static fields from being processed when `config_static_interp = true` in the `namelist.init_atmosphere` file; this also permits existing static files that lack the Noah-MP fields `soilcomp`, `soilcl1`, `soilcl2`, `soilcl3`, and `soilcl4` to be used by the `init_atmosphere_model` program. (PR MPAS-Dev#1239)
* Memory scaling improvements to the gravity wave drag (GWD) static field processing in the init_atmosphere core (when `config_native_gwd_static = true`) to reduce memory usage when multiple MPI ranks are used. In many cases, these changes eliminate the need to undersubscribe computing resources, previously required to work around the lack of memory scaling in the GWD static field processing. (PR MPAS-Dev#1235)

**Physics:**

* Update of the RRTMG LW and SW schemes, most notably with the addition of the `exponential` and `exponential_random` cloud overlap assumptions. The cloud overlap assumption and decorrelation length are now available as namelist options (`config_radt_cld_overlap` and `config_radt_cld_dcorrlen`, respectively). (PRs MPAS-Dev#1296 and MPAS-Dev#1297)
* Incorporation of NOAA's Unified Forecast System (UFS) Unified Gravity Wave Physics (UGWP) suite of physics parameterizations. This physics package is the "NOAA/GSL" orographic gravity wave drag (GWD) suite introduced in WRF Version 4.3 (activated by WRF namelist option `gwd_opt=3`), but with the addition of a non-stationary GWD parameterization that represents gravity wave sources such as deep convection and frontal instability. Use of the UGWP suite requires additional static field downloads. (PR MPAS-Dev#1276)

**Dynamics:**

* Complete port of all routines in the dynamical core to GPUs using OpenACC directives, including routines used by limited-area simulations. Not included in this release are the optimization of data movement between CPU and GPU memory and the profiling and optimization of the computational kernels.
* A change in the lateral boundary condition for the vertical velocity, w, from a zero-gradient condition to a constant value of w=0 in the specified zone. For limited-area configurations, setting w to zero in the specified region alleviates spurious streamers and instabilities that appeared near the boundaries in regions of strong inflow. (PR MPAS-Dev#1304)

**Infrastructure:**

* Implementation of a new capability to automatically generate the package logic code that determines when a package is active. This logic is generated by the registry at build time through a new XML attribute, `active_when`, for `<package>` elements. (PR MPAS-Dev#1321)

**Other:**

* Addition of a new Python script for setting up MPAS-Atmosphere run directories. (PR MPAS-Dev#1326)
* Addition of 3-d 10 cm radar reflectivity (`refl10cm`) to the `da_state` stream, useful for radar DA and radar observation comparison purposes. (PR MPAS-Dev#1323)
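Taken together, the new and changed initialization options above live in the `&data_sources` group of `namelist.init_atmosphere`. A hedged sketch of how they might be set (the values shown are illustrative, not defaults, and the exact namelist group of `config_lu_supersample_factor` is an assumption; consult the MPAS-Atmosphere Users' Guide for authoritative defaults):

```fortran
&data_sources
    ! New in 8.3.0: select the 30" BNU soil category dataset
    ! (requires a separate static dataset download)
    config_soilcat_data = 'BNU'
    ! Existing option; the 15" MODIS land use dataset is a newly
    ! supported value (also a separate static download)
    config_landuse_data = 'MODIFIED_IGBP_MODIS_NOAH_15s'
    ! New in 8.3.0: super-sampling factor for land use data only
    config_lu_supersample_factor = 1
    ! Now applies only to 30" terrain, soil category, and FPAR data
    config_30s_supersample_factor = 1
    ! New in 8.3.0: skip Noah-MP-only static fields so that older
    ! static files lacking them can still be used
    config_noahmp_static = .false.
/
```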
Pull request overview
Adds “deep atmosphere” support to MPAS-Atmosphere by introducing nondimensional radius factors (rTilde*) and applying them to key diagnostic and dynamical relationships (density, pressure-gradient, divergence/vorticity diagnostics, etc.), controlled via a new namelist option.
Changes:
- Add `config_deep_atmosphere` namelist option and new mesh fields `rTildeCell`, `rTildeLayer`, `rTildeEdge`, `rTildeVertex`.
- Compute nondimensional radii during initialization and use them in output / PV diagnostics.
- Apply deep-atmosphere scaling throughout time-integration (vertical implicit coefficients, acoustic step, tendency calculations, solve diagnostics, coupled diagnostics init).
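The PR summary does not spell out how the nondimensional radii are defined. In deep-atmosphere formulations they are conventionally the local radius normalized by the planetary radius, rTilde = (a + z)/a, which reduces to 1 in the shallow-atmosphere limit. A hypothetical sketch of the initialization step (the loop bounds and the height array name `zgridMid` are illustrative assumptions, not taken from the PR; only the registered `rTildeCell` field and the standard MPAS mesh attribute `sphere_radius` come from the source):

```fortran
! Hypothetical sketch: fill the nondimensional radius factor at cell centers.
! rTilde = (a + z)/a, with a the planetary radius and z the geometric height;
! for a shallow atmosphere rTilde ~ 1 everywhere.
do iCell = 1, nCells
   do k = 1, nVertLevels
      ! zgridMid would be the layer-midpoint height (name is illustrative)
      rTildeCell(k,iCell) = (sphere_radius + zgridMid(k,iCell)) / sphere_radius
   end do
end do
```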
Reviewed changes
Copilot reviewed 4 out of 4 changed files in this pull request and generated 4 comments.
| File | Description |
|---|---|
| `src/core_atmosphere/Registry.xml` | Adds the deep-atmosphere namelist option and registers `rTilde*` mesh variables (and stream inclusion). |
| `src/core_atmosphere/mpas_atm_core.F` | Computes nondimensional radii and applies `rTildeCell` scaling in output diagnostics. |
| `src/core_atmosphere/dynamics/mpas_atm_time_integration.F` | Propagates `rTilde*` usage through dynamics kernels (including OpenACC regions). |
| `src/core_atmosphere/diagnostics/mpas_pv_diagnostics.F` | Applies `rTildeCell` scaling when computing density for PV diagnostics. |
Comments suppressed due to low confidence (1)
src/core_atmosphere/dynamics/mpas_atm_time_integration.F:6935
`rTildeCell`/`rTildeEdge` are used inside `!$acc parallel default(present)` kernels below (e.g., in the `rho_zz` and `ru` updates), but they are not included in the `!$acc enter data copyin(...)` lists in this routine. On OpenACC builds this can trigger a runtime "present" error unless these arrays are explicitly copied to the device (here or in the earlier mesh copyin setup).

```fortran
call mpas_pool_get_array(mesh, 'rTildeCell', rTildeCell)
call mpas_pool_get_array(mesh, 'rTildeEdge', rTildeEdge)
MPAS_ACC_TIMER_START('atm_init_coupled_diagnostics [ACC_data_xfer]')
! copyin invariant fields
!$acc enter data copyin(cellsOnEdge,nEdgesOnCell,edgesOnCell, &
!$acc                   edgesOnCell_sign,zz,fzm,fzp,zb,zb3,   &
!$acc                   zb_cell,zb3_cell)
```
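One minimal way to address this comment is to extend the routine's existing `enter data` region. A sketch (the directive is reproduced from the snippet above; treating the `rTilde*` fields as invariant, copyin-once mesh data is an assumption):

```fortran
MPAS_ACC_TIMER_START('atm_init_coupled_diagnostics [ACC_data_xfer]')
! copyin invariant fields, now including the deep-atmosphere radius factors
!$acc enter data copyin(cellsOnEdge,nEdgesOnCell,edgesOnCell,      &
!$acc                   edgesOnCell_sign,zz,fzm,fzp,zb,zb3,        &
!$acc                   zb_cell,zb3_cell,rTildeCell,rTildeEdge)
```

Any matching `!$acc exit data delete(...)` list in the corresponding cleanup code would need the same two arrays added so device memory is released consistently.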
```fortran
call mpas_pool_get_array(mesh, 'cellsOnEdge', cellsOnEdge)
call mpas_pool_get_array(mesh, 'cellsOnVertex', cellsOnVertex)

if (config_deep_atmosphere.and.on_a_sphere) then
```

`config_deep_atmosphere.and.on_a_sphere` is valid Fortran and will compile: the `.and.` operator keeps its delimiting dots, and since identifiers cannot contain periods the expression tokenizes unambiguously. Still, spaces around the operator improve readability and match the style used elsewhere in the file:

```diff
-if (config_deep_atmosphere.and.on_a_sphere) then
+if (config_deep_atmosphere .and. on_a_sphere) then
```
```diff
     cofwz(k,iCell) = dtseps*c2*(fzm(k)*zz(k,iCell)+fzp(k)*zz(k-1,iCell)) &
-                    *rdzu(k)*cqw(k,iCell)*(fzm(k)*p (k,iCell)+fzp(k)*p (k-1,iCell))
+                    *rdzu(k)*cqw(k,iCell)*(fzm(k)*p (k,iCell)+fzp(k)*p (k-1,iCell))*rTildeLayer(k,iCell)**2
     coftz(k,iCell) = dtseps* (fzm(k)*t (k,iCell)+fzp(k)*t (k-1,iCell))
```

`rTildeLayer` is referenced inside an OpenACC `default(present)` kernel, but this routine's OpenACC data region does not copy `rTildeCell`/`rTildeLayer` to the device (the `enter data copyin(...)` list only includes `cqw`, `p`, `t`, `rb`, `rtb`, `rt`, and `pb`). Please ensure `rTildeCell` and `rTildeLayer` are present on the device before this kernel executes, either by adding them to a global mesh copyin or to this routine's copyin list.
```diff
         do k=1,nVertLevels
-           pgrad = ((rtheta_pp(k,cell2)-rtheta_pp(k,cell1))*invDcEdge(iEdge) )/(.5*(zz(k,cell2)+zz(k,cell1)))
+           pgrad = ((rtheta_pp(k,cell2)/rTildeCell(k,cell2)**2-rtheta_pp(k,cell1)/rTildeCell(k,cell1)**2)*invDcEdge(iEdge))&
+                   /(.5*(zz(k,cell2)+zz(k,cell1)))
            pgrad = cqu(k,iEdge)*0.5*c2*(exner(k,cell1)+exner(k,cell2))*pgrad
-           pgrad = pgrad + 0.5*zxu(k,iEdge)*gravity*(rho_pp(k,cell1)+rho_pp(k,cell2))
+           pgrad = pgrad + 0.5*zxu(k,iEdge)*gravity/rTildeEdge(k,iEdge)**2 &
+                   *(rho_pp(k,cell1)/rTildeCell(k,cell1)**2+rho_pp(k,cell2)/rTildeCell(k,cell2)**2)
```

`rTildeCell`/`rTildeEdge` are used in this OpenACC `default(present)` kernel, but they are not included in this routine's `!$acc enter data copyin(...)` list (and they are also not part of the mesh copyin list in `mpas_atm_dynamics_init`). This will likely fail at runtime on OpenACC builds with a "present" error unless these arrays are explicitly copied to the device.
```diff
         do k = 1,nVertLevels
            tend_rho(k,iCell) = -h_divergence(k,iCell)-rdzw(k)*(rw(k+1,iCell)-rw(k,iCell)) + tend_rho_physics(k,iCell)
-           dpdz(k,iCell) = -gravity*(rb(k,iCell)*(qtot(k,iCell)) + rr_save(k,iCell)*(1.+qtot(k,iCell)))
+           dpdz(k,iCell) = -gravity/rTildeCell(k,iCell)**2*(rb(k,iCell)*(qtot(k,iCell)) + rr_save(k,iCell)*(1.+qtot(k,iCell)))
         end do
```

`rTildeCell` is used in this OpenACC kernel (via `dpdz`), but `rTildeCell`/`rTildeLayer`/`rTildeEdge` are not included in the OpenACC `enter data copyin(...)` setup for `atm_compute_dyn_tend_work`. Unless they are copied to the device elsewhere, OpenACC builds will likely hit a "present" error here.
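The device-residency comments above all share one root cause, so a single fix is possible: make the `rTilde*` fields device-resident once, alongside the other invariant mesh fields. A sketch (placing this in `mpas_atm_dynamics_init`, as one of the comments suggests, is an assumption; the exact surrounding copyin lists are not shown in the PR):

```fortran
! In mpas_atm_dynamics_init: copy the deep-atmosphere radius factors to the
! device once, so every default(present) kernel in the dynamics can see them.
!$acc enter data copyin(rTildeCell, rTildeLayer, rTildeEdge, rTildeVertex)

! ...and in the matching finalization routine, release the device copies:
!$acc exit data delete(rTildeCell, rTildeLayer, rTildeEdge, rTildeVertex)
```

Guarding both directives on `config_deep_atmosphere` (or allocating the fields unconditionally) would keep shallow-atmosphere runs unaffected.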
MPAS-Dev#1415