NGD Nonhydrostatic Atmosphere

Nonhydrostatic Atmospheric Model Development


Treatment of sub-grid-scale variations in moisture, temperature, and momentum is a leading cause of uncertainty in climate models (Randall et al., 2003). By creating a global atmospheric model with ~3 km horizontal grid spacing (using the ne1024 spectral element grid) and high vertical resolution, much of this uncertainty will be reduced. At this scale, large convective events are explicitly resolved, removing the need for problematic deep convection parameterizations. In particular, resolving deep convective clouds will naturally lead to convective propagation and aggregation and produce mesoscale organized convection, which has historically been problematic for climate models. Fine resolution also improves the quality of the simulated flow over topography, resulting in greatly improved precipitation statistics (Kendon et al., 2017; Rhoades et al., 2016). In recognition of these advantages, a 2012 US National Research Council report recommended that a model with 2-4 km resolution in the atmosphere and ~5 km resolution in the ocean be created in the near future (National Research Council, 2012).

In addition to the scientific benefits of creating a global cloud-resolving model (GCRM), this move has important strategic advantages for the Energy Exascale Earth System Model (E3SM). E3SM was born from DOE’s desire to leverage its world-leading supercomputing to answer important questions about climate change. In order to take advantage of that computing power, the E3SM model must parallelize over millions of computing elements. This can only be accomplished by running at very high resolution. Five years ago, DOE could claim to have a world-class high-resolution model by using 25 km horizontal grid spacing in the atmosphere. Now at least six models (NICAM, MPAS, GFDL-FV3, SAM, COSMO, ICON) have been or are being developed to run with grid spacing smaller than 4 km. The E3SM project aims to remove the computational barriers to production-scale, high-resolution Earth system predictions through a combination of strategic model development and computational advances targeting exascale computers and beyond. This subproject represents an important element of the E3SM strategy to achieve that goal.

Developing GCRM physics provides an opportunity to rewrite the atmospheric parameterization suite from the ground up. The existing parameterization suite has proven very difficult to optimize for performance on emerging architectures because simulation time is spread evenly over millions of lines of complex, branching code from a variety of sources. Performance of any single subroutine could easily be improved, but no single improvement will have a noticeable impact on performance. A fresh start allows the team to infuse performance into every aspect of model design. The fact that more processes will be explicitly resolved is critically important for making a rewrite achievable in the three-year time frame. When more is treated explicitly, sub-grid parameterizations can be simpler or can be omitted entirely. Less code means a more rapid development cycle and makes achieving performance on new architectures more manageable.


Writing physics from scratch will allow developers to design the code base for extensibility and performance on future architectures. The team’s new physics suite will be written in C++ using the Kokkos data structures and programming model. Using C++ allows developers to use better-supported and optimized C++ compilers and attracts a wider group of enthusiastic young coders. Using Kokkos leverages large ECP Software Technologies investments for achieving highly performant on-node parallelism on a variety of emerging architectures with a single code base. The feasibility of the C++/Kokkos approach has already been demonstrated by a recent CMDV-SM effort to rewrite the E3SM spectral element dycore (HOMME). Once the team’s parameterization rewrite is complete, almost the entire atmosphere model will be written in GPU-ready C++. The team will take advantage of the opportunity for improved software architecture by modularizing processes in a way that exposes maximal parallelism and uses data structures to optimize performance while also minimizing data copies and data movement between host and device processors.
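As a concrete illustration of the data-layout concerns above, the sketch below shows a structure-of-arrays arrangement of column physics fields in plain C++. It is a hypothetical illustration, not E3SM's actual data structures: the names (`PhysicsState`, `warm_all`) are invented, Kokkos itself is omitted so the example stays self-contained, and in a real Kokkos version each field would be a `Kokkos::View` and the loop body a functor passed to `Kokkos::parallel_for`.

```cpp
#include <cstddef>
#include <vector>

// Hypothetical structure-of-arrays layout for column physics. Each field is
// stored contiguously as [level][column], so a loop over columns at a fixed
// level (the natural vectorization axis) touches contiguous memory.
struct PhysicsState {
  std::size_t ncol, nlev;
  std::vector<double> T;   // temperature          [ncol * nlev]
  std::vector<double> qv;  // water vapor mixing ratio [ncol * nlev]

  PhysicsState(std::size_t nc, std::size_t nl)
      : ncol(nc), nlev(nl), T(nc * nl, 0.0), qv(nc * nl, 0.0) {}

  // Flattened index: all columns at one level are adjacent in memory.
  std::size_t idx(std::size_t icol, std::size_t ilev) const {
    return ilev * ncol + icol;
  }
};

// A toy "process" that updates the state in place; in a Kokkos version this
// doubly-nested loop would become a single parallel_for over a 2D range.
void warm_all(PhysicsState& s, double dT) {
  for (std::size_t k = 0; k < s.nlev; ++k)
    for (std::size_t i = 0; i < s.ncol; ++i)
      s.T[s.idx(i, k)] += dT;
}
```

Keeping every field in one flat, contiguous allocation is what makes host-device transfers cheap: the whole state can be copied with a single memory operation rather than per-field scatter/gather.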

In order to keep the team’s goals achievable within three years, the focus will be limited to atmospheric physics. The component coupler is currently being upgraded to work at extreme scale under the CMDV-SM project, with future scalability being explored under CANGA; the resulting coupler version is expected to be sufficient for the team’s needs. The nonhydrostatic dycore is currently working well in stand-alone mode. For more details, see the feature story on “The E3SM Nonhydrostatic Dynamical Core” and the video below of a moist baroclinic-instability idealized test case. As part of this task, the team will fund the effort to fully integrate the nonhydrostatic dycore into E3SM, evaluate its usage as a part of a fully-coupled climate simulation, and ensure that it is compatible with E3SM’s new physics suite.

Specific humidity (water vapor concentration) at an elevation of ~500 mb in the Earth’s atmosphere. Results are from a moist baroclinic-instability idealized test case running at 3 km horizontal resolution (average grid spacing at the Equator) for 40 days. The instability is triggered by adding a small perturbation in the northern hemisphere to a geostrophically-balanced initial state. The counter at the top shows the time in days and hours. The animation starts at day 4 since very little happens before then. The instability grows exponentially to day 9, eventually spreading to the southern hemisphere, with the flow becoming fully turbulent around day 30. Specific humidity ranges from zero (represented by the darkest purple color) to 7 g/kg (represented by the brightest shade of yellow).

Ensuring that coupling between physics parameterizations is efficient, accurate, and extensible will occupy a large portion of the team’s time. Although processes will be coupled in a flexible way to permit process concurrency and dynamic load balancing in the future, developers will focus on sequential splitting for this initial effort. Since the team is focusing on grids with >5M elements to parallelize over, there will be adequate concurrency without parallel (concurrent) splitting. This is useful because parallel splitting causes conservation problems. Scientists will use forward-Euler time-stepping for the initial implementation under the assumption that stability requirements at 3 km resolution result in acceptable accuracy from first-order numerics.
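The sequential-splitting strategy above can be sketched in a few lines. This is a minimal illustration with hypothetical names (`State`, `step_sequential`), not the team's coupling layer: each process applies its own forward-Euler update to the state left by the previous process, so the ordering of the list defines the coupling.

```cpp
#include <cmath>
#include <functional>
#include <vector>

// Toy atmospheric state for one grid cell.
struct State { double T; double qv; };

// A process is anything that advances the state by dt (forward Euler:
// x += dt * f(x), evaluated at the current state).
using Tendency = std::function<void(State&, double /*dt*/)>;

// Sequential (Lie) splitting: each parameterization sees the state already
// updated by the processes before it. This keeps conservation bookkeeping
// simple, unlike parallel splitting where tendencies computed from the same
// state can jointly overdraw a conserved quantity.
void step_sequential(State& s, double dt, const std::vector<Tendency>& procs) {
  for (const auto& p : procs) p(s, dt);
}
```

For example, a toy "radiative cooling" process followed by a toy "condensation" process would each see the temperature the previous one produced, so replacing forward Euler with a higher-order integrator later only changes the per-process update, not the coupling loop.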

The cornerstone of the E3SM physics suite will be the Simplified Higher-Order Closure (SHOC; Bogenschutz and Krueger, 2013). This parameterization combines a diagnostic third-order turbulence closure with a double-Gaussian PDF representation of sub-grid variability in temperature, moisture, and vertical velocity. Saturation adjustment will be applied to this PDF to diagnose liquid cloud fraction and condensation/evaporation. This sub-grid representation is needed because even though 3 km is sufficient to resolve large convective events, it is still too coarse to capture small-scale shallow convection and boundary-layer cloud processes. SHOC is conceptually similar to the CLUBB parameterization, but uses a diagnostic rather than prognostic closure for higher-order moments. Use of a diagnostic closure makes SHOC much more efficient. For example, when it was implemented within the super-parameterized CAM, it greatly improved the simulation of subtropical boundary layer clouds while adding only 9% computational expense relative to ignoring sub-grid variability. This project is the first time SHOC will be implemented directly into a global model, but its success in global super-parameterized runs gives the team confidence it will work in this application.
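The PDF condensation idea can be illustrated for the single-Gaussian special case; SHOC itself uses a double Gaussian, applying the same closed-form integrals to each component and blending the results. The function name and interface below are illustrative assumptions, not SHOC's actual API.

```cpp
#include <cmath>

// Single-Gaussian sketch of PDF-based saturation adjustment: given the
// grid-mean total water qt_bar, its sub-grid standard deviation sigma, and
// the saturation mixing ratio qsat, diagnose cloud fraction (the probability
// that qt exceeds qsat) and the mean liquid condensate analytically.
struct CloudDiag { double cloud_frac; double ql; };

CloudDiag gaussian_condensation(double qt_bar, double sigma, double qsat) {
  const double kPi = 3.141592653589793;
  const double s = qt_bar - qsat;              // mean supersaturation
  const double z = s / (sigma * std::sqrt(2.0));
  CloudDiag d;
  d.cloud_frac = 0.5 * std::erfc(-z);          // P(qt > qsat) for a Gaussian
  // Mean condensate: E[(qt - qsat)^+] for a Gaussian qt, i.e. the integral
  // of (qt - qsat) over the saturated part of the PDF.
  d.ql = s * d.cloud_frac
       + sigma / std::sqrt(2.0 * kPi) * std::exp(-z * z);
  return d;
}
```

Note the key behavior this buys over an all-or-nothing scheme: a grid cell whose mean is exactly at saturation is diagnosed as 50% cloudy with nonzero condensate, rather than flipping discontinuously between clear and overcast.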

The team will use a simplified and refactored version of the Predicted Particle Properties (P3) microphysics scheme being tested for use in E3SM v3 by the Atmospheric Physics NGD project. This scheme was originally developed for weather forecasting applications, which makes it well suited to the spatial and temporal scales captured by a GCRM. It uses a fairly standard two-moment representation for the liquid phase. Its ice-phase representation is more novel. Because there is no clear scale separation between ice and snow properties, P3 includes only one category of ice particles. Within this category, the whole spectrum of crystal types is captured by allowing total ice mass, rime ice mass, rime ice volume, and crystal number to vary in time. To maximize efficiency, scientists will simplify this scheme by discarding microphysical processes which have little climatological impact. Although important, aerosols will be prescribed in this initial implementation. This will greatly speed up computation since aerosol treatment is the most expensive component of E3SM v1 physics.
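The single-category ice design described above can be sketched as a small prognostic state from which bulk particle properties are diagnosed. The names below are illustrative, not P3's actual variable names or API: the point is that four prognosed quantities per cell replace separate "cloud ice" and "snow" species, with properties like rime fraction and rime density derived on the fly.

```cpp
#include <algorithm>
#include <cmath>

// Hypothetical sketch of P3's single ice category: four prognostic variables
// per grid cell span the whole spectrum of crystal types, from pristine
// crystals (rime fraction near 0) to graupel-like particles (near 1).
struct IceState {
  double q_ice;   // total ice mass mixing ratio  [kg/kg]
  double q_rime;  // rimed ice mass mixing ratio  [kg/kg]
  double b_rime;  // rime volume mixing ratio     [m^3/kg]
  double n_ice;   // ice number mixing ratio      [1/kg]
};

// Rime mass fraction: 0 = pristine crystals, 1 = fully rimed (graupel-like).
double rime_fraction(const IceState& s) {
  return s.q_ice > 0.0 ? std::clamp(s.q_rime / s.q_ice, 0.0, 1.0) : 0.0;
}

// Bulk rime density [kg/m^3], diagnosed from rime mass and rime volume.
double rime_density(const IceState& s) {
  return s.b_rime > 0.0 ? s.q_rime / s.b_rime : 0.0;
}
```

Because particle properties vary continuously, conversion processes between arbitrary ice categories (and the tunable thresholds that govern them) disappear, which is part of what makes the scheme a good candidate for simplification.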

Atmospheric radiation will not be recoded as part of this effort. The team will instead use the new GPU-enabled (OpenACC) successor to RRTMG recently developed by DOE’s Exascale Computing Project (ECP). Translating atmospheric profiles into optical properties for use by this radiation scheme will, however, require a great deal of work. To keep this task tractable, scientists will use the existing E3SM optics treatment as an implementation guide.

The secret to creating a good model is frequent testing and in-depth evaluation. The team will develop code in a modular way such that individual pieces can be run in isolation to test for correctness, accuracy, and performance. New parameterizations will be delivered with verification evidence as part of the documentation. Single-process and single-column modes will be one part of a hierarchy of idealized configurations the team uses to test the new model. Scientists will also use idealized simulations of important atmospheric features (extratropical and tropical cyclones, atmospheric rivers, and mesoscale convective systems) from the 2016 Dynamical Core Model Intercomparison Project (Ullrich et al., 2017) to check that the code is behaving correctly. Because global 3 km simulations are too expensive to run routinely, the team will make extensive use of E3SM’s regional refinement capability (in both ocean and atmosphere) with nudging to atmospheric observations. This will allow developers to test the model at its intended resolution over a limited domain without the expense of simulating the entire globe. Because a major goal of this work is improved simulation of clouds and precipitation, and these biases tend to appear within a few days (Xie et al., 2012), weather forecasting (CAPT) runs of a few days’ length will also be a key tool for model diagnosis.

To a certain extent, the team is developing a model for computers that don’t exist yet. Scientists will not be able to perform multi-decade global 3 km climate simulations until faster, larger node-count machines are available. It is still useful to build such a model now, however. Having a model ready for next-generation machines and next-generation architectures will allow the team to take advantage of early-adopter allocations and accolades. Additionally, many important climate-change questions can be answered with short (<1 year) simulations. For example, Bretherton (2015) shows how short, high-resolution simulations have been useful for clarifying tropical cloud feedbacks. A coupled eddy-permitting model will also open interesting opportunities for researching the interaction between coastal upwelling and stratocumulus and the upscale effect of hurricane wakes. Short simulations could also lend insight into the processes underlying the Madden-Julian Oscillation, and would be useful for understanding model biases in the simulation of midlatitude convective precipitation. These high-resolution simulations could also be used as a means to validate existing E3SM parameterizations, which attempt to model statistically the processes that will be captured explicitly, and motivate the development of new parameterizations. In short, the team’s GCRM is a model which will have great initial benefit, but whose utility will grow over time as computing resources catch up with the model.

Connections to Other NGDs

This subproject has obvious connections to the performance team since a main goal of this effort is to improve the efficiency of the atmospheric parameterization suite. It also takes advantage of E3SM’s core competencies in running CAPT and RRM simulations. It is connected to the Atmospheric Physics NGD (NGD-AP) because the NGD Nonhydrostatic Atmosphere (NGD-NH) team will use NGD-AP’s testbed for evaluating new convection parameterizations to evaluate SHOC. The NGD-NH team will also target the same microphysics scheme, so there will be opportunities for collaboration and discussion. This subproject is also connected to the Software and Algorithms NGD subproject through efforts in verification, solution reproducibility testing, and IMEX methods for the nonhydrostatic dycore.


Caldwell, P. M., Terai, C. R., Hillman, B. R., Keen, N. D., Bogenschutz, P. A., Lin, W., Beydoun, H., Taylor, M., Bertagna, L., Bradley, A., Clevenger, T. C., Donahue, A. S., Eldred, C., Foucar, J., Golaz, C., Guba, O., Jacob, R., Johnson, J., Krishna, J., Liu, W., Pressel, K., Salinger, A. G., Singh, B., Steyer, A., Ullrich, P., Wu, D., Yuan, X., Shpund, J., Ma, H.-Y., Zender, C. S. (2021). Convection-Permitting Simulations with the E3SM Global Atmosphere Model. Submitted to JAMES.

Edwards, H. C., Trott, C. R., & Sunderland, D. (2014). Kokkos: Enabling manycore performance portability through polymorphic memory access patterns. Journal of Parallel and Distributed Computing, 74(12), 3202–3216.

Sanderson, B. M., Piani, C., Ingram, W. J., Stone, D. A., & Allen, M. R. (2008). Towards constraining climate sensitivity by linear analysis of feedback patterns in thousands of perturbed-physics GCM simulations. Climate Dynamics, 30(2), 175–190.

Sherwood, S. C., Bony, S., & Dufresne, J.-L. (2014). Spread in model climate sensitivity traced to atmospheric convective mixing. Nature, 505(7481), 37–42.


Bogenschutz, P. A., & Krueger, S. K. (2013). A simplified PDF parameterization of subgrid-scale clouds and turbulence for cloud-resolving models. J. Adv. Model. Earth Syst., 5, 195–211.

Bretherton, C. S. (2015). Insights into low-latitude cloud feedbacks from high-resolution models. Phil. Trans. R. Soc. A, 373: 2054.

Kendon, E. J., Ban, N., Roberts, N. M., Fowler, H. J., Roberts, M. J., Chan, S. C., Evans, J. P., Fosser, G., & Wilkinson, J. M. (2017). Do Convection-Permitting Regional Climate Models Improve Projections of Future Precipitation Change? Bull. Amer. Meteor. Soc., 98, 79–93.

National Research Council. (2012). A National Strategy for Advancing Climate Modeling. Washington, DC: The National Academies Press.

Randall, D., Khairoutdinov, M., Arakawa, A., & Grabowski, W. (2003). Breaking the Cloud Parameterization Deadlock. Bull. Amer. Meteor. Soc., 84, 1547–1564.

Rhoades, A. M., Huang, X., Ullrich, P. A., & Zarzycki, C. M. (2016). Characterizing Sierra Nevada Snowpack Using Variable-Resolution CESM. J. Appl. Meteor. Climatol., 55, 173–196.

Ullrich, P. A., Jablonowski, C., Kent, J., Lauritzen, P. H., Nair, R., Reed, K. A., Zarzycki, C. M., Hall, D. M., Dazlich, D., Heikes, R., Konor, C., Randall, D., Dubos, T., Meurdesoif, Y., Chen, X., Harris, L., Kühnlein, C., Lee, V., Qaddouri, A., Girard, C., Giorgetta, M., Reinert, D., Klemp, J., Park, S.-H., Skamarock, W., Miura, H., Ohno, T., Yoshida, R., Walko, R., Reinecke, A., & Viner, K. (2017). DCMIP2016: a review of non-hydrostatic dynamical core design and intercomparison of participating models. Geosci. Model Dev., 10, 4477–4509.

Xie, S., Ma, H., Boyle, J. S., Klein, S. A., & Zhang, Y. (2012). On the Correspondence between Short- and Long-Time-Scale Systematic Errors in CAM4/CAM5 for the Year of Tropical Convection. J. Climate, 25, 7937–7955.
